If you're reading thousands of rows from MySQL, the default behaviour is to read all of them into memory at once. A common case where this hurts is reporting, where a lot of data is read and then processed in Python: each row is never used again after it's processed, yet it still consumes memory until the entire function exits. SSCursor (server-side cursor) allows fetching one row at a time instead. Note: this is slower than fetching everything at once, and because the connection stays busy for the whole read, there's a greater risk of losing it mid-query. So don't use this as a crutch; if possible, rewrite the code so the processing is done in SQL.
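As a minimal sketch of how this looks with the PyMySQL driver (one of the drivers that exposes an `SSCursor` class), you swap the cursor class at connect time and iterate; the connection details and the `orders` table here are hypothetical:

```python
import pymysql
import pymysql.cursors

# Hypothetical connection details; adjust for your environment.
connection = pymysql.connect(
    host="localhost",
    user="report_user",
    password="secret",
    database="reports",
    cursorclass=pymysql.cursors.SSCursor,  # stream rows from the server
)

try:
    with connection.cursor() as cursor:
        # "orders" is a hypothetical table used for illustration.
        cursor.execute("SELECT id, total FROM orders")
        for row in cursor:  # each iteration fetches one row from the server
            print(row)      # process the row; it can then be garbage-collected
finally:
    connection.close()
```

With the default buffered cursor, `execute()` would pull the entire result set into client memory before the loop even starts; with `SSCursor`, memory use stays flat no matter how many rows the query returns.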