* fix: reduce bulk insert batch size

  When this feature was added, it lazily evaluated its input. The iterator is now consumed upfront, so large batch sizes lead to huge memory usage.

* perf: bring back iterators for bulk_insert

  Bulk insert used to support iterators, so it could consume and insert an arbitrarily large amount of data. Since child table support was added, it can no longer do so, because child tables require collecting the values. This change brings iterators back by batching the input iterator (1000 documents per batch by default); see the sketch below. From a design POV this is almost as good as the original behavior. Performance is still meh for flat documents.
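The batching idea can be pictured roughly as follows. This is a minimal sketch, not the library's actual implementation: `batched`, `bulk_insert`, and `stream_of_documents` are hypothetical names used only for illustration.

```python
from itertools import islice
from typing import Iterable, Iterator, List, TypeVar

T = TypeVar("T")

def batched(docs: Iterable[T], size: int = 1000) -> Iterator[List[T]]:
    """Yield successive chunks of at most `size` items from an iterable."""
    it = iter(docs)
    while True:
        chunk = list(islice(it, size))
        if not chunk:
            return
        yield chunk

# Hypothetical usage: each chunk is fully collected (as child table
# support requires) and inserted, so peak memory is bounded by `size`
# documents rather than by the whole input stream.
#
# for chunk in batched(stream_of_documents, size=1000):
#     db.bulk_insert(chunk)
```

This keeps the child-table code path working on concrete lists while restoring the original property that the overall input never has to fit in memory at once.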
| Name |
|---|
| mariadb |
| postgres |
| sqlite |
| __init__.py |
| database.py |
| db_manager.py |
| operator_map.py |
| query.py |
| schema.py |
| sequence.py |
| utils.py |