I'm trying to reduce memory usage when using the cursor() method to fetch 3M+ records in Laravel 6.x.
I have an artisan command that runs the following code:
Product::cursor()->each(function ($product) {
calculateStats::dispatch($product);
});
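For context, the command is roughly this shape (the class name and signature here are placeholders I've made up; only the body of handle() is my actual code):

namespace App\Console\Commands;

use App\Jobs\calculateStats;
use App\Product;
use Illuminate\Console\Command;

class DispatchProductStats extends Command
{
    // Placeholder signature/description; the real command only differs in naming.
    protected $signature = 'products:dispatch-stats';
    protected $description = 'Dispatch a calculateStats job for every product';

    public function handle()
    {
        // cursor() hydrates one Product at a time via a generator instead of
        // loading all 3M+ rows into a collection at once.
        Product::cursor()->each(function ($product) {
            calculateStats::dispatch($product);
        });
    }
}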
According to the documentation, it should result in low memory usage, but the memory just keeps creeping up until it hits my 2 GB limit.
I read that this could be because the PDO connection uses buffered queries, so I tried adding this before running the query:
\DB::connection()->getPdo()->setAttribute(\PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);
from here: https://github.com/laravel/framework/issues/14919
This resulted in lower memory usage at the start, but the end result is still the same.
I also tried adding it to the database.php config file, but got the same result.
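In config/database.php I put it under the connection's options array, roughly like this (other connection settings omitted and unchanged):

// config/database.php, inside 'connections' => ['mysql' => [ ... ]]
'mysql' => [
    'driver' => 'mysql',
    'host' => env('DB_HOST', '127.0.0.1'),
    'database' => env('DB_DATABASE', 'forge'),
    'username' => env('DB_USERNAME', 'forge'),
    'password' => env('DB_PASSWORD', ''),
    // Disable buffered queries so the full result set is not held
    // in the MySQL client-side buffer.
    'options' => [
        \PDO::MYSQL_ATTR_USE_BUFFERED_QUERY => false,
    ],
],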
I'm using MySQL 8, if that makes any difference.
I use the dispatch() method to distribute the work to the queue workers, as it would take too long to process the records one at a time.
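For reference, the job itself is just the standard Laravel 6 job scaffolding around the product (the handle() body is simplified here):

namespace App\Jobs;

use App\Product;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class calculateStats implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    protected $product;

    public function __construct(Product $product)
    {
        // SerializesModels stores only the model identifier on the queue
        // and re-fetches the Product when the worker picks up the job.
        $this->product = $product;
    }

    public function handle()
    {
        // stats calculation happens here
    }
}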
Any good suggestions on how to solve the memory issue?
EDIT
After some additional debugging it seems like it is the job dispatching that is hogging the memory. How can the job be released from memory once it has already been sent to the Redis queue?
FYI: the setup uses Redis and Horizon.
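The check that pointed at dispatching was essentially logging memory usage inside the loop, roughly like this (the 10,000-record interval is arbitrary):

$count = 0;

Product::cursor()->each(function ($product) use (&$count) {
    calculateStats::dispatch($product);

    // Log current memory usage every 10,000 records to see where it grows.
    if (++$count % 10000 === 0) {
        \Log::info(sprintf(
            '%d records dispatched, memory: %.1f MB',
            $count,
            memory_get_usage(true) / 1024 / 1024
        ));
    }
});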