I'm working on bringing memory usage down when using the cursor() method to fetch 3M+ records in Laravel 6.x.

I have an artisan command that runs the following code:

// Stream the products one at a time and queue a stats job for each
Product::cursor()->each(function ($product) {
    calculateStats::dispatch($product);
});

According to the documentation this should result in low memory usage, but memory just keeps creeping up until it hits my 2 GB limit.

I read that it could be because the PDO connection uses a buffered query, so I tried adding this before running the query:

\DB::connection()->getPdo()->setAttribute(\PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

from here: https://github.com/laravel/framework/issues/14919

This lowered the starting memory usage, but the end result is the same. I also tried adding it to the database.php config file, with the same result.
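
For reference, the database.php variant looks roughly like this (a sketch showing only the relevant key of the mysql connection; everything else is left at the stock defaults):

'mysql' => [
    // ... driver, host, database, credentials, etc. ...
    'options' => [
        // Ask PDO to stream rows from the server instead of buffering
        // the whole result set in PHP memory
        PDO::MYSQL_ATTR_USE_BUFFERED_QUERY => false,
    ],
],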

I'm using MySQL 8, if that makes any difference.

I use the dispatch() method to distribute the work to queue workers, as it would take too long to process the records one at a time.

Any good suggestions on how to solve the memory issue?

EDIT

After some additional debugging, it seems like it is the job dispatching that hogs the memory. How can a job be released from memory once it has already been sent to the Redis queue?

FYI: the setup uses Redis and Horizon.

  • Does this answer your question? [Optimizing code with chunk or cursor in laravel](https://stackoverflow.com/questions/45467100/optimizing-code-with-chunk-or-cursor-in-laravel) – BadPiggie Nov 17 '19 at 04:36
  • I'm not sure you can use the `->each()` method on a generator; could you try rewriting it as a foreach loop and see if the issue still happens? – PtrTon Nov 17 '19 at 09:44
  • I have tried with the foreach loop as well; memory-wise it is the same result. – René Juul Askjær Nov 17 '19 at 19:21
  • Regarding the link, @BanujanBalendrakumar: when you use the chunk method on really big tables, it has a huge performance hit, because the query does OFFSET x LIMIT y and takes longer and longer for each "chunk". The first 100k results are not a problem, but when you hit 1M+, the query is really slow. – René Juul Askjær Nov 17 '19 at 19:21
  • I just ran into this today as well. Chunking issues aside, do `for ($i = 0; $i < 100000; $i++) { EmptyJob::dispatch(); }` and you'll also see the memory slowly climb. – Watercycle Jul 01 '20 at 05:47

2 Answers

In my case, Laravel's Telescope package was holding onto a reference to every dispatched job, causing a very noticeable memory leak when run with 10,000+ jobs.
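
If Telescope turns out to be the culprit in your setup as well, here is a minimal sketch of the workaround, assuming a Telescope version that exposes stopRecording():

use Laravel\Telescope\Telescope;

// Stop Telescope from recording entries (and holding references to
// every dispatched job) for the rest of this command
Telescope::stopRecording();

Product::cursor()->each(function ($product) {
    calculateStats::dispatch($product);
});

Setting TELESCOPE_ENABLED=false in .env should achieve the same thing without any code changes.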

– Watercycle

Instead, "remember where you left off" (keyset pagination): http://mysql.rjweb.org/doc.php/pagination

Sorry, I doubt this can be expressed directly in Laravel; you may have to drop down to raw SQL for the task. Consider filing a feature request for Laravel.
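
For illustration, a minimal sketch of that pattern with Laravel's query builder, assuming products has an auto-increment id primary key (the model and job names are taken from the question):

$lastId = 0;

do {
    // Seek past the last id processed instead of using OFFSET, so every
    // page is a cheap index range scan no matter how deep into the table we are
    $products = Product::where('id', '>', $lastId)
        ->orderBy('id')
        ->limit(1000)
        ->get();

    foreach ($products as $product) {
        calculateStats::dispatch($product);
    }

    $lastId = $products->max('id');
} while ($products->isNotEmpty());

Eloquent's chunkById() wraps essentially the same keyset approach, so it may be worth trying before dropping to raw SQL.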

– Rick James