4

I have a Company model and a Contact model defined in my Laravel 5.4 application, and they share a many-to-many relationship. For example, the Contact model has:

public function company()
{
    return $this
        ->belongsToMany('App\Company', 'company_contact','contact_id', 'company_id')->withTimestamps();
}

Now I want to pull all contact data along with their company details, so I was using:

public function getData()
{
    $contacts = Contact::all();
    foreach($contacts as $contact)
    {
        $getCompany = $contact->company()->withPivot('created_at')->orderBy('pivot_created_at', 'desc')->first();
        $getCompany->contacts = Company::find($getCompany->id)->contacts;
        $contact->company = $getCompany;
        $contact->companies_interested = json_decode($contact->companies_interested);
        $companies = [];
        if($contact->companies_interested)
        {
            foreach($contact->companies_interested as $companiesInterested)
            {
                $getCompany = Company::withTrashed()->find($companiesInterested);
                $companies[] = array(
                    'value' => $getCompany->id,
                    'label' => $getCompany->name
                );
            }
            $contact->companies_interested = json_encode($companies);
        }
    }
    return response()->json([
        'model' => $contacts
    ], 200);
}

This works perfectly fine for a small data set, but it fails with a large one (approximately 10,000 records); I guess PHP runs out of memory when loading that much data. I was going through the Laravel docs to find a solution and came across the chunk() and cursor() methods. Can someone guide me on what can be done here, or what approach would overcome this?

Thanks

Nitish Kumar
  • Are you familiar with Chris Fidao's "Laravel Performant" series? He has a video on database chunking in Laravel that explains exactly what you are trying to implement. It's also free. https://serversforhackers.com/laravel-perf/database-chunking – Victor Aug 02 '17 at 17:48

1 Answer

5

I recommend testing both methods yourself, since the results can vary with the quirks of your system.

Chunk:

It will "paginate" your query, this way you use less memory.

  • Uses less memory
  • It takes longer

    public function getData() {
        Contact::chunk(1000, function ($contacts) {
            foreach ($contacts as $contact) {
                // rest of your code...
            }
        });
    }
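Since `chunk()` hands each batch to a callback, the question's `getData()` would also need to collect the processed contacts somewhere if it still has to return everything as one JSON response. A rough sketch of that, reusing the per-contact logic from the question as a placeholder (note that the `$results` array still ends up holding every contact, so the saving is only in how the models are queried and hydrated):

    public function getData()
    {
        $results = [];

        // Process 1,000 contacts at a time instead of loading them all at once.
        Contact::chunk(1000, function ($contacts) use (&$results) {
            foreach ($contacts as $contact) {
                // ...the per-contact work from the question goes here...
                $results[] = $contact;
            }
        });

        return response()->json([
            'model' => $results
        ], 200);
    }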

Cursor:

It uses PHP Generators to iterate over your query results one item at a time.

  • It takes less time
  • Uses more memory

    public function getData() {
        foreach (Contact::cursor() as $contact) {
            // rest of your code...
        }
    }
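To illustrate the generator idea behind `cursor()` (this is not Laravel's actual source, just a plain-PDO sketch, and the `fetchRows` name is made up): each row is yielded as soon as it is fetched, instead of building the whole result set into an array first.

    // Plain PDO illustration of the generator concept, not Laravel's code:
    // rows are yielded one at a time as they are fetched.
    function fetchRows(PDOStatement $statement): Generator
    {
        while ($row = $statement->fetch(PDO::FETCH_ASSOC)) {
            yield $row;
        }
    }

    // Usage: the loop starts working before all rows have been fetched.
    // foreach (fetchRows($statement) as $row) { ... }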

For a more detailed explanation, see this answer: What is the difference between laravel cursor and laravel chunk method?

For a performance comparison (Japanese post, via Google Translate), see: https://translate.google.com/translate?hl=en&sl=auto&tl=en&u=http%3A%2F%2Fqiita.com%2Fryo511%2Fitems%2Febcd1c1b2ad5addc5c9d
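If you want to benchmark both approaches against your own data, PHP's built-in timing and memory functions are enough for a rough comparison (just a sketch; run it once with `cursor()` and once with `chunk()` and compare):

    $start = microtime(true);

    foreach (Contact::cursor() as $contact) {
        // ...the same per-contact work as in getData()...
    }

    // Peak memory (MB) and elapsed time (seconds) for this run.
    $memoryMb = memory_get_peak_usage(true) / 1024 / 1024;
    $elapsed  = microtime(true) - $start;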

Jean Marcos
  • I don't want the data in chunks; I want all the data in my database, and I'm running 2 more queries for each record in the foreach loop, so I used `cursor()`, but it isn't helping me out. – Nitish Kumar Aug 03 '17 at 08:45
  • I had already gone through the link you shared; as I mentioned in the question, I've already done my background research. – Nitish Kumar Aug 03 '17 at 08:47
  • I come from C#: doesn't cursoring (LINQ-ing) the list mean that after you are finished with an item in the cursor it is disposed, thus freeing the memory again? – Joel Harkes Feb 21 '19 at 14:10
  • I do not think cursor is faster than chunk, because cursor will spend a lot of time in round trips – Veshraj Joshi Jul 10 '20 at 12:53