I have a Company model and a Contact model defined in my Laravel 5.4 application, and the two have a many-to-many relationship. So, for example, the Contact model has:
public function company()
{
    return $this
        ->belongsToMany('App\Company', 'company_contact', 'contact_id', 'company_id')
        ->withTimestamps();
}
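For context, the inverse side on my Company model looks along these lines (a minimal sketch; I'm assuming the same company_contact pivot table, since my code below relies on Company::find(...)->contacts):

public function contacts()
{
    return $this
        ->belongsToMany('App\Contact', 'company_contact', 'company_id', 'contact_id')
        ->withTimestamps();
}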
Now I have a data set where I want to pull all contacts along with their company details, so I was using:
public function getData()
{
    $contacts = Contact::all();

    foreach ($contacts as $contact) {
        // Most recently attached company, ordered by the pivot's created_at
        $getCompany = $contact->company()
            ->withPivot('created_at')
            ->orderBy('pivot_created_at', 'desc')
            ->first();
        $getCompany->contacts = Company::find($getCompany->id)->contacts;
        $contact->company = $getCompany;

        // companies_interested is stored as a JSON array of company ids;
        // rewrite it into value/label pairs (including soft-deleted companies)
        $contact->companies_interested = json_decode($contact->companies_interested);
        $companies = [];
        if ($contact->companies_interested) {
            foreach ($contact->companies_interested as $companiesInterested) {
                $getCompany = Company::withTrashed()->find($companiesInterested);
                $companies[] = array(
                    'value' => $getCompany->id,
                    'label' => $getCompany->name,
                );
            }
            $contact->companies_interested = json_encode($companies);
        }
    }

    return response()->json([
        'model' => $contacts
    ], 200);
}
This works perfectly fine for a small data set, but it fails with a large amount of data (approx. 10,000 rows); I guess PHP runs out of memory when it has to load that much at once.
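A quick way to confirm the memory suspicion is to log peak usage inside the loop with plain PHP (a sketch, not part of my original code):

\Log::info('Peak memory: ' . memory_get_peak_usage(true) . ' bytes');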
I was going through the Laravel docs to find a solution and came across the chunk() and cursor() methods (sketched below). Can someone guide me on what can be done about this problem, or what approach would overcome it?
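For reference, this is roughly how those two methods are used according to the docs (a minimal sketch; the batch size of 500 is an arbitrary assumption, and I haven't worked out where my per-contact logic fits):

// chunk() loads and processes a fixed number of records at a time
Contact::chunk(500, function ($contacts) {
    foreach ($contacts as $contact) {
        // per-contact work from getData() would go here
    }
});

// cursor() hydrates one model at a time via a PHP generator
foreach (Contact::cursor() as $contact) {
    // per-contact work from getData() would go here
}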
Thanks