Laravel eloquent query slow with 100000 records

Time:03-12

I have this simple Eloquent query. It works fine with a few records, but once the table grows to around 100,000 records it becomes very slow.

I read that I should use chunk instead of get. How can I implement that for this query?

$collection = Contact::with('shop');
$collection = $collection->orderBy('created_at', 'DESC');
$collection = $collection->get();
$json = $collection->map(function ($contact) {
    return [
        'id' => $contact->id,
        'name' => $contact->name,
        // ...about 50 more columns
        'shop' => [
            'id' => optional($contact->shop)->id,
            'name' => optional($contact->shop)->name
        ],
        // ...about 6 more relations
    ];
});
$json = $json->paginate(50);
return response()->json(['contacts' => $json], 200);

CodePudding user response:

You are fetching all the data, whether that is 1M rows or however many the table holds, then mapping every row and only afterwards paginating down to 50. That is the huge performance problem in your code.

You can paginate directly at the database level instead:

return response()->json(['contacts' => Contact::with('shop')->orderBy('created_at', 'DESC')->paginate(50)], 200);
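The reason this is faster: `paginate(50)` issues a count query plus a `LIMIT`/`OFFSET` query, so only 50 rows are ever hydrated into models. Roughly the SQL it generates (column list abbreviated) looks like:

```sql
-- one query to get the total for pagination metadata
select count(*) as aggregate from `contacts`;

-- one query that fetches only the current page
select * from `contacts` order by `created_at` desc limit 50 offset 0;
```

Compare that with `get()`, which selects every row before PHP ever sees the data.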

If you only need id and name for contacts:

return response()->json(['contacts' => Contact::select('id', 'name', 'created_at')->orderBy('created_at', 'DESC')->paginate(50)], 200);
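To answer the original question about `chunk()`: chunking does not help for a paginated API response, but it is the right tool when you genuinely need to process every record (an export, a background job) without loading them all into memory at once. A minimal sketch, assuming the same `Contact` model; the batch size of 500 and the per-record handling are placeholders:

```php
// Process all contacts in batches of 500 instead of one giant collection.
// Suited to exports or background jobs, not to a paginated endpoint.
Contact::with('shop')
    ->orderBy('created_at', 'DESC')
    ->chunk(500, function ($contacts) {
        foreach ($contacts as $contact) {
            // handle each contact here
        }
    });
```

If the loop also updates the rows it is iterating over, Laravel recommends `chunkById()` instead, since modifying rows mid-chunk can skip or repeat records with plain `chunk()`.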