I am trying to improve the performance of my code.
Example code:
for ($i = 0; $i <= 10000; $i++) {
    // do stuff
    $test = DB::table('test')->where('test2', $i)->get();
    // do stuff
}
This example code runs in about 3 seconds.
I then tried to optimize it by moving the SQL query outside the loop, like:
$test = DB::table('test')->whereBetween('test2', [0, 10000])->get();
for ($i = 0; $i <= 10000; $i++) {
    // do stuff
    $row = $test->where('test2', $i); // filter the collection, not the DB
    // do stuff
}
But this version runs in about 5 seconds. Are 10,000 SQL queries really faster than one query plus 10,000 collection where() calls?
I timed it with: Log::info(round(microtime(true) * 1000));
CodePudding user response:
In the second version you are running one query, instantiating a collection of 10,000 objects, and then looping over that entire collection 10,000 times to find a single entity each pass.
Versus:
Sending 10,000 queries to a hyper-optimised, indexed database engine built for exactly that purpose, each of which returns a single entity.
On top of that, your two versions are not even asking the database for the same thing, so the second is naturally slower still. Asking the database for rows WHERE X = Y is always going to be cheaper than asking for WHERE X >= 0 AND X <= 10000 and then filtering the result yourself.
The slowdown comes from instantiating a collection of 10,000 entities and then repeatedly scanning it, once per iteration, looking for test2 = $i. Collection::where() is a linear scan, so that is roughly 10,000 × 10,000 comparisons in plain PHP, whereas SQL can find each row far faster than a PHP loop can.
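As a rough illustration in plain PHP (no Laravel; the data and variable names are made up for the sketch): a linear scan costs O(n) per lookup, while indexing the rows once by the lookup column makes every subsequent lookup O(1).

```php
<?php
// Simulate the 10,001 rows a single query would return.
$rows = [];
for ($i = 0; $i <= 10000; $i++) {
    $rows[] = ['test2' => $i, 'value' => "row $i"];
}

// Linear scan, like Collection::where(): walks the array every call.
function findByScan(array $rows, int $id): ?array
{
    foreach ($rows as $row) {
        if ($row['test2'] === $id) {
            return $row;
        }
    }
    return null;
}

// Index once by the lookup column, then every lookup is a key access.
$byKey = array_column($rows, null, 'test2');

$slow = findByScan($rows, 9999);   // O(n) each time
$fast = $byKey[9999] ?? null;      // O(1) each time
```

Doing the scan 10,000 times is what the collection version of the code above pays for; the indexed version pays the build cost once.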
In short, you could rephrase your question as: is it faster to query a database 10,000 times, or to loop over a collection of 10,000 entities 10,000 times in plain PHP, comparing each record's value to an ID? Put that way, SQL is the obvious answer.
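That said, the usual fastest option is neither of the two versions in the question: run the single whereBetween query, then index the resulting collection with keyBy() so each lookup inside the loop is a key access rather than a linear where() scan. A sketch, assuming the same table and setup as in the question:

```php
// One query, indexed once by the lookup column.
$test = DB::table('test')
    ->whereBetween('test2', [0, 10000])
    ->get()
    ->keyBy('test2');

for ($i = 0; $i <= 10000; $i++) {
    // do stuff
    $row = $test->get($i); // O(1) lookup instead of a linear scan
    // do stuff
}
```

Collection::keyBy() and Collection::get() are standard Laravel collection methods; with them the loop no longer rescans 10,000 entities per iteration, so this typically beats both the 10,000-query version and the repeated-where() version.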