Here is a console command I wrote in Laravel. How can I make it better and faster? What have I done wrong?
Also, if you can recommend a book so I can get better at algorithms and writing better code, I would appreciate it.
Thanks
$process = Process::create([
    'started_at' => now(),
]);

// Set how many subscriptions to fetch from the DB in each iteration.
// We page through the table like this to prevent memory exhaustion.
$subscriptionsCountPerIteration = 1000;
$subscriptionsCount = Subscription::count();
// Round up so a partial last page is not skipped.
$numberOfIterations = (int) ceil($subscriptionsCount / $subscriptionsCountPerIteration);

for ($i = 0; $i < $numberOfIterations; $i++) {
    $subscriptions = Subscription::query()
        ->limit($subscriptionsCountPerIteration)
        ->offset($subscriptionsCountPerIteration * $i)
        ->get();

    foreach ($subscriptions as $subscription) {
        $subscription->updateStatus($process);
    }
}

$process->finished_at = now();
$process->save();
I think the fact that I have two nested loops is a bad idea.
CodePudding user response:
I believe one nested loop is unavoidable in this case, because otherwise the updates can't be executed in batches. However, Laravel has built-in functionality for exactly this kind of batched iteration. I suggest you test whether chunk()
or chunkById()
fits better, because the two methods paginate the batches differently: chunk() uses limit/offset, while chunkById() pages by the primary key, which is safer when the rows you are iterating over are being modified.
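As a rough sketch, your command could be rewritten with chunkById() like this (assuming Subscription is an Eloquent model with an auto-incrementing id, and updateStatus() is the method from your question):

```php
use App\Models\Subscription;

$process = Process::create(['started_at' => now()]);

// chunkById() fetches 1000 rows at a time, keyed on the primary key,
// so memory stays bounded and offset drift is avoided.
Subscription::query()->chunkById(1000, function ($subscriptions) use ($process) {
    foreach ($subscriptions as $subscription) {
        $subscription->updateStatus($process);
    }
});

$process->finished_at = now();
$process->save();
```

This also removes the manual count()/ceil() bookkeeping, since chunkById() keeps fetching until no rows remain.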
CodePudding user response:
You can "optimize" the speed of the command execution by running the updateStatus()
logic inside a Job. The command would then asynchronously dispatch a new job for each status check and return quickly, since the actual work is done by the queue worker.
It would look something like this:
foreach ($subscriptions as $subscription) {
    CheckSubscriptionStatusJob::dispatch($subscription->id);
}
Note: the actual checking of the statuses will still take just as long, but the command won't need to wait for it to complete. You can also run multiple queue workers to execute jobs in parallel.
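A minimal job class for this could look like the sketch below. The class name CheckSubscriptionStatusJob and the idea of passing the process id along are assumptions, since your command also tracks a Process record:

```php
namespace App\Jobs;

use App\Models\Process;
use App\Models\Subscription;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class CheckSubscriptionStatusJob implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    // Store plain ids rather than models so the queued payload stays small.
    public function __construct(
        private int $subscriptionId,
        private int $processId,
    ) {
    }

    public function handle(): void
    {
        $process = Process::findOrFail($this->processId);

        // Re-fetch the subscription inside the worker and run the
        // same updateStatus() logic as the original command.
        Subscription::findOrFail($this->subscriptionId)->updateStatus($process);
    }
}
```

You would then dispatch it with CheckSubscriptionStatusJob::dispatch($subscription->id, $process->id) and run php artisan queue:work to process the jobs.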