Laravel dispatch same job with chain with different id and delay


I am new to PHP/Laravel and I am trying to understand the dispatch method.

I use dispatch inside a long loop over several thousand records; at the end of each iteration the following is run:

RunDispatch::dispatch($id)->delay($Time);

dispatch is therefore executed thousands of times, and $id and $Time are different for each call. That is my problem: I would like to know how to run all of the dispatches at once, with chain, from the same connection, each with its own 'id' and 'delay', AFTER the main loop. To start, I build an array at the end of the loop:

array_push($ArrayToDispatch, array($id => $Time));

I therefore end up with an array containing all of the $id and $Time values, like this:

Array
(
    [0] => Array
        (
            [id1] => Time1
        )

    [1] => Array
        (
            [id2] => Time2
        )

)
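
For clarity, this is roughly what the loop looks like at the moment ($records and the way $Time is computed are only placeholders for my real data):

    $ArrayToDispatch = [];

    foreach ($records as $record) {
        $id   = $record->id;
        $Time = now()->addSeconds($record->offset); // delay computed per record

        // current approach: one dispatch per iteration
        RunDispatch::dispatch($id)->delay($Time);

        // collecting everything so it could be dispatched after the loop instead
        array_push($ArrayToDispatch, array($id => $Time));
    }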

I would like to know if it is possible to build the chain with a different id and delay for each job, something like:

RunDispatch::dispatch(array_keys($ArrayToDispatch))->delay(array_values($ArrayToDispatch))

Each key and value is different. I tried to work with the chain() and withChain() methods, something like:

RunDispatch::chain([
....,
])->dispatch();

The problem is that I don't know how to give each job a different $id and $delay with these methods; I have tried several combinations without success.

Thank you all.

CodePudding user response:

In my opinion you can use batches. You can read about them in the Laravel documentation on job batching.

Here is a simple example from the documentation:

use Illuminate\Bus\Batch;
use Illuminate\Support\Facades\Bus;
use Throwable;

$batch = Bus::batch([
    new ImportCsv(1, 100),
    new ImportCsv(101, 200),
    new ImportCsv(201, 300),
    new ImportCsv(301, 400),
    new ImportCsv(401, 500),
])->then(function (Batch $batch) {
    // All jobs completed successfully...
})->catch(function (Batch $batch, Throwable $e) {
    // First batch job failure detected...
})->finally(function (Batch $batch) {
    // The batch has finished executing...
})->dispatch();

You can see that we have dispatched multiple jobs together, and after completion there are callbacks where you can continue working with your array.

With batches, your jobs will run in parallel. Also, if you want, you can add a delay to each job:

(new ImportCsv(1, 100))->delay($time)

You can generate an array of job objects and then pass it to the batch method:

    $jobs = [];
    // Here you can loop over your data
    foreach ($myArray as $item) {
        $jobs[] = (new ImportCsv($item['id']))->delay($item['time']); // This is just an example
    }
    // Here you pass the jobs. They will start only after the batch is dispatched.
    $batch = Bus::batch($jobs)->then(function (Batch $batch) {
        // All jobs completed successfully...
    })->catch(function (Batch $batch, Throwable $e) {
        // First batch job failure detected...
    })->finally(function (Batch $batch) {
        // The batch has finished executing...
    })->dispatch();
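
One thing to keep in mind: a job can only be added to a batch if it uses the Batchable trait. A minimal ImportCsv skeleton (the constructor arguments are just the ones from the example above) would look like this:

    use Illuminate\Bus\Batchable;
    use Illuminate\Bus\Queueable;
    use Illuminate\Contracts\Queue\ShouldQueue;
    use Illuminate\Foundation\Bus\Dispatchable;
    use Illuminate\Queue\InteractsWithQueue;
    use Illuminate\Queue\SerializesModels;

    class ImportCsv implements ShouldQueue
    {
        use Batchable, Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

        public $start;
        public $end;

        public function __construct($start, $end)
        {
            $this->start = $start;
            $this->end   = $end;
        }

        public function handle()
        {
            // Import the rows between $start and $end...
        }
    }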

CodePudding user response:

Thank you @Aro, your solution works. On the other hand, with your solution the delay is not respected: all the jobs are sent at the same time (I don't know why). So I used another method to deal with my problem. After building the array with all the data, I dispatch, on the redis queue connection (much faster than the mysql/database driver), a new job that does the MySQL insertions in the background, using the chunk method so as not to overload the MySQL server. This makes the script instantaneous.

In the controller, I fill the array:

array_push($ArrayToDispatch, array('id' => $id, 'Time' => $Time));

then :

NewJob::dispatch($ArrayToDispatch)->onConnection('redis');
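
For onConnection('redis') to work, the redis queue connection has to be configured in config/queue.php (and a queue worker has to listen on it). The default connection shipped with Laravel looks roughly like this:

    'connections' => [
        // ...
        'redis' => [
            'driver' => 'redis',
            'connection' => 'default',
            'queue' => env('REDIS_QUEUE', 'default'),
            'retry_after' => 90,
            'block_for' => null,
        ],
    ],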

Inside the new job (NewJob):

public $ChunkInsertToMs;

public function __construct($ChunkInsertToMs)
{
    $this->ChunkInsertToMs = $ChunkInsertToMs;
}

public function handle()
{
    // Split the data into chunks of 500 so the MySQL server is not overloaded
    $chunks = collect($this->ChunkInsertToMs)->chunk(500);
    foreach ($chunks as $chunk) {
        foreach ($chunk as $item) {
            RunDispatch::dispatch($item['id'])->delay($item['Time']);
        }
    }
}
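
For completeness, RunDispatch itself is just an ordinary queued job that receives the id; a minimal skeleton (the actual handle() logic is omitted) looks like this:

    use Illuminate\Bus\Queueable;
    use Illuminate\Contracts\Queue\ShouldQueue;
    use Illuminate\Foundation\Bus\Dispatchable;
    use Illuminate\Queue\InteractsWithQueue;
    use Illuminate\Queue\SerializesModels;

    class RunDispatch implements ShouldQueue
    {
        use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

        public $id;

        public function __construct($id)
        {
            $this->id = $id;
        }

        public function handle()
        {
            // Process the record with this id...
        }
    }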

Thank you all.
