I have a long-running job that, if it fails, I want to delete permanently so that it is never retried or re-attempted.
Here is what I tried:
// config/queue.php
'database-large-reports-on-web-server' => [
    'driver' => 'database',
    'table' => 'jobs',
    'queue' => 'ten_minuite_queue_web_server',
    'retry_after' => 1800,
],
Here is how I dispatch the job:
OrdersExportJob::dispatch()->onConnection('database-large-reports-on-web-server');
// OrdersExportJob
class OrdersExportJob implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public $timeout = 400;
    public $tries = 1;

    public function handle() { ... }
}
My supervisor configuration:
[program:laravel-worker]
process_name=%(program_name)s_%(process_num)02d
command=php /var/www/ymyapp/artisan queue:work database-large-reports-on-web-server
autostart=true
autorestart=true
stopasgroup=true
killasgroup=true
user=root
numprocs=8
redirect_stderr=true
stdout_logfile=/var/log/worker.log
stopwaitsecs=9999
Result: the job keeps being retried after it fails.
CodePudding user response:
How about manually deleting the job when it fails?
class OrdersExportJob implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    // ....

    public function failed(Throwable $exception)
    {
        $this->delete();
    }
}
The delete() method is defined on the Illuminate\Contracts\Queue\Job contract, and becomes accessible on your job class when the Illuminate\Queue\InteractsWithQueue trait is used.
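Separately from deleting the queue record, with the database driver a failing job normally also gets logged in the failed_jobs table. Assuming a standard Laravel install, those entries can be inspected and cleared with the built-in artisan commands:

```shell
# List all failed jobs with their IDs
php artisan queue:failed

# Delete a single failed job by its ID (5 here is just an example ID)
php artisan queue:forget 5

# Delete all failed jobs
php artisan queue:flush
```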
CodePudding user response:
You should remove the $tries property from your job; you do not need it if you do not want your job to be re-attempted.
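As an alternative to setting it on the class, the attempt limit can also be capped on the worker itself. A sketch of the supervisor command line, assuming the same paths as in the question, using queue:work's --tries and --timeout options:

```shell
php artisan queue:work database-large-reports-on-web-server --tries=1 --timeout=400
```

With --tries=1, any job that throws is sent straight to the failed_jobs table instead of being released back onto the queue.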
CodePudding user response:
Try adding a unique ID to every job and storing a list of every ID you have seen in the database.
Then discard any job whose ID is already in the list.
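The dedup idea above can be sketched in plain PHP. This is an illustrative, framework-free sketch: the SeenJobs class and shouldRun() method are hypothetical names, and a real implementation would back the seen-ID set with a database table rather than an in-memory array.

```php
<?php
// Hypothetical sketch: track processed job IDs and discard duplicates.
// In production the $seen set would live in a DB table, not in memory.
class SeenJobs
{
    /** @var array<string, true> IDs of jobs already processed */
    private array $seen = [];

    // Returns true the first time an ID is seen, false on any repeat.
    public function shouldRun(string $jobId): bool
    {
        if (isset($this->seen[$jobId])) {
            return false; // duplicate: discard the job
        }
        $this->seen[$jobId] = true;
        return true; // first sighting: run the job
    }
}

$tracker = new SeenJobs();
var_dump($tracker->shouldRun('job-1')); // first attempt is allowed
var_dump($tracker->shouldRun('job-1')); // repeat is discarded
```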