How to run multiple PHP scripts synchronously while avoiding timeouts?


I saw some answers explaining how to run multiple PHP scripts synchronously from one PHP file with the exec function. For example:

<?php
// Each exec() blocks until the command finishes, so the scripts run in sequence
exec("php script1.php");
exec("php script2.php");
?>

But this is not what I want. My problem is that these scripts run intensive SQL queries, so if I do that, the calling PHP script will die with a timeout. Increasing the timeout is not a solution. I also don't want to execute these scripts asynchronously, to avoid overloading the MySQL database with all of them at the same time.

So I was thinking of doing it in JavaScript, or maybe with a cron job. In JS I would like to execute script2.php only once script1.php has finished, and so on. With jQuery I could do something like this:

<script>
myfunction();

function myfunction()
{
    longfunctionfirst().then(shortfunctionsecond);
}

function longfunctionfirst()
{
    var d = new $.Deferred();
    fetch("script1.php")
        .then(response => response.json())
        .then(function(data) {
            // Process data, then signal completion
            alert("first function finished");
            d.resolve();
        });
    return d.promise();
}

function shortfunctionsecond()
{
    var d = new $.Deferred();
    fetch("script2.php")
        .then(response => response.json())
        .then(function(data) {
            // Process data, then signal completion
            alert("second function finished");
            d.resolve();
        });
    return d.promise();
}
</script>

It would be OK if I had only 2 scripts, but I have more than 10.

Do you think I can find an effective solution with JS, jQuery, or with cron jobs?

CodePudding user response:

exec runs the command inside a shell; that is not very smooth, as it eats resources and adds security concerns.

The JavaScript strategy you are chasing is just half of the story: it is only the client side invoking different endpoints in sequence, each waiting for the previous one to finish responding.

Those endpoints still need to be served by your server-side application, and those requests will still be subject to the timeout.

Since those scripts are expected to take a long time and are not suitable to run within the span of an HTTP request, there is no way around running them asynchronously with respect to the response that will be sent back to the request initiator.

I'm not telling you to run 1, 2, 3 out of sequence... I'm just saying that even a single one of them takes too long, and that is what needs to be detached from the response.

So one of the many solutions you could take is the one suggested here on SO:

    ignore_user_abort(true); // not required
    set_time_limit(0);

    ob_start();
    // do initial processing here
    echo $response; // send the response
    header('Connection: close');
    header('Content-Length: ' . ob_get_length());
    ob_end_flush();
    @ob_flush();
    flush();
    fastcgi_finish_request(); // required for PHP-FPM (PHP >= 5.3.3)

    // Now the response has been sent to the browser, but the script
    // is still running, so you can continue...

    // In particular, call the entry points of your scripts here,
    // still respecting their sequence.
    // Before the sequence ends, write the status somewhere
    // (like in a file or in the DB), so that another endpoint
    // can check that status and tell you whether the process
    // has finished or is still running.

    die(); // a must, especially if set_time_limit(0) is used and the task ends
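
On the client side, the status check could be a small poll loop. A minimal sketch, assuming a hypothetical status.php endpoint that returns JSON like {"done": true} (the endpoint name and response shape are illustrative, not something from your code):

    // Poll the (hypothetical) status endpoint every few seconds until
    // the server-side sequence reports completion.
    function pollStatus() {
        fetch("status.php")                       // assumed endpoint name
            .then(response => response.json())
            .then(function (status) {
                if (status.done) {
                    console.log("all scripts finished");
                } else {
                    setTimeout(pollStatus, 5000); // try again in 5 seconds
                }
            });
    }
    pollStatus();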

But that is still not exactly what you asked. You insisted on running those fetch calls in sequence, and maybe it is not my business to push you toward a different approach. I tried fixing your JS code so that it would run the way you expected; unfortunately, doing fetches like that is very unfriendly unless you rely on your own CORS proxy, and I couldn't craft a working example here in the snippet.

I can try to explain a working approach, though. If you factor the function running the fetch so that it is async and general (call it with await if you can't easily deal with asynchronous events), then, when the general success callback is invoked, it needs to know which URL was fetched and where it sits in the list. To make sure the callback sees that information, you could, for example, put it in an object in the global scope. You can declare such a variable from any scope by simply omitting let/const/var before the assignment.

I will be heavily downvoted for giving such advice, but it is the easiest way to communicate a solution without having the chance to test my code.

The object will just contain: the list of scripts to run with their URLs, a cursor marking the i-th URL currently being processed, and, if you want, a different callback for each script.

Then you just call your function with await, passing it the list with the cursor set to zero. Every time the success callback runs, it fetches the URL at the cursor position, processes its data, increments the cursor, and calls the function again until the cursor equals the length of the URL list.

This was a very informal solution and not very detailed; the sketch below should give you the idea. I hope you've got the point of my suggestions and can get somewhere with your problem.
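
A minimal, untested sketch of that approach. The jobs object, the runSequence name, and the script URLs are all illustrative assumptions, not code from the question:

    // Global state visible to every callback (the implicit-global trick
    // described above; prefer proper scoping in production code).
    jobs = {
        urls: ["script1.php", "script2.php", "script3.php"], // your 10+ scripts
        cursor: 0
    };

    async function runSequence() {
        if (jobs.cursor >= jobs.urls.length) {
            console.log("all scripts finished");
            return;
        }
        const response = await fetch(jobs.urls[jobs.cursor]);
        const data = await response.json();
        // Process data for the current script here, then advance the cursor
        jobs.cursor++;
        return runSequence(); // call itself again for the next URL
    }

    runSequence();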

CodePudding user response:

One more way you can try is running the script in the background using exec:

exec("php script.php >/dev/null &");
