How to handle high response time


There are two services. One (Django) receives the request from the front-end and then calls an API in the other service (Flask).

However, the response time of the Flask service is high, and if the user navigates to another page, that request gets cancelled.

Should this be a background task or a pub/sub pattern? If so, how do I run it in the background and then tell the user "here is your last result"?

CodePudding user response:

You have two main options:

  • Make an initial request to a "simple" Django view that loads a skeleton HTML page with a spinner. Some JS on that page then triggers an XHR request to a second Django view, which makes the call to the other service (Flask). This way you can properly tell the user that loading takes time, and handle the exit on the browser side (ask for confirmation before leaving, abort the request, ...). A sketch of this follows the list.

  • If possible, cache the result of the Flask service so you don't need to call it on every page load.
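
For the first option, a minimal sketch of the two Django views could look like this (the Flask endpoint URL, view names, and template name are placeholders, not your actual ones):

```python
# views.py -- a minimal sketch; the Flask endpoint, view names and the
# template name are hypothetical placeholders.
import requests
from django.http import JsonResponse
from django.shortcuts import render

FLASK_API_URL = "http://flask-service:5000/api/expensive-report"  # assumed endpoint

def report_page(request):
    # Fast view: only renders a skeleton page with a spinner; JS on that
    # page then fires an XHR to report_data().
    return render(request, "report_skeleton.html")

def report_data(request):
    # Slow view: called asynchronously by the browser, so the initial page
    # load is never blocked by the Flask service.
    resp = requests.get(FLASK_API_URL, timeout=120)
    resp.raise_for_status()
    return JsonResponse(resp.json())
```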

You can combine these two solutions by calling the service in an asynchronous request and caching its result (depending on the context, you may need to key the cache per connected user, for example). The first solution could also be implemented with pub/sub, WebSockets, or whatever you like, but a classic XHR seems fine for your case.
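
A rough sketch of the combined approach, caching the Flask response per user with Django's cache framework (the cache key format and timeout are assumptions to adapt to your case):

```python
# views.py -- same report_data view as above, now caching the Flask response.
import requests
from django.core.cache import cache
from django.http import JsonResponse

FLASK_API_URL = "http://flask-service:5000/api/expensive-report"  # assumed endpoint
CACHE_TTL = 15 * 60  # seconds; tune to how fresh the data must be

def report_data(request):
    # Key the cache per user so one user's result is never served to another.
    cache_key = f"flask-report:{request.user.pk}"
    data = cache.get(cache_key)
    if data is None:
        resp = requests.get(FLASK_API_URL, timeout=120)
        resp.raise_for_status()
        data = resp.json()
        cache.set(cache_key, data, CACHE_TTL)
    return JsonResponse(data)
```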

CodePudding user response:

On our project, we have a couple of time-expensive endpoints. Our solution was similar to the previous answer: once we receive a request, we call a Celery task that does its expensive work asynchronously. We do not wait for its result and return a quick response to the user immediately. The Celery task then sends its progress/result to the user via WebSockets, and the frontend handles this WS message. The benefit of this approach is that we do not spend the CPU of our backend; we spend the CPU of the Celery worker that is running on another machine.
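
A minimal sketch of that pattern, assuming Celery and Django Channels are already configured (the endpoint URL, group name, and message type below are made-up examples, not the actual project's code):

```python
# tasks.py -- enqueue-and-notify flow: the web process returns at once,
# the worker calls Flask and pushes the result over a WebSocket group.
import requests
from asgiref.sync import async_to_sync
from celery import shared_task
from channels.layers import get_channel_layer
from django.http import JsonResponse

FLASK_API_URL = "http://flask-service:5000/api/expensive-report"  # assumed endpoint

@shared_task
def fetch_report(user_id):
    # Runs on a Celery worker, so the Django web process is never blocked.
    resp = requests.get(FLASK_API_URL, timeout=300)
    resp.raise_for_status()
    # Push the result to the user's WebSocket group; a Channels consumer
    # subscribed to "user-<id>" forwards it to the browser.
    channel_layer = get_channel_layer()
    async_to_sync(channel_layer.group_send)(
        f"user-{user_id}",
        {"type": "report.result", "payload": resp.json()},
    )

# views.py -- dispatch the task and answer immediately.
def start_report(request):
    fetch_report.delay(request.user.pk)
    return JsonResponse({"status": "started"})
```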
