I am using Express.js. I have an asynchronous function that takes about 20–30 seconds to do some work. After it finishes, it increments a counter for the user in the db. The user must wait at least 24 hours before making another request.
Before starting the ~30-second work, my function checks when the user's counter was last incremented. It only proceeds if the last update in the db happened more than 24 hours ago.
What if a user sends multiple requests in a very short time (say 5 requests in the same second)? Will the function start the work 5 times and increment the counter 5 times (because when each request checks the last update, it will find it was more than 24 hours ago, since none of the simultaneous requests has written to the db yet)? Or will it process requests one by one and not start a second request until the first one returns a response and ends? How can I prevent that problem from happening?
I want the api to process asynchronous requests from the same user ONE by ONE.
CodePudding user response:
Will the function start the work 5 times and increment his counter 5 times as a result?
Yes.
Or will it only process requests one by one and will not process a second request until the first one returns a response and ends?
No. Node.js is designed to be asynchronous and handles multiple requests concurrently (assuming the request handler is itself asynchronous, as in your case).
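A minimal sketch of the race, using an in-memory object as a stand-in for the real database row and a short sleep in place of the ~30 seconds of work (names like `naiveHandler` are hypothetical, not from the asker's code). All five concurrent calls read the last-run timestamp before any of them has written it back, so every one of them passes the check:

```javascript
const db = { lastRun: 0, counter: 0 }; // stand-in for the real database row
const DAY = 24 * 60 * 60 * 1000;
const sleep = (ms) => new Promise((r) => setTimeout(r, ms));

async function naiveHandler() {
  // 1. Check: all five concurrent requests read lastRun before any has updated it.
  if (Date.now() - db.lastRun < DAY) return 'rejected';
  await sleep(50); // stands in for the ~30 s of work
  // 2. Update: happens long after every check has already passed.
  db.counter += 1;
  db.lastRun = Date.now();
  return 'processed';
}

const done = Promise.all(Array.from({ length: 5 }, () => naiveHandler()))
  .then((results) => {
    console.log(results, '-> counter =', db.counter); // counter ends up at 5, not 1
    return db.counter;
  });
```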
How can I prevent that problem from happening?
Check and increment the counter before doing the work.
Also, make that check-and-update an atomic operation inside your database, so that there are absolutely no race conditions between multiple requests. Otherwise you would still run into issues with multiple checks happening before the updates; you would only have made them less likely by shrinking the race window from around 30 seconds to a few milliseconds.
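A sketch of the fixed flow, again with a hypothetical in-memory `db`. In a real setup the same guarantee should come from the database itself, e.g. a single conditional SQL `UPDATE ... WHERE last_run < NOW() - INTERVAL '24 hours'` whose affected-row count tells you whether this request won, or MongoDB's `findOneAndUpdate` with the date condition in the filter. Here, atomicity falls out of Node's single thread: there is no `await` between the check and the write, so no other request can interleave:

```javascript
const db = { lastRun: 0, counter: 0 }; // stand-in for the real database row
const DAY = 24 * 60 * 60 * 1000;
const sleep = (ms) => new Promise((r) => setTimeout(r, ms));

// Synchronous check-and-increment: on Node's single thread, nothing can
// run between the check and the write, so only one request can "claim" the slot.
function tryClaim() {
  const now = Date.now();
  if (now - db.lastRun < DAY) return false;
  db.lastRun = now;
  db.counter += 1;
  return true;
}

async function handler() {
  if (!tryClaim()) return 'rejected'; // counter already bumped before the work
  await sleep(50);                    // the ~30 s of work happens afterwards
  return 'processed';
}

const done = Promise.all(Array.from({ length: 5 }, () => handler()))
  .then((results) => {
    console.log(results, '-> counter =', db.counter); // exactly one 'processed'
    return { results, counter: db.counter };
  });
```

Note that the counter is incremented before the slow work starts, exactly as recommended above; if the work can fail and the increment should then be rolled back, that undo also belongs in the database transaction.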
I want the API to process asynchronous requests from the same user ONE by ONE.
You could do that by introducing a queue per user. However, that makes horizontal scaling harder, and keeping a request waiting for many seconds before responding is not good practice either; it will likely cause a timeout on the client. Instead, I would recommend responding immediately with a 409 (Conflict) or 429 (Too Many Requests) HTTP status code while another request from the same user is already being processed, and making sure in the client logic that requests are only sent one after the other.
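A hedged sketch of that per-user guard, with the Express plumbing stripped away so it runs standalone: `handleRequest` returns a plain `{ status, body }` object where a real app would call `res.status(...).send(...)`, and the in-flight set would need to live in shared storage (e.g. Redis) once you scale to multiple processes:

```javascript
const inFlight = new Set(); // user ids with a request currently being processed
const sleep = (ms) => new Promise((r) => setTimeout(r, ms));

async function handleRequest(userId) {
  if (inFlight.has(userId)) {
    // Answer immediately instead of queueing the second request.
    return { status: 409, body: 'A previous request is still being processed' };
  }
  inFlight.add(userId);
  try {
    await sleep(50); // the long-running work
    return { status: 200, body: 'done' };
  } finally {
    inFlight.delete(userId); // always release the slot, even if the work throws
  }
}

const done = Promise.all([
  handleRequest('alice'), // starts first, wins the slot
  handleRequest('alice'), // same user while busy: rejected with 409
  handleRequest('bob'),   // different user: unaffected
]).then((responses) => {
  const statuses = responses.map((r) => r.status);
  console.log(statuses); // [ 200, 409, 200 ]
  return statuses;
});
```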