How does Node.js handle multiple requests without blocking?


I have been using Node.js for a while and wonder how it handles the case where multiple clients trigger blocking / time-consuming work in their responses.

Consider the following situation:

1. There are many endpoints, and one of them is time consuming, taking a few seconds to respond.
2. Suppose 100 clients simultaneously make requests to my endpoints, one of which is that slow endpoint.

Does that endpoint block the whole event loop and make the other requests wait?

Or, in general, do requests block each other in Node.js?

If not, why not? Node.js is single-threaded, so why don't they block each other?

CodePudding user response:

Node.js does use threads behind the scenes to perform I/O operations. To be more specific to your question: there is a limit at which a client will have to wait for an idle thread to perform a new I/O task.
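For instance (a minimal sketch; the pool size of 8 and the use of this script's own file are just illustrative assumptions), the libuv thread pool that backs fs, dns.lookup and some crypto calls defaults to 4 threads, and it can be resized with the UV_THREADPOOL_SIZE environment variable:

```js
// Run as: UV_THREADPOOL_SIZE=8 node pool-demo.js
// (the pool size of 8 is an arbitrary choice; the default is 4)
const fs = require('fs');

// Each readFile is dispatched to a libuv pool thread. With 8 threads,
// up to 8 reads can be in flight at once; the 9th must wait for an
// idle thread, which is the limit described above.
for (let i = 0; i < 9; i++) {
  fs.readFile(__filename, () => console.log(`read ${i} done`));
}
```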

You can build an easy toy example: run several I/O tasks concurrently (using Promise.all, for instance) and measure the time each one takes to finish. Then add another task and repeat. At some point you'll notice two groups, for example 4 requests that took 250 ms and 2 others that took 350 ms (and there you see "requests blocking each other").
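Here is a minimal sketch of that experiment (the task count and the choice of crypto.pbkdf2, which happens to run on the libuv thread pool, are assumptions made for illustration):

```js
const crypto = require('crypto');

// One thread-pool-bound task that reports how long it took to complete.
function hash(label) {
  const start = Date.now();
  return new Promise((resolve, reject) => {
    crypto.pbkdf2('password', 'salt', 200000, 64, 'sha512', (err) => {
      if (err) return reject(err);
      console.log(`${label} finished after ${Date.now() - start} ms`);
      resolve();
    });
  });
}

// Run 6 tasks concurrently. With the default pool size of 4, the timings
// typically fall into two groups: 4 tasks that finish together and 2 that
// had to wait for a free thread.
Promise.all([1, 2, 3, 4, 5, 6].map((n) => hash(`task ${n}`)))
  .then(() => console.log('all done'));
```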

Node.js is commonly referred to as single-threaded because of how it executes CPU-bound code by default (in contrast to its non-blocking I/O architecture). Therefore it is not very wise to use it for intensive CPU operations, but it is very efficient when it comes to I/O operations.
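To illustrate the CPU-bound case (a sketch; the 2-second busy loop just stands in for any heavy computation):

```js
// The timer below cannot fire until the synchronous loop releases the
// single main thread, so every other request would be stalled too.
setTimeout(() => console.log('timer fired'), 100);

const start = Date.now();
while (Date.now() - start < 2000) {
  // busy-wait: simulates a CPU-bound computation blocking the event loop
}
console.log('CPU work done; only now can the timer callback run');
```

For work like this, worker threads or a separate service are the usual way out, so the event loop stays free for I/O.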
