Handle Multiple Concurrent Requests for Express Server on the Same API Endpoint


This question might be a duplicate, but I am still not getting the answer. I am fairly new to Node.js, so I might need some help. Many have said that Node.js is perfectly free to run incoming requests asynchronously, but the code below shows that if multiple requests hit the same endpoint, say /test3, the callback function will:

  1. Print "test3"
  2. Call setTimeout() to avoid blocking the event loop
  3. Wait for 5 seconds and send a response of "test3" to the client

My question is this: if client 1 and client 2 call the /test3 endpoint at the same time, and we assume client 1 hits the endpoint first, does client 2 have to wait for client 1 to finish before its request enters the event loop?

Can anybody here tell me if it is possible for multiple clients to call a single endpoint and have their requests run concurrently, not sequentially, something like a one-thread-per-connection analogy?

Of course, if I call another endpoint such as /test1 or /test2 while the code for /test3 is still executing, I still get the "test2" response from /test2 immediately.

app.get("/test1", (req, res) => {
  console.log("test1");
  setTimeout(() => res.send("test1"), 5000);
});

app.get("/test2", async (req, res, next) => {
  console.log("test2");
  res.send("test2");
});

app.get("/test3", (req, res) => {
  console.log("test3");
  setTimeout(() => res.send("test3"), 5000);
});
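
A minimal way to test this outside the browser is a small client script. The following is just a sketch; it assumes the server above listens on http://localhost:3000 and that you are on Node 18+ (for the global fetch):

// Fire two requests at /test3 at the same time and log how long each takes.
// If the handler really is non-blocking, both should finish after roughly
// 5 seconds, not 5 and 10.
const url = "http://localhost:3000/test3";

async function timedFetch(label) {
  const start = Date.now();
  const res = await fetch(url);
  const body = await res.text();
  console.log(`${label}: got "${body}" after ${Date.now() - start} ms`);
}

// Start both requests without awaiting the first one.
Promise.all([timedFetch("client 1"), timedFetch("client 2")]);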

CodePudding user response:

This question needs more details to be answered and is clearly an opinion-based question, but just because it rests on a strawman argument, I will answer it.

First of all, we need to define "run concurrently". It is ambiguous: if we take the literal meaning, then in strict theory nothing RUNS CONCURRENTLY.

A CPU core can only carry out one instruction at a time.

The speed at which the CPU can carry out instructions is called the clock speed, and it is controlled by a clock. With every tick of the clock, the CPU fetches and executes one instruction. Clock speed is measured in cycles per second, and 1 cycle per second is known as 1 hertz. This means that a CPU with a clock speed of 2 gigahertz (GHz) can carry out 2,000 million (two billion) cycles per second. That is how a single CPU appears to run multiple tasks "concurrently".

Yes, you're right that nowadays computers, and even cell phones, come with multiple cores, which means the number of tasks running at the same time depends on the number of cores. But if you ask any expert, such as this Associate Staff Engineer (AKA me), they will tell you that you will very, very rarely find a server with more than one core. Why would you spend 500 USD on a multi-core server when you can spawn a whole bunch of ...nano instances, or whatever option is available in the free trial, with Kubernetes?
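
That said, if you do have multiple cores, Node's built-in cluster module is one way to use them: fork one worker process per core and let them share a port. A minimal sketch, reusing the /test3 handler from the question and an assumed port 3000:

const cluster = require("cluster");
const os = require("os");

if (cluster.isPrimary) { // "isMaster" on Node versions before 16
  // Fork one worker per CPU core; each worker re-runs this file.
  for (let i = 0; i < os.cpus().length; i++) {
    cluster.fork();
  }
} else {
  const express = require("express");
  const app = express();
  app.get("/test3", (req, res) => {
    console.log(`test3 handled by pid ${process.pid}`);
    setTimeout(() => res.send("test3"), 5000);
  });
  app.listen(3000); // workers share the same port
}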

Another thing: why would you configure Node to be in charge of the routing? Let Apache and/or nginx worry about that.

As you mentioned, there is this thing called the event loop, which is a fancy name for a FIFO (first in, first out) queue data structure.
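
A tiny illustration of that queue, loosely speaking:

console.log("first");                       // runs synchronously
setTimeout(() => console.log("third"), 0);  // callback is queued for later
console.log("second");                      // still synchronous
// Output: first, second, third. The timer callback waits its turn in the
// queue and only runs once the synchronous code has finished.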

So in other words: no. Node.js, like any other programming language out there, will not run your requests truly in parallel on a single core.

But it definitely depends on your infrastructure.

CodePudding user response:

For those who have visited: it has nothing to do with blocking of the event loop.

I have found something interesting. The answer to the question can be found in the link below.

When I was using Chrome, the requests kept getting blocked after the first request. With Safari, however, I was able to hit the endpoint concurrently. For more details, see the following link:

GET requests from Chrome browser are blocking the API to receive further requests in NODEJS
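
In short, Chrome holds back a second identical GET to the same URL while the first, possibly cacheable, response is still pending; Safari does not. One workaround in the spirit of that thread, sketched here as an assumption rather than a verified fix, is to mark the response as non-cacheable so Chrome has no reason to wait:

app.get("/test3", (req, res) => {
  res.set("Cache-Control", "no-store"); // nothing to cache, so no reason to queue
  console.log("test3");
  setTimeout(() => res.send("test3"), 5000);
});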
