In Node.js, why does my last forked worker process handle every request, although 7 worker processes are running?

Time: 02-18

I want to use the cluster module to run an Express server with multiple worker processes side by side. Here is my complete script.

const express = require('express');
const cluster = require('cluster');
const os = require('os');


const totalCPUs = os.cpus().length;
const PORT = 3000;


if (cluster.isMaster) {

  console.log(`Number of logical CPUs available: ${totalCPUs}`);

  console.log(`Master ${process.pid} is running`);

  // Fork workers
  for (let i = 0; i < totalCPUs; i++) {
    cluster.fork();
  }

  cluster.on('exit', (worker, code, signal) => {
    console.log(`worker ${worker.process.pid} died`);
    console.log("Let's fork another worker!");
    cluster.fork();
  });
} else {

  const app = express();
  console.log(`Worker ${process.pid} started`);
  app.get('/', (req, res) => {
    res.send(`Performance example: ${process.pid} `);
  });

  function spin(duration) {
    const startTime = Date.now();

    while (Date.now() - startTime < duration) {
      // block the event loop
    }
  }
  app.get('/delay', (req, res) => {
    // after some delay
    spin(9000);
    res.send(`Ding ding ding! ${process.pid}`);
  });

  app.listen(PORT, () => {
    console.log(`Worker process ${process.pid} Listening on ${PORT}`);
  });
}

Whenever I open multiple tabs in the browser and hit the same http://localhost:3000/delay endpoint, all requests are handled in series by only the last created child process. How can I make use of all the remaining child processes?

CodePudding user response:

If you're making multiple copies of the exact same request from the browser, then the browser itself may hold your 2nd request until the 1st one returns (probably in hopes of using a cached result). If you want to defeat this browser "feature", you can add a unique query parameter to each URL with something like this:

const mainUrl = "http://localhost:3000/delay";
const uniqueUrl = mainUrl + "?r=" + Math.random();

Then, send your request from the browser to uniqueUrl. That will keep the browser from serializing them in the hopes of using a previously cached result. Generate a new uniqueUrl for each request you send from the browser.

Your server will ignore the query parameter, but the browser will think each URL is different and thus will not force the 2nd request to wait for the 1st to respond, and so on, allowing you to get some requests to the same route in parallel. Keep in mind that the browser still has rules about how many simultaneous requests it will make to the same host, so you still won't necessarily get all requests in parallel, but you should get some.
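A minimal browser-side sketch of this cache-busting idea (the `uniqueUrl` helper name and the request count of 4 are illustrative, not from the original answer):

```javascript
const mainUrl = "http://localhost:3000/delay";

// Build a cache-busting URL; the random query parameter makes each
// request look distinct to the browser, so it won't serialize them
// while waiting for a cacheable response.
function uniqueUrl(base) {
  return base + "?r=" + Math.random();
}

// Generate one unique URL per request you intend to fire in parallel.
const urls = Array.from({ length: 4 }, () => uniqueUrl(mainUrl));
console.log(urls);

// In the browser you would then fire them all at once, e.g.:
// Promise.all(urls.map((u) => fetch(u).then((r) => r.text())))
//   .then((bodies) => console.log(bodies)); // should show different PIDs
```

The server's route handler never reads `req.query.r`, so the extra parameter is harmless on the backend.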


Or, use separate client applications to send each request so a single browser client won't be blocking you.

Or, send the multiple requests from a client (like a Node.js test app) that doesn't do the "hold for caching" that the browser does.
