Nodejs parallel requests are getting slower

Time:10-28

I've run into a problem with Node where, for every network request I add to a parallel queue, the time until I get a response increases by a few milliseconds. Is that normal? If so, why? Shouldn't they all take the same time to respond?

Setup:
A local server always responds with the same 300 KB JSON payload.
Way of querying:

const Q = require('q')
let queue = []
for (var i = 100; i > 0; i--) {
    queue.push(getDamageInfo())
}

Q.allSettled(queue)
.then((result) => {})
const Q = require('q')
const request = require('request')

function getDamageInfo() {
    const id = '79d568d6-b820-40b4-845a-02228dcde338'
    const url = 'http://localhost:3001/damage/'
    let deferred = Q.defer()
    let start = new Date()
    request.get(url + id, {
        'auth': {
            'bearer': token
        }
    }, (err, res, body) => {
        let end = new Date() - start
        console.log('get-damage-info: ' + id + ' %dms', end)
        if (err || res.statusCode != 200) {
            deferred.reject({ statusCode: res && res.statusCode, error: err })
        } else {
            deferred.resolve(JSON.parse(body))
        }
    })

    return deferred.promise
}
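(For reference, the `Q.defer()` pattern above can be expressed with native Promises on any recent Node version, and `Q.allSettled` has a built-in counterpart in `Promise.allSettled`. A minimal sketch of the deferred-to-Promise translation, using a hypothetical callback-style function `addLater` for demonstration:)

```javascript
// Sketch: wrap a Node-style callback function in a native Promise,
// equivalent to the Q.defer()/resolve/reject pattern in the question.
// Node's built-in util.promisify does the same thing.
function promisify(fn) {
  return (...args) =>
    new Promise((resolve, reject) => {
      fn(...args, (err, result) => (err ? reject(err) : resolve(result)))
    })
}

// hypothetical callback-style function, stands in for request.get here
function addLater(a, b, cb) {
  setImmediate(() => cb(null, a + b))
}

const addAsync = promisify(addLater)
addAsync(2, 3).then((sum) => console.log('sum:', sum))
```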

console output:

get-damage-info: 79d568d6-b820-40b4-845a-02228dcde338 43ms
get-damage-info: 79d568d6-b820-40b4-845a-02228dcde338 44ms
get-damage-info: 79d568d6-b820-40b4-845a-02228dcde338 46ms
get-damage-info: 79d568d6-b820-40b4-845a-02228dcde338 48ms
get-damage-info: 79d568d6-b820-40b4-845a-02228dcde338 51ms
.
.
.
get-damage-info: 79d568d6-b820-40b4-845a-02228dcde338 223ms
get-damage-info: 79d568d6-b820-40b4-845a-02228dcde338 224ms
get-damage-info: 79d568d6-b820-40b4-845a-02228dcde338 225ms

I don't understand why the response time already increases by almost 10 ms between request 1 and request 5 with only 5 parallel requests. It happens with every library I've tried (https, axios, request).

Thank you in advance and have a nice day

CodePudding user response:

Actually, this is expected. You would see the same result with any technology: it is natural that the more concurrent requests you send to a server, the more time it eventually needs to process them.

Your test code sends 100 concurrent requests.

You need to understand that Node.js also has a limited amount of resources. Under the hood, libuv (the library that implements asynchronous I/O for Node.js) uses a thread pool, and there is a queue of jobs it has to work through. The more work in the queue, the longer you wait before your job is done. (There are actually multiple queues, but that is an implementation detail.)
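You can see the queueing effect without any networking at all. In this sketch (my own illustration, not from the question), each queued callback does ~5 ms of synchronous work, so callbacks further back in the queue observe a larger elapsed time, just like the later requests in your log:

```javascript
// Sketch: jobs queued on the event loop finish progressively later,
// because each one must wait for the jobs scheduled ahead of it.
function queuedJobs(n, workMs) {
  return new Promise((resolve) => {
    const start = Date.now()
    const finished = []
    for (let i = 0; i < n; i++) {
      setImmediate(() => {
        const busy = Date.now()
        while (Date.now() - busy < workMs) {} // simulate synchronous work
        finished.push(Date.now() - start)
        if (finished.length === n) resolve(finished)
      })
    }
  })
}

queuedJobs(5, 5).then((times) => {
  // elapsed times grow roughly linearly with queue position
  console.log(times.map((t) => t + 'ms').join(', '))
})
```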

Also, in the end, there are hardware limitations: if your CPU has 4 cores, all of the work is performed by those 4 cores. Giving your computer more work doesn't increase the number of cores; it increases the overall response time.
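If you want per-request latency to stay flat, a common workaround is to cap how many requests are in flight at once instead of firing all 100 immediately. A minimal sketch (my own, not part of the original answer) that runs an array of task functions with at most `limit` running concurrently:

```javascript
// Sketch: run promise-returning task functions with bounded concurrency.
// `tasks` is an array of functions; at most `limit` run at the same time.
async function runLimited(tasks, limit) {
  const results = new Array(tasks.length)
  let next = 0
  async function worker() {
    while (next < tasks.length) {
      const i = next++ // claim the next task index synchronously
      results[i] = await tasks[i]()
    }
  }
  const workers = Array.from(
    { length: Math.min(limit, tasks.length) },
    worker
  )
  await Promise.all(workers)
  return results
}
```

In the question's code this would mean pushing `() => getDamageInfo()` (a function, so the request doesn't start immediately) instead of `getDamageInfo()`, then calling `runLimited(queue, 5)`.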
