Pass large array of objects to RabbitMQ exchange


I receive a large array of objects from an external source (more than 10,000 objects) and then pass it to an exchange in order to notify other microservices about the new entries they need to handle.

this._rmqClient.publishToExchange({
    exchange: 'my-exchange',
    exchangeOptions: {
        type: 'fanout',
        durable: true,
    },
    data: myData, // [object1, object2, object3, ...]
    pattern: 'myPattern',
})

The problem is that it's bad practice to push such a large message to an exchange, and I'd like to resolve this. I've read articles and Stack Overflow posts looking for code examples or information about streaming the data, but without success.

The only way I've found is to divide the large array into chunks and publish each one to the exchange in a loop. Is that good practice? How do I determine how long each chunk (number of objects) should be? Or is there another approach?

CodePudding user response:

It really depends on the object size, and that's something you'll have to figure out yourself. Take your 10k objects and calculate their average size (for example, serialize them to JSON in a file and divide the file size by 10,000). A message body of around 50-100 kB is usually a reasonable target, but that's still up to you.
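
For example, a rough sketch of that estimate (assuming your objects are plain JSON-serializable data and myData is the array from your question) could look like this:

// Rough sketch: estimate the average serialized size of one object.
// Assumes myData holds the ~10k JSON-serializable objects from the question.
const totalBytes = Buffer.byteLength(JSON.stringify(myData), 'utf8')
const avgBytes = totalBytes / myData.length

// If you aim for roughly 50-100 kB per message, a first guess for the chunk size:
const targetBytes = 75 * 1024
const chunkSizeGuess = Math.max(1, Math.floor(targetBytes / avgBytes))
console.log({ avgBytes, chunkSizeGuess })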

Start with a chunk size of 50 and run tests. Measure the time taken, bandwidth, and anything else that makes sense. Vary the chunk size between 1 and 5000 and keep testing; at some point you'll get a feeling for which number works best.
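
If you want to automate those tests, a minimal timing sketch (using the sendLargeDataPacket helper shown further down; the candidate sizes are only examples) could be:

// Rough benchmark sketch: time a full publish for a few candidate chunk sizes.
// Uses the sendLargeDataPacket helper defined further down.
async function compareChunkSizes(data, candidates = [50, 200, 1000, 5000]) {
    for (const size of candidates) {
        const start = Date.now()
        await sendLargeDataPacket(data, size)
        console.log(`chunkSize=${size} took ${Date.now() - start} ms`)
    }
}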

Here's some example code for looping through the elements in chunks:


// send() publishes one chunk; it assumes it lives on the same class/service
// that owns this._rmqClient, so `this` is bound correctly.
function send(data) {
    return this._rmqClient.publishToExchange({
        exchange: 'my-exchange',
        exchangeOptions: {
            type: 'fanout',
            durable: true,
        },
        data: data,
        pattern: 'myPattern',
    })
}

// Sends the data in chunks, one after the other.
async function sendLargeDataPacket(data, chunkSize) {

    // Work on a copy so the caller's array is not mutated.
    const remaining = [...data]

    // Send full chunks as long as possible.
    while (remaining.length >= chunkSize) {

        // Send one chunk of chunkSize elements.
        await send(remaining.splice(0, chunkSize))

    }

    // Send the remaining elements, if there are any.
    if (remaining.length > 0) {
        await send(remaining)
    }

}

And you would call it like:

// That's your ~10k item array.
const myData = [/* ... */]

// Let's start with 50, but try other values as well.
const chunkSize = 50
sendLargeDataPacket(myData, chunkSize)
    .then(() => console.log('done'))
    .catch(console.error)

This approach sends one packet after the other, and may take some time since it is not done in parallel. I don't know your requirements, but I can help you write a parallel approach if you need one.
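
As a rough idea, a parallel variant could publish a few chunks at a time with Promise.all (the concurrency value is only an example, and it reuses the send() helper from above):

// Rough sketch of a parallel variant: publishes `concurrency` chunks at a time.
async function sendLargeDataPacketParallel(data, chunkSize, concurrency = 5) {

    // Split the data into chunks first.
    const chunks = []
    for (let i = 0; i < data.length; i += chunkSize) {
        chunks.push(data.slice(i, i + chunkSize))
    }

    // Publish the chunks in groups of `concurrency`.
    for (let i = 0; i < chunks.length; i += concurrency) {
        await Promise.all(chunks.slice(i, i + concurrency).map((chunk) => send(chunk)))
    }
}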
