Node stream returns RangeError only for large files


I need to stream 1 billion decimal digits of pi and then do something with them (just for testing purposes, I'm getting the length). When I read a file with 1 million digits it logs the length normally, but when reading the file with 1 billion digits it throws "RangeError: Invalid string length". Why is this happening if there's apparently nothing wrong with the code?

import fs from "fs";

let pi = '';

const readStream = fs.createReadStream('pi-billion.txt', 'utf-8');

readStream.on('data', (chunk) => pi += chunk);

readStream.on('error', (error) => console.log(`error: ${error.message}`));

readStream.on('end', () => console.log(pi.length));

CodePudding user response:

Node/V8 has a maximum string length:

$ node -pe 'buffer.constants.MAX_STRING_LENGTH'
536870888

Buffers can be larger:

$ node -pe 'buffer.constants.MAX_LENGTH'       
4294967296
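
A string of one billion characters is well past MAX_STRING_LENGTH, so the += concatenation in the 'data' handler is what eventually throws. A minimal sketch (assuming a 64-bit Node build) that reproduces the same error by growing a string until it crosses the limit:

import { constants } from "buffer";

// Keep doubling a small string; the concatenation that would exceed
// V8's maximum string length throws the same RangeError the question hit.
let s = "3141592653";
try {
  while (s.length <= constants.MAX_STRING_LENGTH) {
    s += s;
  }
} catch (err) {
  console.log(`${err.name}: ${err.message}`); // RangeError: Invalid string length
}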

You could keep the data in chunks (or concatenate the chunks at regular intervals to ease processing) and store them in an array; then the next limit is Node's available memory.

const pi_chunks = []
let pi_length = 0

readStream.on('data', (chunk) => {
  pi_chunks.push(chunk)   // keep each chunk instead of growing one huge string
  pi_length += chunk.length
})
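
To get the length the question was after, an 'end' handler can report the running total. And if the eventual processing can work chunk by chunk, the pieces never need to be joined into a single string at all; the digit counting below is only an illustrative example, not part of the original answer:

readStream.on('end', () => {
  console.log(pi_length) // total number of characters streamed

  // Illustration of chunk-wise processing: tally digit occurrences
  // without ever materializing the full one-billion-character string.
  const counts = new Array(10).fill(0)
  for (const chunk of pi_chunks) {
    for (const ch of chunk) {
      if (ch >= '0' && ch <= '9') counts[+ch] += 1
    }
  }
  console.log(counts)
})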