Reading a JSON file uses a lot of memory in node

Time:11-14

I am a bit surprised that reading a 274.9 MB JSON file and storing it in a variable as an array of objects causes node.js to use 1.1 GB of Resident Set Size (RSS) memory.

How can this be? It seems a bit excessive.

import { readFile } from 'fs/promises'

const raw = await readFile('big.json', 'utf8')
const file = JSON.parse(raw)

console.log('Length: ', file.length)
console.log(
  `Memory: ${Math.round((process.memoryUsage().rss / 1024 / 1024) * 100) / 100} MB`
)
Length: 920885
Memory: 1193.05 MB
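As a side note, `rss` is only one of several fields that `process.memoryUsage()` reports; printing all of them separates the V8 heap (where the parsed objects live) from memory held outside it. A minimal sketch, reusing the same MB rounding as above:

```javascript
// Print every field of process.memoryUsage() in MB, not just rss.
// heapUsed counts live JS objects; external counts Buffers and other
// memory allocated outside the V8 heap; rss is the whole process.
const mb = (bytes) => Math.round((bytes / 1024 / 1024) * 100) / 100

const usage = process.memoryUsage()
for (const [key, value] of Object.entries(usage)) {
  console.log(`${key}: ${mb(value)} MB`)
}
```

Comparing `heapUsed` against `rss` makes it easier to see how much of the footprint is the parsed data versus everything else the process holds.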

Here is one object from the array as an example:

{
  keyword: '1 hour circuit training',
  url: 'https://www.pinterest.com/pin/457467274629879495',
  rank: 1,
  page: 1,
  type: 'inline_images',
  title: 'image result',
  domain: '-',
  sitelinks: false
}
.... 

CodePudding user response:

This is about the expected size. At least half a gigabyte is taken up by the string content of big.json: JavaScript strings are sequences of 16-bit code units (UTF-16), so each character can take 2 bytes in memory. Note that you never release that string - the `raw` variable still references the full file contents when you take the measurement, so it cannot be garbage-collected. It is hard to tell exactly what the memory layout of an "array of objects" is, but objects carry per-property overhead (property names, hidden-class metadata, pointers), and every string-valued property again costs roughly 2 bytes per character. All in all, this memory usage is realistic and not entirely unexpected.
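To see the effect of the raw string staying alive, here is a hypothetical sketch that builds a throwaway payload (standing in for big.json, which we don't have), parses it, then drops the reference to the source text and forces a GC. Run with `node --expose-gc` so `global.gc` is available; the exact numbers will vary by machine and V8 version.

```javascript
// Run with: node --expose-gc measure.js
// Compares RSS while the raw JSON string is still referenced vs after
// the reference is dropped and a GC is forced.
const mb = (bytes) => Math.round((bytes / 1024 / 1024) * 100) / 100

// Generated payload standing in for big.json (assumption, not the
// asker's real data).
const rows = Array.from({ length: 100000 }, (_, i) => ({
  keyword: `term ${i}`,
  rank: i,
  sitelinks: false,
}))
let raw = JSON.stringify(rows)

const parsed = JSON.parse(raw)
console.log(`RSS with raw string alive: ${mb(process.memoryUsage().rss)} MB`)

raw = null                   // drop the only reference to the source text
if (global.gc) global.gc()   // only defined when run with --expose-gc
console.log(`RSS after releasing raw:  ${mb(process.memoryUsage().rss)} MB`)
console.log(`Parsed ${parsed.length} objects`)
```

In the asker's code the same idea applies: once `JSON.parse` has run, setting `raw` to null (or scoping it so it goes out of reach) lets the garbage collector reclaim the string, though RSS may not shrink immediately since the OS can keep pages mapped.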
