Node readFile inside a Loop Memory overflow


I have a folder with 98 .csv files that I need to read and save to a Postgres database. I'm running this code:

const fs = require('fs');
const path = require('path');

// Files in folder
const filesInFolder = fs.readdirSync('../pages-csv');

for (let i = 0; i < filesInFolder.length; i++) {
    console.log(`Saving file ${filesInFolder[i]}`)
    const data = await fs.promises.readFile(path.join(__dirname, `../pages-csv/${filesInFolder[i]}`), 'utf8')
    await createOne(data) // database function
}

But it leads to this error:

FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory

It seems like the files are not being closed, or not being flushed from memory, after their data is inserted into the database.

CodePudding user response:

Have you tried streaming it? Something like:

const fs = require('fs')
const path = require('path')
const csv = require('csv-parser')

const filesInFolder = fs.readdirSync(path.join(__dirname, '../pages-csv'))

for (let i = 0; i < filesInFolder.length; i++) {
    console.log(`Saving file ${filesInFolder[i]}`)

    // stream the file instead of reading it into memory at once
    fs.createReadStream(path.join(__dirname, '../pages-csv', filesInFolder[i]))
      .pipe(csv())
      .on('data', function (row) {
        createOne(row) // database function
      })
      .on('end', function () {
        console.log('End csv')
      })
}
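
One caveat with the snippet above: the 'data' handler calls createOne without awaiting it, so rows can still queue up faster than the database consumes them. A minimal sketch of one way to add backpressure, assuming createOne returns a promise (Node 10+ readable streams are async-iterable), is to consume the stream with for await...of:

const fs = require('fs')
const path = require('path')
const csv = require('csv-parser')

async function importAll() {
    const filesInFolder = fs.readdirSync(path.join(__dirname, '../pages-csv'))
    for (const file of filesInFolder) {
        const rows = fs.createReadStream(path.join(__dirname, '../pages-csv', file)).pipe(csv())
        // for await pauses the stream between rows, so only one row
        // is held in memory at a time
        for await (const row of rows) {
            await createOne(row) // assumed to return a promise
        }
        console.log(`Finished ${file}`)
    }
}

importAll().catch(console.error)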

Or you could force garbage collection manually: call global.gc() after exposing it by starting Node with --expose-gc.
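
For reference, a minimal sketch of that approach (global.gc is only defined when Node is started with the flag):

// run with: node --expose-gc app.js
if (global.gc) {
    global.gc() // force a full garbage-collection pass
} else {
    console.log('Start node with --expose-gc to enable global.gc()')
}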

CodePudding user response:

I found the problem: it was the createOne function. It had a forEach loop; I removed it and now it works!
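
For anyone hitting the same error: an async callback inside forEach is a common culprit, because forEach ignores the promises the callback returns, so every insert runs at once. A hypothetical sketch of that kind of fix (the question never shows createOne's internals, so rows and insertRow are made-up names):

// Before (hypothetical): all inserts fire concurrently and buffer in memory
rows.forEach(async (row) => {
    await insertRow(row)
})

// After: a plain for...of loop awaits each insert before starting the next
for (const row of rows) {
    await insertRow(row)
}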
