I have code that reads a .csv file line by line and saves each parsed row to the database:
const csv = require('csv-parse')

const errors = []
csv.parse(content, {})
  .on('data', async function (row: any) {
    const error = await tryToSaveToDatabase(row)
    if (error) {
      errors.push(error)
    }
  })
  .on('end', function () {
    // somehow process all errors
  })
Unfortunately, the .on('end', ...) callback fires before all of the awaited saves in the 'data' handler have completed.
I have read NodeJs Csv parser async operations - it seems we cannot use await inside the .on('data', ...) callback.
What is the correct way to do this if I want to read the .csv line by line (files may be very large, so it must be processed in a streaming manner) and collect errors while saving to the database? (These errors are then displayed on the frontend.)
CodePudding user response:
https://csv.js.org/parse/api/async_iterator/ The async iterator API reads the .csv line by line while still streaming, and because you consume rows with for await inside your own async function, each await tryToSaveToDatabase(row) finishes before the next row is processed.