I need to write to multiple dynamically changing files based on an array of objects passed to a custom writeData() function. Each object in the array contains the file name and the data to write, as shown below:
[
  {
    file_name: "example.json",
    dataObj,
  },
  {
    file_name: "example2.json",
    dataObj,
  },
  {
    file_name: "example3.json",
    dataObj,
  },
  {
    file_name: "example4.json",
    dataObj,
  },
];
My current method is to map over this array, reading each file and then writing the new data back to it:
array.map((entry) => {
  fs.readFile(entry.file_name, "utf8", (err, unparsedData) => {
    if (err) console.log(err);
    else {
      var parsedData = JSON.parse(unparsedData);
      parsedData.data.push(entry.dataObj);
      const parsedDataJSON = JSON.stringify(parsedData, null, 2);
      fs.writeFile(entry.file_name, parsedDataJSON, "utf8", (err) => {
        if (err) console.log(err);
      });
    }
  });
});
This, however, does not work. Only a small percentage of the data is written to the files, and oftentimes a file ends up containing invalid JSON (I suspect this is because two writeFile calls run against the same file at once and corrupt it). Obviously this does not work the way I expected it to.
I have tried several ways to resolve this problem, such as making fs.writeFile synchronous (delaying the map loop so each write finishes before moving on to the next entry), but that is not good practice since synchronous calls hang up the entire app. I have also looked into implementing promises, but to no avail. I am new to Node.js, so apologies for any missing details or information. Any help is appreciated!
CodePudding user response:
The same file is often listed multiple times in the array if that changes anything.
Well, that changes everything. You should have shown that in the original question. If that is the case, then you have to sequence the files in the loop so each one finishes before you advance to the next. To prevent conflicting writes to the same file, you have to assure yourself of two things:
- You sequence the files in the loop so the next one doesn't start until the previous one is done.
- You don't call this code again while it's still in operation.
You can assure yourself of the first item like this:
const fs = require("fs");

async function processFiles(array) {
  for (const entry of array) {
    // read, parse, and update each file, waiting for the write to
    // finish before moving on to the next entry
    const unparsedData = await fs.promises.readFile(entry.file_name, "utf8");
    const parsedData = JSON.parse(unparsedData);
    parsedData.data.push(entry.dataObj);
    const json = JSON.stringify(parsedData, null, 2);
    await fs.promises.writeFile(entry.file_name, json, "utf8");
  }
}
This will abort the loop if it gets an error on any of them. If you want it to continue to write the others, you can add a try/catch internally:
async function processFiles(array) {
  let firstError;
  for (const entry of array) {
    try {
      const unparsedData = await fs.promises.readFile(entry.file_name, "utf8");
      const parsedData = JSON.parse(unparsedData);
      parsedData.data.push(entry.dataObj);
      const json = JSON.stringify(parsedData, null, 2);
      await fs.promises.writeFile(entry.file_name, json, "utf8");
    } catch (e) {
      // log the error and continue with the rest of the loop
      if (!firstError) {
        firstError = e;
      }
      console.log(e);
    }
  }
  // make sure we communicate back any error that happened
  if (firstError) {
    throw firstError;
  }
}
To assure yourself of the second point above, you will have to either not use setInterval() (replace it with a setTimeout() that you set when the promise returned by processFiles() resolves), or make absolutely sure that the setInterval() time is long enough that it will never fire before processFiles() is done.
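Here is a minimal sketch of that self-rescheduling setTimeout() pattern (buildArray() and the 5-second delay are hypothetical placeholders for however you assemble your entries and how often you want to run):

async function runPeriodically() {
  try {
    const array = buildArray(); // hypothetical: build the entries to process
    await processFiles(array);  // wait until every file has been written
  } catch (e) {
    console.log(e);
  } finally {
    // schedule the next run only after this one has completely finished,
    // so two runs can never overlap the way setInterval() would allow
    setTimeout(runPeriodically, 5000);
  }
}

runPeriodically();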
Also, make absolutely sure that you are not modifying the array used in this function while the function is running.
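If you can't guarantee that, one simple safeguard (my suggestion, assuming a shallow copy suits your data) is to hand processFiles() its own copy of the array:

// pass a shallow copy so later changes to the original array
// can't affect the loop while it is still awaiting file I/O
await processFiles([...array]);

Note that this protects against entries being added or removed mid-run, not against the entry objects themselves being mutated.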