My database.json file looks like this:
{
"opened_tickets": [
{
"userid": "4",
"ticketid": "customer_info",
"opened_timestamp": "1662543404514"
},
{
"userid": "2",
"ticketid": "customer_info",
"opened_timestamp": "1662543405860"
},
{
"userid": "4",
"ticketid": "customer_info",
"opened_timestamp": "1662543411026"
}
],
"tickets_counter": {
"account_problems": "1",
"customer_info": "4",
"payment_problems": "6",
"other": "8"
}
}
I want to delete the objects in the opened_tickets array whose userid and ticketid are the same (i.e. every object whose userid/ticketid pair occurs more than once) and then save the data back to the same file. So in this case database.json should look like this:
{
"opened_tickets": [
{
"userid": "2",
"ticketid": "customerinfo",
"opened_timestamp": "1662543405860"
}
],
"tickets_counter": {
"account_problems": "1",
"customer_info": "4",
"payment_problems": "6",
"other": "8"
}
}
How can I do this in node.js?
CodePudding user response:
So you group the items into smaller arrays, keyed by a combination of userid and ticketid. Then you keep only the groups of length == 1, meaning the pair had no duplicates, and unwrap each one back into a single item.
var obj = {
  opened_tickets: [
    { userid: "4", ticketid: "customer_info", opened_timestamp: "1662543404514" },
    { userid: "2", ticketid: "customer_info", opened_timestamp: "1662543405860" },
    { userid: "4", ticketid: "customer_info", opened_timestamp: "1662543411026" }
  ],
  tickets_counter: { account_problems: "1", customer_info: "4", payment_problems: "6", other: "8" }
};

var trimmed = Object.values(obj.opened_tickets.reduce(function (agg, item) {
  var key = item.userid + "|" + item.ticketid;
  agg[key] = agg[key] || [];
  agg[key].push(item);
  return agg;
}, {})).filter(function (item) {
  return item.length == 1;
}).map(function (item) {
  return item[0];
});

obj.opened_tickets = trimmed;
console.log(obj);
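An equivalent sketch that may be easier to follow: count each userid|ticketid pair in a first pass, then keep only the tickets whose pair appears exactly once. (Sample data copied from the question.)

```javascript
var obj = {
  opened_tickets: [
    { userid: "4", ticketid: "customer_info", opened_timestamp: "1662543404514" },
    { userid: "2", ticketid: "customer_info", opened_timestamp: "1662543405860" },
    { userid: "4", ticketid: "customer_info", opened_timestamp: "1662543411026" }
  ],
  tickets_counter: { account_problems: "1", customer_info: "4", payment_problems: "6", other: "8" }
};

// First pass: count occurrences of each userid|ticketid pair.
var counts = obj.opened_tickets.reduce(function (acc, item) {
  var key = item.userid + "|" + item.ticketid;
  acc[key] = (acc[key] || 0) + 1;
  return acc;
}, {});

// Second pass: keep only tickets whose pair occurred exactly once.
obj.opened_tickets = obj.opened_tickets.filter(function (item) {
  return counts[item.userid + "|" + item.ticketid] === 1;
});

console.log(obj.opened_tickets); // only the userid "2" ticket remains
```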
CodePudding user response:
First you'll need to read the contents of the file. In Node.js, you can do this using the fs module. The docs are pretty good, but there are multiple ways of doing things, so it can be a bit overwhelming at first.
The three main ways of performing filesystem operations in JavaScript are as follows:
- Synchronously: this is probably the easiest way to proceed, but it blocks JavaScript's event loop while the operation runs. Unless you're writing a program that needs to do multiple things at once (e.g. a web server with multiple clients connected at any given time), you don't need to worry about this.
- Using Callbacks: you pass a function that handles the result of the filesystem operation. After the call to read the file is made, the program carries on executing and runs your callback when the result is ready. However, this restricts you to using the file contents only inside the callback, which often leads to "callback hell".
- Using Promises: I believe this is the most modern way to do fs operations, but it can take a while to get your head around (it definitely did for me).
Below are examples of how to read a file using each method:
Synchronously:
const fs = require("fs");
// filepath will be the path to database.json, and
// encoding will most likely be "utf-8"
let contents = fs.readFileSync(filepath, encoding);
// do something with contents
Using Callbacks:
const fs = require("fs");
// filepath will be the path to database.json, and
// encoding will most likely be "utf-8"
fs.readFile(filepath, encoding, function (err, data) {
if (err !== null) {
// there was an error
} else {
// do something with data, which will be
// the file contents as a string
}
});
// However, you cannot use the file contents here,
// only inside the function, thus you can see how
// callback hell happens
Using Promises:
const fs = require("fs").promises;
// filepath will be the path to database.json, and
// encoding will most likely be "utf-8"
fs.readFile(filepath, encoding)
.then(function (data) {
// do something with data
})
.catch(function(err) {
// there was an error
});
Now that you've read the file and have its contents, you can use JSON.parse(contents) to turn the string into an object. From there you can use a for loop to filter out the unwanted array items.
// ... getting file contents
let contents_object = JSON.parse(contents_string);
let used_pairs = {};
let filtered_tickets_array = [];
for (let ticket of contents_object["opened_tickets"]) {
let {userid, ticketid} = ticket;
if (used_pairs[userid] === undefined) {
// we haven't had this userid yet
used_pairs[userid] = [ticketid];
} else if (used_pairs[userid].includes(ticketid)) {
// we have already had this pair
// so go to the next iteration
continue;
} else {
// we've had this userid but not with this ticketid
used_pairs[userid].push(ticketid);
}
// if we reach this point then we have a new
// userid/ticketid pair
filtered_tickets_array.push(ticket);
}
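The same dedup loop can be written more compactly by tracking seen pairs with a Set of composite keys. This is just a sketch with the question's sample data inlined; like the loop above, it keeps the first ticket of each pair.

```javascript
const tickets = [
  { userid: "4", ticketid: "customer_info", opened_timestamp: "1662543404514" },
  { userid: "2", ticketid: "customer_info", opened_timestamp: "1662543405860" },
  { userid: "4", ticketid: "customer_info", opened_timestamp: "1662543411026" }
];

// Track userid|ticketid pairs we have already seen;
// filter keeps only the first occurrence of each pair.
const seen = new Set();
const filtered_tickets_array = tickets.filter(function (ticket) {
  const key = ticket.userid + "|" + ticket.ticketid;
  if (seen.has(key)) return false;
  seen.add(key);
  return true;
});

console.log(filtered_tickets_array.length); // 2
```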
After this, filtered_tickets_array will hold a list of tickets with unique userid/ticketid pairs. (Note that this keeps the first ticket of each pair; the expected output in the question drops every duplicated pair entirely, which is what the grouping approach in the other answer does.) We now put this new array back under the opened_tickets key of our contents_object and turn it back into a JSON string using JSON.stringify(contents_object).
contents_object["opened_tickets"] = filtered_tickets_array;
let new_contents_string = JSON.stringify(contents_object);
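By default JSON.stringify produces a single line. Since database.json looks hand-edited, you may prefer the indented form, which the optional third argument controls:

```javascript
// Hypothetical minimal object, just to show the formatting difference.
const contents_object = { opened_tickets: [], tickets_counter: { other: "8" } };

// The third argument sets the indentation; 2 spaces keeps
// the file human-readable.
const new_contents_string = JSON.stringify(contents_object, null, 2);

console.log(new_contents_string);
```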
Finally, we have to write this back to the file. Once again there are three paths you can choose, and I will demonstrate each of them.
Synchronously:
const fs = require("fs");
// filepath will be the path to database.json, and
// encoding will most likely be "utf-8"
fs.writeFileSync(filepath, new_contents_string, encoding);
Using Callbacks:
const fs = require("fs");
// filepath will be the path to database.json, and
// encoding will most likely be "utf-8"
fs.writeFile(filepath, new_contents_string, encoding, function (err) {
if (err !== null) {
// there was an error
} else {
// file wrote successfully
}
});
Using Promises:
const fs = require("fs").promises;
// filepath will be the path to database.json, and
// encoding will most likely be "utf-8"
fs.writeFile(filepath, new_contents_string, encoding)
.then(function () {
// file wrote successfully
})
.catch(function(err) {
// there was an error
});