How can I replace every matching JSON field's value with incremental value?

Time:11-01

So I have a large JSON file (approximately 20k hosts), and for each host I need to find FieldA and replace its value with a unique value which I can then swap back later.

For instance:

root> cat file.json | jq .


[
  {
    "id": 1,
    "uptime": 0,
    "computer_name": "Computer01"
  },
  {
    "id": 2,
    "uptime": 0,
    "computer_name": "Computer02"
  }
]

I need to iterate through this list of 20k hosts, replace every computer_name with a dummy value:

[
  {
    "id": 1,
    "uptime": 0,
    "computer_name": "Dummy01"
  },
  {
    "id": 2,
    "uptime": 0,
    "computer_name": "Dummy02"
  }
]

And if possible, export the dummy value and original value to a table side by side linking them up.

The dummy values I want to generate automatically, such as: for each computer_name, replace the value with Dummy????? where ????? is a five-digit number from 00000 to 99999 that simply increments.

I attempted to use cat file.json | jq '.computer_name' or jq '.computer_name' file.json to filter this down and then work on replacing the values, but when I use .computer_name as the filter, I get this error:

jq: error: Cannot index array with string "computer_name"
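(The error occurs because the top-level value in the file is an array, so a field can only be indexed on its elements, not on the array itself; for example:)

```shell
# the top level is an array, so iterate over its elements first:
jq '.[].computer_name' file.json

# or keep the results collected in an array:
jq 'map(.computer_name)' file.json
```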

Thanks in advance.

CodePudding user response:

It's not clear what you mean exactly by "exporting the dummy value and original value", but the following should be sufficient for you to figure out that detail, since the result contains the information necessary to create the table:

  def lpad($len; $fill): tostring | ($len - length) as $l | ($fill * $l)[:$l] + .;

  . as $in
  | ([length | tostring | length, 5] | max) as $width
  | [range(0; length) as $i
     | $in[$i]
     | . + {computer_name: ("Dummy" + ($i | lpad($width; "0"))),
            original_name: .computer_name}]
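For example (assuming the filter above is saved as rename.jq and the input is the file.json from the question), a run plus the requested side-by-side table could look like:

```shell
# apply the renaming filter and keep the enriched result
jq -f rename.jq file.json > renamed.json

# side-by-side table: dummy name next to original name
jq -r '.[] | [.computer_name, .original_name] | @tsv' renamed.json
# Dummy00000	Computer01
# Dummy00001	Computer02
```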

You could use this to create your table, and then remove original_name e.g., by running:

map(del(.original_name))

Of course there are other ways to achieve your stated goal, but if leaving the original names in the file is an option, then that might be worth considering, since it might obviate the need to maintain a separate table.
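Along those lines, swapping the original names back in later is a one-liner over such an enriched file; a sketch, assuming the filter's output was saved as renamed.json:

```shell
# copy each original name back and drop the bookkeeping field
jq 'map(.computer_name = .original_name | del(.original_name))' renamed.json
```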

CodePudding user response:

I would first generate a master table containing both the clear and the obfuscated names, then extract from it as needed: either the protected version (by removing the clear names) or a table of the mappings. You can even perform direct lookups on it:

jq 'with_entries(.key |= "Dummy\("\(.)" | "0"*(5-length) + .)" | .value += {
  clear_name: .value.computer_name,
  computer_name: .key
})' file.json > master.json
cat master.json
{
  "Dummy00000": {
    "id": 1,
    "uptime": 0,
    "computer_name": "Dummy00000",
    "clear_name": "Computer01"
  },
  "Dummy00001": {
    "id": 2,
    "uptime": 0,
    "computer_name": "Dummy00001",
    "clear_name": "Computer02"
  }
}
jq 'map(del(.clear_name))' master.json
[
  {
    "id": 1,
    "uptime": 0,
    "computer_name": "Dummy00000"
  },
  {
    "id": 2,
    "uptime": 0,
    "computer_name": "Dummy00001"
  }
]
jq -r '.[] | [.clear_name, .computer_name] | @tsv' master.json
Computer01  Dummy00000
Computer02  Dummy00001
jq --arg lookup "Dummy00001" '.[$lookup]' master.json
{
  "id": 2,
  "uptime": 0,
  "computer_name": "Dummy00001",
  "clear_name": "Computer02"
}
jq -r --arg lookup "Dummy00001" '.[$lookup].clear_name' master.json
Computer02
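And, although not shown above, the same master.json also lets you swap the clear names back in; for example:

```shell
# map over the object's values, restore the clear name, drop the helper field;
# the result is an array of the original host records
jq 'map(.computer_name = .clear_name | del(.clear_name))' master.json
```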