Index each json file then combine into one json with jq


I have a directory of JSON files, for example:

mock1.json

[
    { "name": "John" },
    { "name": "Mary" }
]

mock2.json

[
    { "name": "Nick" },
    { "name": "Luke" }
]

I learned that

jq -r 'to_entries'

will transform the format to:

[
    {
        "key": 0,
        "value": { "name": "John" }
    },
    {
        "key": 1,
        "value": { "name": "Mary" }
    }
]

I also learned that:

jq -s add PATH/*.json

concatenates all the JSON files under the given path into a single array.
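To make the slurp-and-add step concrete, here is a small sketch using the two sample files from the question (the temporary directory and file names are just for the demonstration):

```shell
# Recreate the question's two mock files in a temporary directory.
dir=$(mktemp -d)
printf '[{"name":"John"},{"name":"Mary"}]' > "$dir/mock1.json"
printf '[{"name":"Nick"},{"name":"Luke"}]' > "$dir/mock2.json"

# -s (--slurp) reads all inputs into one top-level array;
# add then concatenates the inner arrays into a single flat array.
jq -c -s 'add' "$dir"/mock*.json
# [{"name":"John"},{"name":"Mary"},{"name":"Nick"},{"name":"Luke"}]

rm -r "$dir"
```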

However, I'm struggling to index each JSON file and then combine them into something like:

[
    {
        "key": 0,
        "value": { "name": "John" }
    },
    {
        "key": 1,
        "value": { "name": "Mary" }
    },
    {
        "key": 0,
        "value": { "name": "Nick" }
    },
    {
        "key": 1,
        "value": { "name": "Luke" }
    }
]

Thanks in advance.

CodePudding user response:

Here are two ways to do it:

cat mock[12].json | jq -s 'map(to_entries) | flatten'

Or:

jq to_entries mock[12].json | jq -s flatten
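The first variant can be checked end to end against the sample data. Applying `to_entries` per file before flattening is what preserves the per-file indices (0, 1, 0, 1) the question asks for:

```shell
# Recreate the question's two mock files in a temporary directory.
dir=$(mktemp -d)
printf '[{"name":"John"},{"name":"Mary"}]' > "$dir/mock1.json"
printf '[{"name":"Nick"},{"name":"Luke"}]' > "$dir/mock2.json"

# Slurp the files into an array of arrays, index each file's array
# separately with to_entries, then flatten one level.
jq -c -s 'map(to_entries) | flatten' "$dir"/mock*.json
# [{"key":0,"value":{"name":"John"}},{"key":1,"value":{"name":"Mary"}},
#  {"key":0,"value":{"name":"Nick"}},{"key":1,"value":{"name":"Luke"}}]

rm -r "$dir"
```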

CodePudding user response:

Slurp the contents, `add` them together, and use the `to_entries` builtin:

jq -s 'add|to_entries' mock[12].json

which is functionally the same as writing

jq -n '[inputs]|add|to_entries' mock[12].json
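One caveat worth noting: because this answer concatenates the arrays *before* applying `to_entries`, the entries are numbered continuously across all files (0 through 3), not per file (0, 1, 0, 1) as in the question's desired output. A quick check with the sample files:

```shell
# Recreate the question's two mock files in a temporary directory.
dir=$(mktemp -d)
printf '[{"name":"John"},{"name":"Mary"}]' > "$dir/mock1.json"
printf '[{"name":"Nick"},{"name":"Luke"}]' > "$dir/mock2.json"

# add runs first, so to_entries sees one four-element array and
# assigns a single continuous index.
jq -c -s 'add | to_entries' "$dir"/mock*.json
# [{"key":0,"value":{"name":"John"}},{"key":1,"value":{"name":"Mary"}},
#  {"key":2,"value":{"name":"Nick"}},{"key":3,"value":{"name":"Luke"}}]

rm -r "$dir"
```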