Dump binary file contents in a JSON array


I have a binary file that I would like to pack as a JSON array like so:

{
  "content": [0, 23, 45,...]
}

Right now I dump the file with hexdump into a separate file (printing each byte as an unsigned decimal followed by a comma) and manually paste those contents into the array:

hexdump -ve '1/1 "%u,"' foo.bin > foo_arr

Looking for a better way to achieve this, preferably on the command line (jq, standard *nix tools); JavaScript could work as well, but I'd rather avoid it.

CodePudding user response:

Here's one option:

hexdump -ve '1/1 "%u\n"' foo.bin | jq -s '{content: .}'

Here, I use jq's -s ("slurp") flag to read all the lines of stdin into a single array, then simply use that array as the value of content.

For example:

$ python -c 'open("foo.bin", "wb").write(b"abc")'
$ hexdump -ve '1/1 "%u\n"' foo.bin | jq -s '{content: .}'
{
  "content": [
    97,
    98,
    99
  ]
}
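If you don't mind a few lines of Python instead of the hexdump/jq pipeline, the same result can be produced directly: iterating over a bytes object already yields unsigned 8-bit integers. A minimal sketch (the sample file creation mirrors the examples above):

```python
import json

# Create a small sample binary file (same bytes as in the question).
with open("foo.bin", "wb") as f:
    f.write(bytes([0, 23, 45]))

# Read the file as raw bytes; iterating over bytes yields 0-255 ints.
data = open("foo.bin", "rb").read()
result = json.dumps({"content": list(data)})
print(result)  # {"content": [0, 23, 45]}
```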

CodePudding user response:

Using perl:

$ printf "\x00\x17\x2d" > foo.bin
$ perl -0777 -nE '@bytes = map { ord } split //, $_;
                  $" = ","; # Delimiter when inserting an array into a string
                  say qq/{"content":[@bytes]}/' foo.bin
{"content":[0,23,45]}

Reads the entire file at once (-0777 -n), splits it into an array of byte values with ord, and then outputs the JSON with those byte values.

CodePudding user response:

jq can read raw input using -R (or --raw-input) and convert it to its codepoint numbers using the explode builtin:

jq -Rs '{content: explode}' foo.bin

If the binary file is not too large, you can also read it into a variable using --arg and then apply explode on that (note that command substitution strips trailing newlines and shell variables cannot carry NUL bytes, so this variant is less robust):

jq -n --arg bin "$(cat foo.bin)" '{content: $bin | explode}'

Note: jq operates on Unicode codepoints, whereas your current hexdump approach emits raw single-byte values, so the results may differ in this regard for non-ASCII input.
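To make the codepoint-vs-byte caveat concrete, here is a small Python illustration: the UTF-8 encoding of "é" is two bytes, but only one codepoint, so explode-style output and hexdump-style output disagree on it.

```python
# Raw UTF-8 bytes of "é" (what hexdump would report, one value per byte).
raw_bytes = list("é".encode("utf-8"))
print(raw_bytes)                    # [195, 169]

# Unicode codepoints (what jq's explode would report).
codepoints = [ord(c) for c in "é"]
print(codepoints)                   # [233]
```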
