I have a JSON file with keys and values that I would like to add to a Bash array, but I cannot make it work.
1. Creating a JSON file with data
# JSON file
echo '{"persons": [{"name":"John", "age":30, "car":"big"}, {"name":"Alice", "age":29, "car":"big"}]}' > my.json
2. Iterating over Name and Age
# Looping over JSON
jq -c '.persons[]' my.json | while read p;
do
name=$(echo $p | jq -r '.name')
age=$(echo $p | jq -r '.age')
echo "Name: $name"
echo "Age: $age"
done
Names and ages are printed out as expected.
3. Manually adding Name and Age to Array
# Creating array and adding elements
declare -A mylookup
key="James"
value="31"
mylookup[$key]=$value
4. Checking if values were added
for key in "${!mylookup[@]}"
do
echo "Key: $key"
echo "Value: ${lookup[$value]}"
done
It oddly only shows the Key correctly.
5. Adding Name and Age to Array in loop
# Creating array and adding elements
declare -A mylookup
jq -c '.persons[]' my.json | while read p;
do
name=$(echo $p | jq -r '.name')
age=$(echo $p | jq -r '.age')
mylookup[$name]=$age
done
6. Printing out keys and values in Array
Now I check what was added to the array.
for key in "${!mylookup[@]}"
do
echo "Key: $key"
echo "Value: ${lookup[$value]}"
done
Nothing is printed out... Where am I failing? I suspect something is going wrong when I am adding the keys and values to the array, but I cannot find out what.
CodePudding user response:
Array values are getting lost because you are using a pipeline here. Since the while loop runs in a forked subshell, all the variables are set in that subshell only, and you see no changes in the parent shell.
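To see the effect in isolation, here is a minimal sketch (not your exact code) of a variable assigned inside a piped while loop being lost in the parent shell:
v="before"
echo "hello" | while read -r line; do v="$line"; done
echo "$v"   # still prints "before": the assignment happened in the subshell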
You may use this instead to avoid the problem:
declare -A mylookup
while IFS=$'\t' read n a; do
mylookup[$n]="$a"
done < <(jq -r '.persons[] | "\(.name)\t\(.age)"' my.json)
# check array content
declare -p mylookup
#=> declare -A mylookup=([Alice]="29" [John]="30" )
< <(...) is process substitution.
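Once the array is populated in the parent shell, you can look entries up by name, for example (note that the subscript is the key, not the value):
echo "${mylookup[John]}"            # 30
for key in "${!mylookup[@]}"; do
    echo "Key: $key, Value: ${mylookup[$key]}"
done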
CodePudding user response:
The usual way would be to make jq output a format that the shell can read accurately. Here's an example with the lossless TSV format:
#!/bin/bash
declare -A mylookup
while IFS=$'\t' read -r name age
do
printf -v name '%b' "$name"
printf -v age '%b' "$age"
mylookup["$name"]=$age
done < <(
echo '{"persons": [{"name":"John", "age":30, "car":"big"}, {"name":"Alice", "age":29, "car":"big"}]}' |
jq -r '.persons[] | [ .name, .age ] | @tsv'
)
declare -p mylookup
#=> declare -A mylookup=([Alice]="29" [John]="30" )
Remark: the printf -v ... '%b' calls are meant to unescape the TSV values; they're probably unnecessary in your case.
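For illustration, this is what the unescaping does: if a field ever contained a real tab, jq's @tsv would encode it as the two characters \t, and printf '%b' turns that back into an actual tab (the sample name below is made up):
printf -v name '%b' 'John\tSmith'   # name now contains a literal tab character
printf '%s\n' "$name"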
CodePudding user response:
Alternatively, you can use jq to output valid Bash syntax for a declare -A statement.
With proper @sh filtering in jq, no invalid or unintended code can be executed from the JSON data.
This method also avoids iterating in the shell, which is comparatively inefficient.
#!/usr/bin/env bash
json_file=my.json
if dyn_declare=$(
  jq --raw-output '"(",(.persons[]|"[" + (.name|@sh) + "]=" + (.age|@sh)),")"' \
"$json_file"
) && declare -A mylookup=$dyn_declare; then
# Do something useful with the populated associative array
# Here we just dump it for debug/demo purpose
declare -p mylookup
else
printf 'Could not parse %s into an associative array\n' \
"$json_file" >&2
exit 1
fi