I have an array of URLs like this (coming out of a jq call):
["https://example.com/text.txt", "https://example.com/text2.txt", "https://example.com/text3.txt"]
How can I loop through this and curl each of these URLs into the current folder?
edit:
for link in $(jq -c 'xxxx' file.txt); do
echo "$link"
done
CodePudding user response:
If you are calling a command to get the list of links, you can simply iterate over its output, something like:
for link in "$(jq <args...>)"; do
# do something here
done
or:
while read -r link; do
# do something here
done <<< "$(jq <args...>)"
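To make this concrete, here is a minimal sketch, assuming the JSON array is in file.txt (as in your edit) and using curl's -O option so each file is saved under its remote name in the current folder:

# jq -r '.[]' prints each URL on its own line; read picks them up one by one
while read -r link; do
  curl -O "$link"
done <<< "$(jq -r '.[]' file.txt)"

The while/read form is the more robust of the two, as it does not depend on word splitting of an unquoted command substitution.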
CodePudding user response:
As the JSON array is "coming out of a jq call", just have jq output it as a list of raw-text URLs using .[] and the -r option, then pipe that list into xargs running curl.
jq -r '… | .[]' | xargs -I{} curl -O {}
If you cannot modify the original jq filter, insert another call to jq into your pipeline:
… | jq -r '.[]' | xargs -I{} curl -O {}
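For example, with the sample array from the question (a sketch; -O tells curl to save each file under its remote name in the current directory):

echo '["https://example.com/text.txt", "https://example.com/text2.txt", "https://example.com/text3.txt"]' |
jq -r '.[]' |
xargs -I{} curl -O {}

Because of -I{}, xargs runs one curl invocation per URL, so -O is applied to every download.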
CodePudding user response:
Add the following to the end of your jq code, and pipe to sh to run it:
jq -r '[ "curl", .[] ] | @sh' | sh
The @sh filter correctly escapes output for a shell.

A single shell command is an array/list, delimited by whitespace:

cmd opts ... args ...

and curl can take multiple URL arguments. So in jq, you can prepend the curl command (and any options) as extra elements in the array, and filter through @sh to generate a correctly quoted shell command. Note that the order of elements in a JSON array is preserved, which is what we want: curl first.

You can run jq without the pipe, to inspect what will be executed:
For example:
echo '["https://example.com/text.txt", "https://example.com/text2.txt", "https://example.com/text3.txt"]' |
jq -r '[ "curl", .[] ] | @sh'
Should produce:
'curl' 'https://example.com/text.txt' 'https://example.com/text2.txt' 'https://example.com/text3.txt'
The quotes around curl are fine. This output can then be piped (in the shell) to sh to execute it.
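Since the goal is to download the files into the current folder, you would probably also prepend an option that makes curl save to files rather than print to stdout. Assuming a curl recent enough to support --remote-name-all, something like:

jq -r '[ "curl", "--remote-name-all", .[] ] | @sh' | sh

--remote-name-all applies the -O behaviour to every URL on the command line, so each file is written under its remote name.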