I am executing a chain of curl
commands:
- I need to echo each command before executing it.
- Execute the command and save the result to a bash variable.
- Get values from the result of the execution and execute the next curl with those values.
This is what it looks like:
# -----> step 1 <-----
URL="https://web.example.com:8444/hello.html"
CMD="curl \
--insecure \
--dump-header - \
\"$URL\""
echo $CMD && eval $CMD
OUT="<result of the curl command???>"
# Set-Cookie: JSESSIONID=5D5B29689EFE6987B6B17630E1F228AD; Path=/; Secure; HttpOnly
JSESSIONID=$(echo $OUT | grep JSESSIONID | awk '{ s = ""; for (i = 2; i <= NF; i++) s = s $i " "; print s }' | xargs)
# Location: https://web.example.com:8444/oauth2/authorization/openam
URL=$(echo $OUT | grep Location | awk '{print $2}')
# -----> step 2 <-----
CMD="curl \
--insecure \
--dump-header - \
--cookie \"$JSESSIONID\" \
\"$URL\""
echo $CMD && eval $CMD
OUT="<result of the curl command???>"
...
# -----> step 3 <-----
...
I only have a problem with step 2: saving the full result of the curl
command to a variable so that I can parse it.
I have tried it many different ways; none of them works:
OUT="eval \$CMD"
OUT=\$$CMD
OUT=$($CMD)
...
What did I miss?
CodePudding user response:
For very basic commands, OUT=$($CMD) should work. The problem with this is that strings stored in variables are processed differently than strings entered directly. For instance, echo "a" prints a, but var='"a"'; echo $var prints "a" (note the quotes). Because of that and other reasons, you shouldn't store commands in variables.
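To see the difference concretely, here is a small self-contained demo (illustrative only, not part of your script):
cmd='printf %s\n "hello world"'
$cmd          # word splitting only, quotes stay literal: prints "hello and world" on two lines
eval "$cmd"   # eval re-parses the quotes: prints hello world on one line
eval makes the quotes work again, but it also executes anything else hiding in the string, which is the usual argument against it.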
In bash, you can use arrays instead. By the way: the naming convention for regular variables is not ALLCAPS, as such names might accidentally collide with special variables. Also, you can probably drastically simplify your grep | awk | xargs.
url="https://web.example.com:8444/hello.html"
cmd=(curl --insecure --dump-header - "$url")
printf '%q ' "${cmd[@]}"; echo
out=$("${cmd[@]}")
# Set-Cookie: JSESSIONID=5D5B29689EFE6987B6B17630E1F228AD; Path=/; Secure; HttpOnly
jsessionid=$(awk '/JSESSIONID/ {$1=""; printf "%s%s", d, substr($0,2); d=FS}' <<< "$out")
# Location: https://web.example.com:8444/oauth2/authorization/openam
url=$(awk '/Location/ {print $2}' <<< "$out")
# -----> step 2 <-----
cmd=(curl --insecure --dump-header - --cookie "$jsessionid" "$url")
printf '%q ' "${cmd[@]}"; echo
out=$("${cmd[@]}")
# -----> step 3 <-----
...
If you have more steps than that, wrap the repeating part into a function, as suggested by Charles Duffy.
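For example, the repeating part could look roughly like this (the function name and the fixed flags are just a sketch based on the code above):
log_and_curl() {
    # print the exact command to stderr, then run it
    printf '%q ' curl --insecure --dump-header - "$@" >&2; echo >&2
    curl --insecure --dump-header - "$@"
}

out=$(log_and_curl "$url")                         # step 1
out=$(log_and_curl --cookie "$jsessionid" "$url")  # step 2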
CodePudding user response:
Easy Mode: Use set -x
Bash has a built-in feature, xtrace, which tells it to log every command to the file descriptor named in the variable BASH_XTRACEFD (by default, file descriptor 2, stderr).
#!/bin/bash
set -x
url="https://web.example.com:8444/hello.html"
output=$(curl \
--insecure \
--dump-header - \
"$url")
echo "Output of curl follows:"
echo "$output"
...will provide logs having the form of:
+ url=https://web.example.com:8444/hello.html
++ curl --insecure --dump-header - https://web.example.com:8444/hello.html
+ output=Whatever
+ echo 'Output of curl follows:'
+ echo Whatever
...where the + prefix is based on the contents of the variable PS4, which can be modified to have more information. (I often use and suggest PS4=':${BASH_SOURCE}:$LINENO '
to put the source filename and line number in each logged line).
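For example (the trace output shown in the comments is approximate, and myscript.sh stands in for whatever your script is called):
PS4=':${BASH_SOURCE}:$LINENO '
set -x
url="https://web.example.com:8444/hello.html"
curl --insecure --dump-header - "$url"

# ...traces lines such as:
# :myscript.sh:4 curl --insecure --dump-header - https://web.example.com:8444/hello.html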
Doing It By Hand
If that's not acceptable, you can write a function.
log_and_run() {
{ printf '%q ' "$@"; echo; } >&2
"$@"
}
output=$(log_and_run curl --insecure --dump-header - "$url")
...will write your curl command line to stderr before storing its output in $output. Note that when writing that output you need to use quotes: echo "$output", not echo $output.
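To see why, try it with any multi-line text (made-up headers here):
out=$'HTTP/1.1 302\nLocation: https://web.example.com/next'
echo $out     # unquoted: word splitting flattens everything onto one line
echo "$out"   # quoted: the line breaks survive, so grep/awk still work per line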
CodePudding user response:
I guess OUT=$(eval $CMD) will do what you want.
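Applied to step 1 from the question, that would be (keeping in mind that eval re-parses everything in $CMD, so this only behaves if the quoting inside the variable is exactly right):
CMD="curl --insecure --dump-header - \"$URL\""
echo "$CMD"
OUT=$(eval "$CMD")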