Bash here. My myscript.sh:
#!/bin/bash
startTime=$(date +%s)
curl --location --request GET 'http://www.example.com'
endTime=$(date +%s)
callDuration=$(expr $endTime - $startTime)
echo "$startTime"
echo "$endTime"
echo "$callDuration"
So I'm expecting it to capture a start and an end time (Unix epoch timestamps: the number of seconds since Jan 1, 1970), subtract the start from the end, and give me the number of seconds the curl call takes. Then I want it to print that duration to standard out.
In reality, I'm hitting a RESTful endpoint which should return a large, complex JSON object.
When I run it:
bash ~/myscript.sh
{"json":"from-endpoint-returned-here"}1666255857
1666255857
0
I can't make sense of this output. I'm not sure why it's printing the JSON coming back from the endpoint (undesirable), but the real problem is: why is the start time correct, and why is the end time 0?
CodePudding user response:
The JSON output comes from curl. Without -o (or --output), curl writes the response body to standard output, which is why it appears mixed in with your first timestamp.
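You can discard the body entirely so only your own echo statements reach standard output. A minimal sketch, with example.com standing in for your real endpoint:

```shell
#!/bin/bash
# --silent hides curl's progress meter; --output /dev/null discards the
# response body, so nothing from curl lands on standard output.
curl --location --silent --output /dev/null 'http://www.example.com'
```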
The first number printed after the JSON is the start time, and the second number is the end time; the end time is not 0. Because the two timestamps are identical, their difference is zero, which simply means the request took less than a second.
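Since date +%s only resolves to whole seconds, any request that finishes in under a second will report a duration of 0. For sub-second precision, curl can time the transfer itself via --write-out. A sketch, again with example.com standing in for your endpoint:

```shell
#!/bin/bash
# %{time_total} is curl's built-in measure of the total transfer time,
# reported with sub-second precision; the body itself is discarded.
duration=$(curl --location --silent --output /dev/null \
                --write-out '%{time_total}' 'http://www.example.com')
echo "Call took ${duration}s"
```

This also sidesteps the start/end arithmetic entirely, since curl hands you the elapsed time directly.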