Multi-threading cURL requests?


I want to generate a list of numbers, make an HTTP request for each one, and save the result, using cURL and jq on Ubuntu.

I already wrote a single-threaded version using a for loop and it works, but I want to add threading to my script so it runs faster.

    for i in $(seq 12345 12550); do
        echo "$i"
        curl -s -k -X $'GET' \
            -H $'Host: example.com' -H $'User-Agent: Mozilla/5.0 (Windows NT 6.3; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36' -H $'Accept: application/json, text/plain, */*' -H $'Accept-Encoding: gzip, deflate' -H $'Accept-Language: en-US,en;q=0.9,fa;q=0.8' -H $'Connection: close' \
            $'https://example.com/pages/json/'$i'/list' | jq ".files[]"
    done

How can I add threading or parallelism to my script to speed up the process?

My goal is to make a bash script that receives a number range and a thread count as arguments from the user and saves the result of each request to a given path, in a file named after the requested number.

    $ script.sh -numbers 12345-12550 -threads 100 -outpath ~/result/
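
For reference, here is a rough sketch of that interface in its current single-threaded shape (the option parsing style is just illustrative, -threads is parsed but not used yet, and the header list is trimmed for brevity):

    #!/bin/bash
    # Sketch of the interface only: parse the arguments and run the
    # requests one at a time; -threads is accepted but not used yet.
    numbers="" ; threads=1 ; outpath="."
    while [ $# -gt 0 ]; do
        case "$1" in
            -numbers) numbers="$2"; shift 2 ;;
            -threads) threads="$2"; shift 2 ;;
            -outpath) outpath="$2"; shift 2 ;;
            *) echo "unknown option: $1" >&2; exit 1 ;;
        esac
    done

    start="${numbers%-*}"    # 12345 from 12345-12550
    end="${numbers#*-}"      # 12550
    mkdir -p "$outpath"

    for i in $(seq "$start" "$end"); do
        curl -s -k -H 'Accept: application/json, text/plain, */*' \
            "https://example.com/pages/json/$i/list" |
        jq ".files[]" > "$outpath/$i"    # one result file per number
    done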

CodePudding user response:

The simplest thing would be to take the script you have, make it take its inputs as arguments and do a single run, then create another script that calls it with the various values.

For example:

Script1.sh

echo "%1 Hello World" >> output.txt

Script2.sh

    #!/bin/bash
    namesList="Dave Tom"              # the inputs to fan out over
    for name in $namesList; do
        ./Script1.sh "$name" &        # each call runs in the background
    done
    wait                              # wait for all background jobs to finish


    cat output.txt
    Dave Hello World
    Tom Hello World
    ...

Note: this is more of a beginner's approach, but from there you can turn these calls into functions and make it all one script.
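
For example, folded into a single script with a function and a simple cap on how many background jobs run at once (a rough sketch; the doRequest name and the cap of 10 are just placeholders):

    #!/bin/bash
    maxJobs=10

    doRequest() {
        echo "$1 Hello World" >> output.txt
    }

    for name in Dave Tom; do
        # if $maxJobs jobs are already running, wait for one to finish
        while [ "$(jobs -rp | wc -l)" -ge "$maxJobs" ]; do
            sleep 0.2
        done
        doRequest "$name" &
    done
    wait    # let the remaining background jobs finish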

See the documentation on jobs: https://tldp.org/LDP/abs/html/x9644.html

CodePudding user response:

    doit() {
        i="$1"
        echo "$i"
        curl -s -k -X $'GET' \
            -H $'Host: example.com' -H $'User-Agent: Mozilla/5.0 (Windows NT 6.3; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36' -H $'Accept: application/json, text/plain, */*' -H $'Accept-Encoding: gzip, deflate' -H $'Accept-Language: en-US,en;q=0.9,fa;q=0.8' -H $'Connection: close' \
            $'https://example.com/pages/json/'$i'/list' |
        jq ".files[]"
    }
    # export the function so GNU parallel's sub-shells can call it
    export -f doit

    # run up to 100 jobs at once; --results stores each job's output under ~/result/
    seq 12345 12550 | parallel -j100 --results ~/result/{} doit
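
If the goal is one plain result file per number under an output directory (as in the -outpath example in the question), a variation along these lines should also work (a sketch; it assumes the same exported doit function and GNU parallel, and redirects each job's output itself instead of using --results):

    outpath=~/result
    mkdir -p "$outpath"
    # {} is the current number; each job's output goes to its own file
    # (note: doit also echoes the number, which ends up at the top of each file)
    seq 12345 12550 | parallel -j100 "doit {} > $outpath/{}"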