So I have a little bash script:
for store in {4..80}
do
  for page in {1..200}
  do
    curl 'https://website.com/shop.php?store_id='$store'&page='$page'&PHPSESSID=sessionID' -O
  done
done
The script works, but it downloads all 200 pages of every store from 4 to 80 one after another, which takes a long time. (Note the bash variables store and page in the curl URL.)
My goal is to run multiple curl requests simultaneously across the store/page combinations instead of working through them one by one, to save time.
Is this possible?
CodePudding user response:
curl can run the loops itself using URL globbing. This limits curl to 100 simultaneous sessions:
curl 'https://website.com/shop.php?store_id=[4-80]&page=[1-200]&PHPSESSID=sessionID' -O --parallel --parallel-max 100
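With URL globbing, curl can also name each output file from the glob ranges via -o and the #N placeholders, so the downloads don't all collide on the same remote file name. A sketch, assuming a curl recent enough to support --parallel; the file name pattern store_#1_page_#2.html is just an example, not from the original:

curl 'https://website.com/shop.php?store_id=[4-80]&page=[1-200]&PHPSESSID=sessionID' \
  --parallel --parallel-max 100 \
  -o 'store_#1_page_#2.html'

Here #1 expands to the current store_id value and #2 to the current page value for each generated URL.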
CodePudding user response:
Try changing your script as follows:
for store in {4..80}; do
  for page in {1..200}; do
    curl 'https://website.com/shop.php?store_id='$store'&page='$page'&PHPSESSID=sessionID' -O &
  done
done
# Wait for the spawned curl processes to finish...
wait
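Note that backgrounding every request launches roughly 77 x 200 curl processes at once, which may overwhelm the client or the server. One way to cap concurrency is to generate the URLs and feed them to xargs -P; this is a sketch of that alternative, with the limit of 20 parallel jobs chosen arbitrarily:

for store in {4..80}; do
  for page in {1..200}; do
    echo "https://website.com/shop.php?store_id=${store}&page=${page}&PHPSESSID=sessionID"
  done
done | xargs -P 20 -n 1 curl -O

Each generated URL is passed to its own curl -O invocation, and xargs keeps at most 20 of them running at a time.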