I have made the script below. It checks the response code of every URL listed in a .csv file (for example, in column A). Everything works as I planned, but after checking all URLs the script freezes and I have to press Ctrl+C to stop it. How can I make the script end automatically once all URLs have been checked?
#!/bin/bash
for link in `cat $1` $2;
do
  response=`curl --output /dev/null --silent --write-out %{http_code} $link`;
  if [ "$response" == "$2" ]; then
    echo "$link";
  fi
done
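For reference, the .csv just has one URL per row in column A, something like this (hypothetical contents):
https://example.com
https://example.org/page
https://example.net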
CodePudding user response:
Your script hangs because of the trailing $2 in the for link line: the loop treats the expected response code as one more URL, so a final curl call sits there trying to connect to it (when it hangs, check ps aux | grep curl and you'll find a curl process with the response code as its last argument). Also, for link in `cat $1` $2 is not how you should read and process lines from a file.
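You can reproduce the hang in isolation (assuming 200 is the code you pass as $2); curl treats the bare number as one more URL and waits on the connection instead of exiting:
curl --output /dev/null --silent --write-out %{http_code} 200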
Assuming your example.csv file only contains one URL per row and nothing else (which basically makes it a plain text file), this code should do what you want:
#!/usr/bin/env bash
while read -r link; do
  response=$(curl --output /dev/null --silent --write-out %{http_code} "$link")
  if [[ "$response" == "$2" ]]; then
    echo "$link"
  fi
done < "$1"
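For example, saving that as check.sh (a hypothetical name) and running it against your file prints every URL that returned 200 and exits as soon as the file has been read:
./check.sh example.csv 200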
CodePudding user response:
Copying your code verbatim and fudging a test file with a few URLs separated by whitespace, it does indeed hang. However, removing the $2 from the end of the for line allows the script to finish:
for link in `cat $1`;
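Putting it together, a sketch of the full script with only that change (it keeps the question's `cat` style, so it still splits on any whitespace rather than reading true lines):
#!/bin/bash
for link in `cat $1`;
do
  response=`curl --output /dev/null --silent --write-out %{http_code} $link`;
  if [ "$response" == "$2" ]; then
    echo "$link";
  fi
done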