How to download a script from URL, execute it, and pipe something into it - all with one command?
I have a shell script to which I can pipe stuff:
$ echo "Test Data" | do-stuff.sh
The script has now been moved to http://example.com/do-stuff.sh
I tried to do something like this:
$ echo "Test Data" | ( curl -sfL http://example.com/do-stuff.sh | sh )
but it doesn't work: the data does not get piped into the script.
There is a workaround, but how can I do this in one line, without creating any files?
$ curl -sfL http://example.com/do-stuff.sh > do-stuff.sh
$ chmod +x do-stuff.sh
$ echo "Test Data" | ./do-stuff.sh
$ rm do-stuff.sh
CodePudding user response:
Assuming the URL serves the raw script without any transformation, you can try this approach:
echo "Test Data" | bash <(wget -O - http://example.com/do-stuff.sh)
CodePudding user response:
Try
echo "Test Data" | sh <(curl -sfL http://example.com/do-stuff.sh)
This needs to be run in bash, zsh, or ksh, because it uses process substitution.
The reason yours isn't working is that the sh process's stdin is already consumed by reading the script's source code, so you can't pipe data into it.
The process substitution acts like a file, so sh can get the source from the "file" and still read data from stdin.
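The substitution expands to a file-like path under /dev/fd (the exact name varies by system and shell):
$ echo <(true)
/dev/fd/63
If you need a one-liner for a plain POSIX sh without process substitution, a command-substitution variant (not one of the answers above, but equivalent in effect) also leaves stdin free for the piped data, since the script text is passed as an argument to sh -c:
$ echo "Test Data" | sh -c "$(curl -sfL http://example.com/do-stuff.sh)"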