How to pass variables with special characters into a bash script when called from terminal

Time: 05-29

Hello all, I have a program running on a Linux OS that lets me call a bash script on a trigger (such as a file transfer). I run something like:

/usr/bin/env bash -c "updatelog.sh '${filesize}' '${filename}'"

and the script's job is to update the log file with the file name and file size. But if I pass in a file name that contains a single quote, it breaks the script with the error "unexpected EOF while looking for matching `''".

I realize that a file name with a single quote makes the calling command invalid, since the quote interferes with the command's own quoting. However, I don't want to sanitize the variables if I can help it, because I would like my log to show the exact file name, to make it easier to cross-reference later. Is this possible, or is sanitizing the only option here?
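To illustrate the failure (the file name below is hypothetical), interpolating the variables into the command string means the quote inside the name terminates the surrounding quoting:

```shell
# Hypothetical values standing in for the real trigger variables.
filesize=1024
filename="it's a log.txt"

# After interpolation, bash -c would be asked to parse:
#   updatelog.sh '1024' 'it's a log.txt'
# The apostrophe in "it's" closes the opening single quote, leaving
# an unmatched quote -- hence "unexpected EOF while looking for matching `''".
echo "updatelog.sh '${filesize}' '${filename}'"
```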

Thanks very much for your time and assistance.

CodePudding user response:

Sanitization is absolutely not needed.

The simplest solution, assuming your script is properly executable (has x permissions and a valid shebang line), is:

./updatelog.sh "$filesize" "$filename"
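To see why this is safe, here is a minimal sketch (this updatelog.sh body and the transfer.log name are hypothetical, not the asker's actual script): each quoted variable becomes a single argv entry and is never re-parsed as shell code, so any characters in the file name pass through intact.

```shell
# Create a hypothetical minimal updatelog.sh for demonstration.
cat > updatelog.sh <<'EOF'
#!/usr/bin/env bash
# $1 and $2 arrive as whole arguments, regardless of their contents.
printf '%s %s\n' "$1" "$2" >> transfer.log
EOF
chmod +x updatelog.sh

# A file name with a single quote is passed through unchanged.
./updatelog.sh 1024 "it's a log.txt"
```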

If for some reason you must use bash -c, use single quotes instead of double quotes around your code, and keep your data out-of-band from that code:

bash -c 'updatelog.sh "$@"' 'updatelog' "$filesize" "$filename"

Note that only updatelog.sh "$@" is inside the -c argument and parsed as code, and that this string, being in single quotes, is passed through without any changes whatsoever.

Following it are your arguments $0, $1 and $2; $0 is used when printing error messages, while $1 and $2 go into the list of arguments -- aka $@ -- passed through to updatelog.sh.
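The pattern can be verified with printf standing in for updatelog.sh (the variable values here are hypothetical): the code string is never interpolated, and the awkward file name travels as a plain argument.

```shell
# Hypothetical trigger variables, including a single quote in the name.
filesize=1024
filename="it's a log.txt"

# 'updatelog' fills $0; the two quoted variables become $1 and $2,
# which "$@" forwards untouched to the command inside -c.
bash -c 'printf "size=%s name=%s\n" "$1" "$2"' 'updatelog' "$filesize" "$filename"
```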
