Bash command works when I run it myself but fails in the script

Time:06-28

My company has a tool that dynamically generates commands to run based on an input JSON. It works very well when all arguments to the compiled command are single words, but it fails when we attempt multi-word args. Here is a minimal example of the failure.

# Print and execute the command.
print_and_run() {
    local command=("$@")
    if [[ ${command[0]} == "time" ]]; then
        echo "Your command: time ${command[@]:1}"
        time ${command[@]:1}
    fi
}

# How print_and_run is called in the script
print_and_run time docker run our-container:latest $generated_flags

# Output
Your command: time docker run our-container:latest subcommand --arg1=val1 --arg2="val2 val3"
Usage: our-program [OPTIONS] COMMAND1 [ARGS]... [COMMAND2 [ARGS]...]...
Try 'our-program --help' for help.

Error: No such command 'val3"'.

But if I copy the printed command and run it myself it works fine (I've omitted docker flags). Shelling into the container and running the program directly with these arguments works as well, so the parsing logic there is solid (it's a Python program that uses Click to parse the args).

Now, I have a working solution that uses eval, but my entire team jumped down my throat at that suggestion. I've also proposed a solution using delimiter characters for multi-word arguments, but that was shot down as well.

No other solutions proposed by other engineers have worked either. So can someone explain why val3 is being treated as a separate command, or help me find a way to get bash to properly evaluate the dynamically determined command without using eval?

CodePudding user response:

Your command after expanding $generated_flags is:

print_and_run time docker run our-container:latest subcommand --arg1=val1 --arg2="val2 val3"

Your specific problem is that in --arg2="val2 val3" the quotes are literal, not syntactical, because quote processing happens before variables are expanded. This means the expansion is split into two separate words, --arg2="val2 and val3". Then the CLI inside the container sees val3" as a stray word, tries to interpret it as a command, and throws an error because it doesn't know what that means.
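You can see both effects (word splitting and literal quotes) with a quick demo; the variable value below is a stand-in for your generated flags:

```shell
#!/usr/bin/env bash
# Demo: quotes inside a variable's value are ordinary characters,
# not syntax, so the unquoted expansion splits on the space anyway.
flags='--arg1=val1 --arg2="val2 val3"'

# Unquoted $flags: word splitting applies; <> marks each word.
printf '<%s>\n' $flags
# <--arg1=val1>
# <--arg2="val2>
# <val3">
```

Note the quote characters survive verbatim inside the split words, which is exactly the `val3"` your program complained about.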

Normally you'd fix this with an array, which properly maintains the word boundaries.

generated_flags=( "subcommand" "--arg1=val1" "--arg2=val2 val3" )
print_and_run time docker run our-container:latest "${generated_flags[@]}"

This will maintain --arg2=val2 val3 as a single argument as it gets passed into print_and_run; then you just have to expand the command array correctly inside the function (make sure to quote the expansion).
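For completeness, here is a sketch of the function with quoted expansions throughout. To keep it runnable standalone, printf stands in for the docker command; %q in the echo line is optional but handy because it prints each word exactly as the shell sees it:

```shell
#!/usr/bin/env bash
# Sketch: print_and_run with quoted array expansions, so multi-word
# arguments keep their boundaries all the way to the executed command.
print_and_run() {
    local command=("$@")
    if [[ ${command[0]} == "time" ]]; then
        printf 'Your command: time'
        printf ' %q' "${command[@]:1}"   # %q makes word boundaries visible
        printf '\n'
        time "${command[@]:1}"           # quoted: one shell word per element
    fi
}

generated_flags=("subcommand" "--arg1=val1" "--arg2=val2 val3")
print_and_run time printf '%s\n' "${generated_flags[@]}"
# last three lines of stdout:
# subcommand
# --arg1=val1
# --arg2=val2 val3
```

The timing output from `time` goes to stderr, so it doesn't interfere with the command's own stdout.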

CodePudding user response:

The question is:

why val3 is being treated as a separate command

Unquoted variable expansions undergo word splitting and filename expansion. Word splitting splits the result of the variable expansion on spaces, tabs, and newlines, breaking it into separate "words".

a="something        else"
$a  # results in two "words": 'something' and 'else'

It is irrelevant what you put inside the variable value or how many quotes or escape sequences it contains. Each run of consecutive whitespace splits the result into words. Quotes " ' and escapes \ are parsed when they are part of the input line, not when they are part of the result of an unquoted expansion.

help me find a solution to

Write a parser that actually parses the command, splits it according to the rules you want to use, and then executes the resulting words. For example, a very crude such parser is included in xargs:

$ echo " 'quotes  quotes'  not quotes" | xargs printf "'%s'\n"
'quotes  quotes'
'not'
'quotes'

For example, Python has shlex.split, which you can just use, and at the same time introduce Python, which is far easier to manage than badly written Bash scripts.
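As a one-liner sketch (assuming python3 is on PATH), fed a string like the one your generator produces:

```shell
# shlex.split applies POSIX shell quoting rules to a string, so the
# quotes in the data are honored instead of being copied literally.
python3 -c 'import shlex; print(shlex.split("subcommand --arg1=val1 --arg2=\"val2 val3\""))'
# ['subcommand', '--arg1=val1', '--arg2=val2 val3']
```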

tool that dynamically generates commands to run based on an input json

Overall, the proper way forward would be to upgrade the tool to generate a JSON array representing the words of the command to be executed. Then you can just execute that array of words, which is, again, trivial to do properly in Python with json and subprocess.run, and requires some gymnastics with jq, read, and Bash arrays in shell.
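In bash those gymnastics look roughly like this (a sketch, assuming jq is installed and no word contains a newline; printf stands in for the real docker command):

```shell
#!/usr/bin/env bash
# Hypothetical generator output: a JSON array, one element per word.
cmd_json='["printf", "<%s>", "--arg1=val1", "--arg2=val2 val3"]'

# readarray -t: one array element per line of jq output.
# Caveat: this breaks if a word itself contains a newline.
readarray -t cmd < <(jq -r '.[]' <<<"$cmd_json")

# "${cmd[@]}" expands to exactly one shell word per element, so the
# space inside "--arg2=val2 val3" survives intact.
"${cmd[@]}"
# <--arg1=val1><--arg2=val2 val3>
```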

Check your scripts with shellcheck.
