How can you pass a multi-line output from one command as n arguments to a single command, not n commands?


TL;DR

Given the input

a
b
c

I am trying to execute the command foo with each input line as a separate argument (i.e. 'n' inputs means a single call to foo with 'n' arguments)...

foo a b c

I am not trying to execute foo once per argument (i.e. 'n' inputs means 'n' calls to foo, each with a single argument)

# Not this!
foo a
foo b
foo c
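
To make the distinction concrete, here's a minimal sketch; foo here is just an illustrative shell function that reports how it was called, standing in for the real command:

# illustrative stand-in for the real command: it only reports its arguments
foo() { echo "one call, $# argument(s): $*"; }

foo a b c                             # what I want: one call, three arguments
for x in a b c; do foo "$x"; done     # what I do NOT want: three calls, one argument each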

Can this be done outside of string concatenation with eval?

Summary

I'm trying to find the recommended way to pass the multi-line output of one command as individual arguments to a second command. To be clear, I am not trying to execute that second command once per argument. I'm trying to pass all the arguments to a single command instance.

This is easier demonstrated than explained.

Consider this shell command with a hard-coded list of three Swift source files (note that some have spaces in their names and live in subfolders)...

xcrun -sdk macosx swiftc -o "$OUTPUT_FILE" "main.swift" "car.swift" "Models/Road Bike.swift"

It's pretty straightforward and executes as expected.

But... I'm trying to automate things so I want that list of source files to be the result of a recursive find operation.

Here's the find command I started with...

find * -type f -name "*.swift"

Executing that yields the following results...

Models/Road Bike.swift
car.swift
main.swift

Attempt 1

As such, for my first attempt, I tried this...

SOURCE_FILES=$(find * -type f -name "*.swift" | tr '\n' ' ')
xcrun -sdk macosx swiftc -o "$OUTPUT_FILE" $SOURCE_FILES

...but it didn't work. I (incorrectly) thought it was because the filenames weren't properly quoted.
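
(If you're debugging something similar, a quick way to see what the shell actually passes is to print each resulting argument on its own line; printf is standard, nothing here is specific to this project.)

# show exactly what the unquoted expansion produces, one argument per line
printf '<%s>\n' $SOURCE_FILES

In zsh, which does not word-split unquoted parameters by default, that prints one long argument containing the whole list; in bash it would instead split on every space, including the one inside "Models/Road Bike.swift", which is just as wrong.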

Attempt 2

Because of the above, I changed it to the following, wrapping each filename in quotes...

SOURCE_FILES=$(find * -type f -name "*.swift" | while read fn; do echo \"$fn\"; done | tr '\n' ' ')
xcrun -sdk macosx swiftc -o "$OUTPUT_FILE" $SOURCE_FILES

This did add quotes around the filenames, but that's when I realized my earlier error... the shell isn't treating the expansion as separate arguments, one per filename; it's passing a single giant argument with embedded quote characters (and last time it was a single giant argument without the quotes). The quotes produced by echo are literal data inside the variable, not shell syntax, so they never create argument boundaries.

Attempt 3 - Success!

Ok, that's when I realized I could go brute-force and just build the exact command I wanted as a literal string, then execute it using eval. Armed with that knowledge, I finally went with this...

SOURCE_FILES=$(find * -type f -name "*.swift" | while read file; do echo \"$file\"; done | tr '\n' ' ')
cmd='xcrun -sdk macosx swiftc -o "$OUTPUT_FILE" '$SOURCE_FILES
eval $cmd

...and sure enough it worked. BUT... I can't help but feel this is workaround after workaround after workaround.

Is there a simpler way to pass the output from the find (or ls, etc.) command as individual input arguments to the compile (or any other) command? How else can this be solved? What's the recommended/preferred way to do this?

Note: I'm using zsh but if it's also bash-compatible, that would be good too.

CodePudding user response:

You're trying too hard.

find * -type f -name "*.swift" -print0 | xargs -0 xcrun -sdk macosx swiftc -o "$OUTPUT_FILE" 
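
If you'd rather stay entirely inside zsh (the shell the question mentions), a hedged alternative sketch, assuming no filename contains a newline: capture the lines into an array and pass the array, or skip find altogether and let a recursive glob build the argument list.

# zsh: split the command output on newlines into an array (the (f) expansion flag);
# an unquoted zsh array then expands to one word per element
source_files=( ${(f)"$(find * -type f -name '*.swift')"} )
xcrun -sdk macosx swiftc -o "$OUTPUT_FILE" $source_files

# or let zsh's recursive glob do the directory walking itself
xcrun -sdk macosx swiftc -o "$OUTPUT_FILE" **/*.swift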

CodePudding user response:

first

Save your multi-line output in an array

second

Pass the array to a desired command

example - test

Run line by line

# cd into a test directory
cd `mktemp -d`

# generate list of files
touch file-{01..09}.txt

# save the list in an array + strip the trailing newline from each line (-t)
mapfile -t list < <(find . -type f -name '*.txt')

# check the output
echo "${lis[@]}"

# pass the array to whatever command you like
# for example substitute txt with TXT
sed 's/txt/TXT/g' <<< "${list[*]}"

# output
./file-09.TXT ./file-08.TXT ./file-07.TXT ./file-06.TXT ./file-05.TXT ./file-04.TXT ./file-03.TXT ./file-02.TXT ./file-01.TXT
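
One caveat on the example above: "${list[*]}" joins all elements into a single string (fine for the sed demonstration), whereas "${list[@]}" expands to one argument per element, which is what the question is asking for. Applied to the compiler command from the question, a rough sketch (mapfile is a bash builtin; in zsh see the (f) expansion under the first answer; source_files is just an illustrative name):

# quoted [@] keeps each array element as its own argument, spaces and all
mapfile -t source_files < <(find . -type f -name '*.swift')
xcrun -sdk macosx swiftc -o "$OUTPUT_FILE" "${source_files[@]}"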