Is it possible, in a pipeline such as

tee stdin.log | subject-command | tee stdout.log

to halt execution once subject-command has finished? In my experience the first tee instead keeps running for as long as its input stream does.
CodePudding user response:
The behavior of tee varies by operating system. When given multiple explicit file outputs, it keeps going until all of them have failed; whether it should exit immediately when stdout fails is somewhat open to interpretation in the standard that defines its behavior. (I'd argue that failing to exit in this case is a bug and should be reported to your OS vendor, but that doesn't help when you have a problem now.)
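If your tee is from GNU coreutils, you can also request the strict behavior explicitly; a minimal sketch, assuming GNU tee's --output-error option (not present in POSIX or BSD/macOS tee):

```shell
# Assumes GNU coreutils tee; --output-error=exit makes tee terminate on the
# first write error, including the downstream pipe closing.
# head exits after one line, so tee stops shortly afterwards instead of
# draining all 100000 lines.
seq 1 100000 | tee --output-error=exit /tmp/stdin.log | head -n 1
```

The related -p flag selects a softer mode (warn on errors writing to non-pipes) rather than exiting outright.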
Assuming bash 4.1 or newer, and that you're processing line-oriented textual content that doesn't contain any NULs:
mytee() {
  local arg line fd new_fd
  local -a fds=( )

  # open an output file descriptor for each argument
  for arg in "$@"; do
    exec {new_fd}>"$arg"
    fds+=( "$new_fd" )
  done

  # copy stdin to every file and to stdout, stopping at the first failure
  while IFS= read -r line; do
    for fd in "${fds[@]}"; do
      printf '%s\n' "$line" >&"$fd" || break 2
    done
    printf '%s\n' "$line" || break
  done

  # close our output files
  for fd in "${fds[@]}"; do
    exec {fd}>&-
  done
}
...the problem described in this question will no longer occur with:
mytee stdin.log | subject-command | tee stdout.log
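The {new_fd}>"$arg" syntax used above is bash's automatic file-descriptor allocation (4.1+): bash picks a free descriptor at or above 10 and stores its number in the named variable. A minimal sketch in isolation (the scratch file /tmp/fd_demo.txt is just for illustration):

```shell
#!/usr/bin/env bash
# Demo of bash automatic FD allocation, as used inside mytee.
exec {fd}>/tmp/fd_demo.txt   # bash allocates a free FD (>= 10), stores it in $fd
printf 'hello\n' >&"$fd"     # write through the variable-held descriptor
exec {fd}>&-                 # close it, as mytee does before returning
cat /tmp/fd_demo.txt         # prints: hello
```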
CodePudding user response:
Ignoring the issues specific to tee, we can work around this with process substitution (another ksh/zsh/bash feature not available in sh). This too requires a relatively recent bash release and isn't compatible with sh, as process substitutions didn't set $! until release 4.3 or so.
#!/usr/bin/env bash
# ^^^^- only bash, not sh
# open an FD reading from tee's copy of our stdin; record tee's PID
exec {stdin_log_tee_fd}< <(tee stdin.log); stdin_log_tee_pid=$!
# let subject-command read from that FD
subject-command <&"$stdin_log_tee_fd" | tee stdout.log
# subject-command has exited, so stop the logging tee explicitly
kill "$stdin_log_tee_pid"
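The $!-after-process-substitution behavior can be seen in isolation; a minimal sketch, assuming bash 4.3 or newer (the FD and variable names here are illustrative, not part of the answer above):

```shell
#!/usr/bin/env bash
# Open an FD reading from a process substitution and capture its PID via $!.
exec {fd}< <(printf 'a\nb\n'); ps_pid=$!
cat <&"$fd"                    # consume it like any other FD; prints a then b
exec {fd}<&-                   # close when done
kill "$ps_pid" 2>/dev/null || true  # usually already exited; harmless
```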