I want to inject a command into a pipeline that just validates the input at that point, passing it on if it's valid. If it isn't valid, nothing should be passed on, or perhaps a custom-defined error message could be printed via an option or similar.
At the moment I'm using Perl for this, as in this example where I check for an expected unique match for $1 in a file:
grep -P '^\s*\Q$(strip $1)\E\s ' file_codes.txt \
| perl -e '@in = <STDIN>;' \
-e '@in == 1 or die "wrong number of matches";' \
-e 'print @in' \
| xargs \
| ...
I don't like this because it seems both un-pipe-ish and un-Perl-ish, with the explicit read and print involving @in. It seems like there ought to be something like tee that does this, but I didn't find it.
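(For what it's worth, the same "pass through iff exactly one line" check can also be sketched in awk; this is my own sketch, not an existing standard tool, and it relies on the awk in use supporting the /dev/stderr special file:)

```shell
# Sketch: pass input through only if it is exactly one line;
# otherwise print an error to stderr and exit non-zero.
# awk stops reading as soon as it sees a second line.
printf 'foo\n' | awk '
    NR == 2 { bad = 1; exit 1 }   # a second line means failure
    { line = $0 }                 # remember the (so far only) line
    END {
        if (NR == 1 && !bad) print line
        else { print "wrong number of matches" > "/dev/stderr"; exit 1 }
    }'
# prints "foo"
```

The non-zero exit status also lets the shell detect the failure (e.g. with set -o pipefail in bash).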
CodePudding user response:
grep ... |
perl -e'
defined($line = <>) && !defined(<>)
or die("Wrong number of matches\n");
print $line;
' |
xargs ...
The Perl program passes a line to STDOUT if and only if there's exactly one line of input. Otherwise, it outputs nothing to STDOUT and an error message to STDERR.
The Perl program reads as little as possible. This means that both perl and grep may finish earlier and thus use less CPU and disk.
The line breaks inside and outside of the Perl program can be left in or removed.
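For example, with printf standing in for the grep stage:

```shell
# Exactly one input line: the line is passed through to STDOUT.
printf 'only\n' | perl -e'
    defined($line = <>) && !defined(<>)
        or die("Wrong number of matches\n");
    print $line;
'
# prints "only"

# Two input lines: nothing on STDOUT, the message goes to STDERR,
# and perl exits non-zero.
printf 'a\nb\n' | perl -e'
    defined($line = <>) && !defined(<>)
        or die("Wrong number of matches\n");
    print $line;
'
# STDERR: Wrong number of matches
```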
CodePudding user response:
Test the current input line number, $.
(note that the count should be exactly 1, so it is tested twice: once per line while reading, and once more in the END block):
% echo foo | perl -pe 'die "wrong number of matches" if $. > 1; END { die "wrong number of matches" if $. < 1; }' | xargs
foo
% echo "foo\nbar" | perl -pe 'die "wrong number of matches" if $. > 1; END { die "wrong number of matches" if $. < 1; }' | xargs
wrong number of matches at -e line 1, <> line 2.
foo
% cat /dev/null | perl -pe 'die "wrong number of matches" if $. > 1; END { die "wrong number of matches" if $. < 1; }' | xargs
wrong number of matches at -e line 1.
END failed--call queue aborted.