Is there a way in Linux to take a terminal/SSH stream as input to another program?
I can get the following to work without issue:
#!/bin/bash
FIFO="$1"

# Exit if a logger session to this host is already running.
if ps auwx | grep -q "[m]rLogger 192.168.10.10"; then
    exit
fi

# Keep the SSH session alive, appending its output to the FIFO.
while true
do
    /usr/bin/sshpass -p MrLoggertemp /usr/bin/ssh -t -t -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -l mrlogger 192.168.10.10 >> "$FIFO"
    echo "sshpass failed. restarting"
    sleep 1
done
.. and it will stream to a file.
But I want to stream this to a Perl script that does some processing and then loads the result into a dataset.
Is there a way to do something like the following in the Perl script:
my $fifo_file = <The output of the stream above>;
open(my $fifo_fh, '<', $fifo_file) or die "Can't open $fifo_file: $!";
Thanks..
CodePudding user response:
I am not sure exactly what your command "streams" or what exactly you need from it.
But yes a Perl script can act as a filter, in a number of ways.
If you want to pipe a stream into the script, read it from STDIN (Perl's handle for file descriptor 0):
use warnings;
use strict;
use feature 'say';

while (my $line_in = <STDIN>) {
    chomp $line_in;   # remove newline
    # ...             # process
    say $line_in;     # prints for demo
}
Now with the command
ls -l . | script.pl
the lines of output from ls -l . are processed in script.pl (only printed, above).
So with your example, instead of cmd >> "$FIFO" have cmd | script.pl, and in the script write (append) to the file after suitable processing.
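For instance, a minimal sketch of such a filter, assuming the output file name is passed as the script's first argument (that file name and the processing are placeholders):
use warnings;
use strict;

my $out_file = shift // die "Usage: $0 output-file\n";
open(my $out_fh, '>>', $out_file) or die "Can't append to $out_file: $!";

while (my $line = <STDIN>) {
    chomp $line;
    # ... process the line as needed, then append the result
    print $out_fh "$line\n";
}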
If you'd rather do all this from inside a script, a basic way is to use a "pipe-open"
use warnings;
use strict;
use feature 'say';

my @cmd = qw(ls -l .);

open(my $in, '-|', @cmd) or die "Can't pipe-open @cmd: $!";

while (my $line_in = <$in>) {
    chomp $line_in;
    # ...
    say $line_in;
}
With the command passed as a list (@cmd) it is assumed that the first element of @cmd is the program to run, which is directly invoked, and the rest of @cmd is passed to it as its arguments. This way a shell is altogether avoided, even if there are shell metacharacters in the command. But if the command is meant to use a shell then write it as a string, or pass it as "@cmd", where quoting interpolates @cmd into a string with spaces between elements. See linked docs.
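For example, a sketch of the string form, which does go through a shell since the command contains a pipe (the command itself is only an illustration):
use warnings;
use strict;

# A single string with shell metacharacters is run via the shell.
my $cmd = q(ps auwx | grep '[m]rLogger');

open(my $in, '-|', $cmd) or die "Can't pipe-open '$cmd': $!";
while (my $line = <$in>) {
    print $line;
}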
Then, there are libraries that facilitate and improve this. I'd first recommend IPC::Run; it is the most complex one but also by far the most powerful, allowing your program to act almost as a mini-shell.†
Let me know how this meshes (or not) with your uses so that we can adjust it, if needed.
When a loop condition is written as while (<$fh>), the line it reads via the <> operator from the filehandle $fh gets assigned to the omnipresent default variable, $_. (If we explicitly assign to a variable, like I do above for clarity, then the deal is off and $_ is undefined.) That variable, $_, is the default for many other operators, including chomp and say.
Then the examples above can be written as
while (<$in>) {
    chomp;
    # ... process $_, which has the line of input
    say;
}
This can lead to very lean and readable code, if used correctly. But if we end up needing a lot of explicit uses of $_
in the loop body (not everything takes it by default!), or end up with cryptic code, then please by all means introduce and use a nice lexical variable.
† Basic uses are simple.
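For instance, a minimal IPC::Run sketch (the command is again only an illustration):
use warnings;
use strict;
use IPC::Run qw(run);

my @cmd = qw(ls -l .);
my ($in, $out, $err) = ('', '', '');

# Run the command; feed it no input and capture stdout and stderr.
run \@cmd, \$in, \$out, \$err or die "Command failed: $?";
print $out;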
CodePudding user response:
Another skeleton approach uses Linux named pipes. This allows real-time communication with a remote server.
https://en.wikipedia.org/wiki/Named_pipe
mkfifo streamFile    # make the named pipe
exec 3<> streamFile  # open a descriptor so reads/writes don't block

# If any line appears on streamFile it is piped onward to Perl.
# Whatever content appears in the remote /var/log/messages is passed to $line.
cat streamFile - | ssh user@server 'tail -f /var/log/messages' | while read line ; do
    echo "$line" | perl script.pl   # a bare "perl" would fail here; feed each line to a script instead
done
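Alternatively, the Perl script can read the named pipe directly, much as the question asks; a minimal sketch, assuming the pipe is named streamFile as above:
use warnings;
use strict;

my $fifo_file = 'streamFile';

# Opening a FIFO for reading blocks until a writer connects.
open(my $fifo_fh, '<', $fifo_file) or die "Can't open $fifo_file: $!";

while (my $line = <$fifo_fh>) {
    chomp $line;
    # ... process each line as it arrives
    print "got: $line\n";
}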