How to 'Follow' a File in Linux? Grep? Pipe? WGET Etc

Time:01-02

The requirement is essentially to monitor /HelloWorld.txt for changes, so that any new lines that are added can be 'picked up' and piped to something like wget, one request per new line.

I'm still in the early stages of researching how to chain these commands together (and ultimately turn them into a Linux daemon/systemd service).

From what I know already and what I've been reading to fill in the gaps, it feels like this should be a relatively simple one-line command in a bash/shell script, something like 'tail /HelloWorld.txt | wget example.com/HelloWorld'.

I've yet to test any of this, so I thought I'd post the question while I'm working away testing things, in case I'm going in the wrong direction.

UPDATE

So looks like I am getting close to achieving a basic working example, but I'm seeing a lot of odd behaviour. Firstly, when running the command at the command line, I'm seeing the wrong data coming through.

Secondly, when I turn this into a Linux service, I not only see the wrong data coming through, I see it multiple times, and lots of spawned processes are left loitering and need cleaning up.

The command I'm running at the command line is:

tail -f -n 1 /myLog.txt | grep 'info_im_looking_for' --line-buffered | while read line; do xargs -d $'\n' -n 1 wget -q --delete-after --post-data "line=${line}" "https://www.example.com/HelloWorld"; done;

Seems I'm 95% of the way there, just struggling to get over the line. Feels as though if I can get this to work properly at the command line, I can figure out why the Linux Service isn't quite working properly.

What is odd is that when I run just the first part of the command, it works fine and prints the info I would expect to see to the console; it's the later stages of the pipeline that don't seem to be working correctly:

tail -f -n 1 /myLog.txt | grep 'info_im_looking_for'

Note: my current thinking is that the parameter being passed (which includes double quotes) likely needs to be URL-encoded; I'm in the process of looking at options for this.
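On the encoding point: if the line is sent as form data without percent-encoding, characters like quotes, ampersands, and spaces will corrupt the request. A minimal pure-bash sketch of a percent-encoder (ASCII only; multi-byte UTF-8 would need per-byte handling):

```shell
# Percent-encode a string for use as application/x-www-form-urlencoded data.
urlencode() {
  local s=$1 out='' c i
  for (( i = 0; i < ${#s}; i++ )); do
    c=${s:i:1}
    case $c in
      [a-zA-Z0-9.~_-]) out+=$c ;;                 # unreserved characters pass through
      *) printf -v c '%%%02X' "'$c"; out+=$c ;;   # everything else becomes %XX
    esac
  done
  printf '%s\n' "$out"
}

urlencode 'line with "quotes" & spaces'
# → line%20with%20%22quotes%22%20%26%20spaces
```

That said, curl's --data-urlencode flag performs this encoding for you, which avoids hand-rolling it at all.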

Feels like this is probably going to be easier to achieve via a .sh or .py script rather than one magic single line piped command to get this working.
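For the service side of the goal, one common pattern is to put the pipeline in a small script and point a systemd unit at it: systemd's default KillMode (control-group) kills every process in the unit's cgroup on stop, which avoids the loitering tail/grep processes. A minimal sketch, assuming the script lives at /usr/local/bin/follow-log.sh (hypothetical path and unit name):

```ini
# /etc/systemd/system/follow-log.service  (hypothetical)
[Unit]
Description=Follow /myLog.txt and forward matching lines
After=network-online.target

[Service]
ExecStart=/usr/local/bin/follow-log.sh
Restart=on-failure
RestartSec=5

[Install]
WantedBy=multi-user.target
```

After placing the file, `systemctl daemon-reload` then `systemctl enable --now follow-log.service` starts it and enables it at boot.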

CodePudding user response:

 tail -f /HelloWorld.txt | while IFS= read -r line; do wget example.com/HelloWorld -O - > /dev/null; done

The tail -f command will continuously monitor the file /HelloWorld.txt for new lines, and the while read loop will iterate over each new line that appears. For each iteration, the wget command will be executed to send a request to example.com/HelloWorld. The -O - option for wget tells it to write the response to stdout, which is then redirected to /dev/null to discard it. (Note that this version only uses each new line as a trigger; it doesn't send the line's content to the server.)

CodePudding user response:

If you want to execute an HTTP POST request for every line of input, I believe your xargs is being used incorrectly. Inside the while read loop, xargs reads from the same stdin as read, so the two end up competing for lines from the pipe — which would explain the wrong and duplicated data you're seeing. Drop xargs entirely and let read consume the lines:

tail -f -n 1 /myLog.txt |
grep --line-buffered 'info_im_looking_for' |
while IFS= read -r line; do
  curl -X POST --data-urlencode "line=${line}" "https://www.example.com/HelloWorld"
done
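Since tail -f never exits, the easiest way to smoke-test the rest of the pipeline is to feed it a finite stream and stub out the network call. A sketch along those lines (the post_line name is illustrative, standing in for the real curl command):

```shell
# Stand-in for the real curl call, so the loop can be exercised offline.
post_line() { printf 'would POST line=%s\n' "$1"; }

# Finite stream standing in for `tail -f -n 1 /myLog.txt`.
printf 'noise\ninfo_im_looking_for A\ninfo_im_looking_for B\n' |
grep --line-buffered 'info_im_looking_for' |
while IFS= read -r line; do
  post_line "$line"
done
# → would POST line=info_im_looking_for A
# → would POST line=info_im_looking_for B
```

Once that prints the expected lines, swap the printf back to tail -f and post_line back to the curl command; the filtering and read logic stay identical.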