How to read stdout from a sub process in bash in real time

I have a simple C program that counts from 0 to 10, incrementing the value every second. When the value is incremented, it is written to stdout. This program intentionally uses printf rather than std::cout. I want to call this program from a bash script and perform some function (e.g. echo) on each value as it is written to stdout. However, my script waits for the program to terminate and then processes all the values at once.

C prog:

#include <stdio.h>
#include <unistd.h>

int main(int argc, char **argv)
{
        int ctr = 0;

        for (int i = 0; i < 10; ++i)
        {
                printf("%i\n", ctr++);
                sleep(1);
        }

        return 0;
}

Bash script:

#!/bin/bash

for c in $(./script-test)
do
    echo $c
done

Is there another way to read the output of my program that will access it in real time, rather than waiting for the process to terminate? Note: the C program is a demo sample - the actual program I am using also uses printf, but I am not able to make changes to that code, hence the solution needs to be in the bash script.

Many thanks, Stuart

CodePudding user response:

As you correctly observed, $(command) waits for command to finish, collects its entire output, splits that output into words, and only then does the for loop start.

To read the output as soon as it is available, use while read:

./script-test | while IFS= read -r line; do
   echo "do stuff with $line"
done

or, if you need to access variables set inside the loop afterwards and your system supports process substitution with <():

while IFS= read -r line; do
   echo "do stuff with $line"
done < <(./script-test)
# do more stuff that depends on variables set inside the loop
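
For example, here is a minimal sketch (assuming the binary is still called ./script-test and that you also want to count the lines) where the variable set inside the loop is still visible after it:

#!/bin/bash

count=0
while IFS= read -r line; do
    echo "do stuff with $line"
    count=$((count + 1))
done < <(./script-test)

# the loop ran in the current shell, so count is still set here
echo "processed $count lines"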

CodePudding user response:

You might have more luck using a pipe:

#!/bin/bash

./script-test | while read c; do
    echo $c
done
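
If the output still only appears when the program exits, printf's stdout may be block-buffered when it writes to a pipe rather than a terminal. Assuming GNU coreutils' stdbuf is available, forcing line-buffered output is one possible workaround:

stdbuf -oL ./script-test | while IFS= read -r line; do
    echo "do stuff with $line"
done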