How can the logging of all child processes in Perl's fork be controlled?


I want to control the order in which the log output of all the child processes appears in the file.

Code Snippet (file1.pl):

my @sitesForScr = ("abc_10","def_5","ghi_16");
foreach my $siteToRunOn (@sitesForScr) {
    my $jkpid;
    if ($jkpid = fork()) {
        $SIG{CHLD} = 'DEFAULT';
    }
    elsif (defined ($jkpid)) {
        linkFunc($siteToRunOn);
        exit 0;
    }
}


sub linkFunc {
    print "$_[0]\n";
    my @ert=split("_",$_[0]);
    print "Waiting on $_[0] for $ert[1] sec\n";
    sleep $ert[1];
    print "Done for $_[0]\n";
}

What I want is for the logging of the first child process to complete first, then the logging of the second child process to start, and when that completes, the logging of the next child process to start, and so on.

With the above code, the output in the file (fileoutput.txt) on running "perl file1.pl >> /pan/sedrt/fileoutput.txt" is:

abc_10
Waiting on abc_10 for 10 sec
def_5
Waiting on def_5 for 5 sec
ghi_16
Waiting on ghi_16 for 16 sec
Done for def_5
Done for abc_10
Done for ghi_16

Expected output on running "perl file1.pl >> /pan/sedrt/fileoutput.txt":

abc_10
Waiting on abc_10 for 10 sec
Done for abc_10
def_5
Waiting on def_5 for 5 sec
Done for def_5
ghi_16
Waiting on ghi_16 for 16 sec
Done for ghi_16

How can this be done?

Thanks!

CodePudding user response:

If by "logging" you mean that they all print to console like in the given example, then you can't really have them decoupled since they all compete for a single resource (fd 1).

What you can do though, is to have each child assemble its log as it goes and in the end they all communicate them to the parent. Thus the integrity of those logs is preserved and the parent can then sort it out as needed.

Each process can write its log to a file, with a pre-determined name that the parent knows, or can pipe the name to the parent (if there is more to communicate anyway). Or, each can redirect its STDOUT to an in-memory variable, which it can then send over a pipe to the parent at the end. So there'll be some communication management involved.
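For instance, here is a minimal sketch of the in-memory variant, assuming small logs (a pipe buffers only so much before a writer blocks): each child reopens its STDOUT onto a scalar, so the original linkFunc works unchanged, then ships the finished buffer to the parent over a per-child pipe. The names @readers and $buffer are mine, not from the original code:

use strict;
use warnings;

my @sitesForScr = ("abc_10", "def_5", "ghi_16");
my @readers;                        # one read end per child, in launch order

for my $site (@sitesForScr) {
    pipe(my $reader, my $writer) or die "pipe: $!";
    my $pid = fork() // die "fork: $!";
    if ($pid) {                     # parent: keep the read end
        close $writer;
        push @readers, $reader;
    }
    else {                          # child: capture every print in memory
        close $reader;
        open(STDOUT, '>', \my $buffer) or die "reopen STDOUT: $!";
        linkFunc($site);            # the original sub, unchanged
        close STDOUT;               # make sure $buffer is complete
        print {$writer} $buffer;    # ship the whole log to the parent
        exit 0;
    }
}

# Read each child's log in launch order; each block arrives intact.
for my $reader (@readers) {
    print while <$reader>;
    close $reader;
}
1 while wait() != -1;               # reap the children

sub linkFunc {                      # original sub from the question
    print "$_[0]\n";
    my @ert = split "_", $_[0];
    print "Waiting on $_[0] for $ert[1] sec\n";
    sleep $ert[1];
    print "Done for $_[0]\n";
}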

Or, use a library -- for example, Parallel::ForkManager provides a simple mechanism for sending data from children back to the parent, and it makes managing the forked processes easier as well.
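A sketch with Parallel::ForkManager, using its documented finish($exit_code, \$data) / run_on_finish mechanism to return each child's log; the %log_for hash and the idea of printing after wait_all_children, in launch order, are my additions:

use strict;
use warnings;
use Parallel::ForkManager;

my @sitesForScr = ("abc_10", "def_5", "ghi_16");
my %log_for;                            # site name => its complete log

my $pm = Parallel::ForkManager->new(scalar @sitesForScr);
$pm->run_on_finish(sub {
    my ($pid, $exit, $site, $signal, $core, $data_ref) = @_;
    $log_for{$site} = $$data_ref if defined $data_ref;
});

for my $site (@sitesForScr) {
    $pm->start($site) and next;         # parent: move on to the next child
    my @ert = split "_", $site;         # child: build the log in memory
    my $log = "$site\n";
    $log .= "Waiting on $site for $ert[1] sec\n";
    sleep $ert[1];
    $log .= "Done for $site\n";
    $pm->finish(0, \$log);              # exit, shipping the log to the parent
}
$pm->wait_all_children;

print $log_for{$_} for @sitesForScr;    # emit each block in launch order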


(Without communication between them, that is; trying to keep the logs separate with no coordination at all would be extremely messy.)

CodePudding user response:

  • You want the work to be performed simultaneously.
  • You want the output to appear grouped.

Yet the processes print both before doing the work (sleeping) and after it. So the first thing that needs to change is that all of a process's output must be delayed until that process will produce no more output.

Whether you store the output in memory, store it in a file, or pipe it back to the parent is up to you. An example of the first:

sub linkFunc {
    my $out = "$_[0]\n";                  # accumulate output instead of printing
    my @ert = split "_", $_[0];
    $out .= "Waiting on $_[0] for $ert[1] sec\n";
    sleep $ert[1];
    $out .= "Done for $_[0]\n";
    print $out;                           # emit the whole block at once
}

That's not quite enough, though. You need to ensure that the print is not interrupted by the other processes. You will need a mutex.

use IO::Handle;             # needed for ->flush on a filehandle

my $lock = acquire_lock();  # To be provided.

print $out;
select()->flush();          # flush the currently selected output handle

release_lock( $lock );      # To be provided.
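One common way to supply those two helpers is an advisory flock on a shared lock file; a minimal sketch, with a hypothetical lock-file path:

use Fcntl qw(:flock);

# Hypothetical helpers built on an advisory lock file. Each process must
# call acquire_lock itself (after the fork) so it gets its own descriptor.
sub acquire_lock {
    open my $fh, '>>', '/tmp/file1.lock' or die "open lock file: $!";
    flock($fh, LOCK_EX) or die "flock: $!";
    return $fh;                  # holding the handle holds the lock
}

sub release_lock {
    my ($fh) = @_;
    flock($fh, LOCK_UN);
    close $fh;                   # closing would release the lock anyway
}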