Running a script and keeping it alive after an SSH disconnection (like a broken pipe) without forcing it into the background

Time:12-21

Suppose I have a bash script something like this:

#!/bin/bash -p

echo "PART A PID $$" >> "txt$$.txt"

for ((a=1; a<=10; a++))
do
  sleep 1
  echo "PART B$a" >> "txt$$.txt"
done

echo "PART C" >> "txt$$.txt"

I know that ./script.sh & disown can prevent a script from being stopped when SSH exits, but I want to achieve that without needing to put the script in the background, while still getting an output like this:

PART A
PART B1
PART B2
PART B3
PART B4
PART B5
PART B6
PART B7
PART B8
PART B9
PART B10
PART C

That is, I want to wait for the loop to finish so the output stays ordered.

Note: I don't want to use nohup.

EDIT

I'm doing this because I want to start more than one instance of this script.

If I do something like

#!/bin/bash -p

(
echo "PART A PID $$" >> "txt$$.txt"
for ((a=1; a<=10; a++))
do
  sleep 1
  echo "PART B$a" >> "txt$$.txt"
done
echo "PART C" >> "txt$$.txt"
) & disown

the PID changes: the subshell runs under its own PID, so if I later want to stop the process I no longer have a reference to the original one.
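For what it's worth, $! does capture the subshell's PID right after it is launched, so it can be saved to a file and killed later, but that is exactly the extra bookkeeping I'd rather avoid. A minimal sketch (the pid$$.txt filename is just an example):

```shell
#!/usr/bin/env bash
# Sketch: keeping a handle on a backgrounded subshell via $!
(
  sleep 30
  echo "PART C" >> "txt$$.txt"
) & disown
child=$!                      # PID of the subshell, different from $$
echo "$child" > "pid$$.txt"   # save it so the job can be stopped later
kill "$child"                 # e.g. later: kill "$(cat pid$$.txt)"
```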

I want to know if there's something simple to put at the bottom of the script, without upsetting all the code.

CodePudding user response:

As you've explained in comments, your real goal is to let your script keep running in the foreground by default, while making sure it survives an SSH disconnect.

You can do this without needing disown in any way, though the code needs to be put at the top, not the bottom:

#!/usr/bin/env bash

# redirect all three of stdin, stdout and stderr (not just stdout!)
exec >"txt$$.txt" 2>"err$$.txt" </dev/null

# ignore any SIGHUP signals received
trap : SIGHUP

# ...put the content of your script here
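Putting the prologue together with the question's loop, a complete version might look like this (shortened here to 3 iterations so the sketch runs quickly; with stdout already redirected, the per-line >> redirections become unnecessary):

```shell
#!/usr/bin/env bash

# Detach stdin, stdout and stderr from the terminal so a dead SSH
# connection can't cause a write error, and handle SIGHUP so the
# shell isn't killed when the terminal goes away.
exec >"txt$$.txt" 2>"err$$.txt" </dev/null
trap : SIGHUP

echo "PART A PID $$"
for ((a=1; a<=3; a++))
do
  sleep 1
  echo "PART B$a"
done
echo "PART C"
```

The script still runs in the foreground and its PID stays the original $$, so it can be stopped directly; only its I/O and signal handling are decoupled from the terminal.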