Bash script checkpoints


I am developing a big script whose skeleton looks like this:

#!/bin/bash

load_variables()

function_1()
function_2()
function_3()
[...]
function_n()
  1. On each start, user flags are first loaded by the load_variables() function.
  2. Then the script continues execution: function_1() => function_2() => [...] => function_n()

I need to implement checkpoints which will be stored in log.txt.
Let's say the script has been stopped or has crashed in function_2().
I want to save progress before each function starts, store it in log.txt, and when I re-run the script, call load_variables() and then jump to the crash point/checkpoint stored in log.txt.

How can I achieve that using bash?

CodePudding user response:

I want to save progress before each function starts, store it in log.txt, and when I re-run the script, call load_variables() and then jump to the crash point/checkpoint stored in log.txt.

Do exactly that. But you can't "jump" in a bash script; instead of "jumping", just skip the functions that have already run, which you can track. Basically, in pseudocode:

load_variables() {
    if [[ -e log.txt ]]; then
        . log.txt
    fi
}

already_run_functions=()
checkpoint() {
   # if the function was already run
   if "$1" in already_run_functions[@]; then
        # skip this function
        return
   fi
   {
       # save state with function name
       declare -f
       declare -p
   } > log.txt
   # run it
   if ! "$@"; then
      # handle errors
      exit 1
   fi
   already_run_functions+=("$1")
}

load_variables
checkpoint function1
checkpoint function2
checkpoint function3

Overall, it's shell, so keep it simple. Still, it's way better to use a build system: plain Make is more than enough to track the dependencies of multiple shell scripts and to parallelize the work. Store the result of each task in a file and distribute the functions across multiple files.
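For illustration only (not from the original answer), here is a minimal sketch of that Make-based approach, assuming hypothetical per-step scripts step1.sh, step2.sh and step3.sh, each of which leaves a .done stamp file behind on success:

# Hypothetical sketch: each step leaves a stamp file when it succeeds,
# so `make` (or `make -j` for independent steps) only re-runs the steps
# that have not completed yet. Recipe lines must be indented with a TAB.

all: step3.done

step1.done: step1.sh
	./step1.sh
	touch $@

step2.done: step2.sh step1.done
	./step2.sh
	touch $@

step3.done: step3.sh step2.done
	./step3.sh
	touch $@

clean:
	rm -f step1.done step2.done step3.done

.PHONY: all clean

If step2.sh fails, its stamp file is never created, so the next make run starts again from step2 — the stamp files play the role of the checkpoints.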


So, a real example:

rm -f log.txt

script() {

load_variables="
    if [[ -e log.txt ]]; then
        . log.txt
        cd \"\$PWD\"
    else
       already_run_functions=()
    fi
"


oneof() {
  local i
  for i in "${@:2}"; do
   if [[ "$1" = "$i" ]]; then
       return 0
   fi
  done
  return 1
}
checkpoint() {
   # if the function was already run
   if oneof "$1" "${already_run_functions[@]}"; then
        # skip this function
        return
   fi
   {
       # save state
       declare -f
       declare -p $(compgen -v | grep -Ev '^(BASH.*|EUID|FUNCNAME|GROUPS|PPID|SHELLOPTS|UID|SHELL|SHLVL|USER|TERM|RANDOM|PIPESTATUS|LINENO|COLUMN|LC_.*|LANG)$')
   } > log.txt
   # run it
   if ! "$@"; then
      # handle errors
     echo "checkpoint: $1 failed"
      exit 1
   fi
   already_run_functions+=("$1")
}

function1() {
 echo function1
}

function2() {
 echo function2
 if [[ -e file ]]; then
    return 1
 fi
}

function3() { 
  echo function3
}

eval "$load_variables"
checkpoint function1
checkpoint function2
checkpoint function3

}

touch file
( script )   # first run: function2 fails because "file" exists
rm file
( script )   # second run: resumes, re-runs function2, continues to function3

outputs:

function1
function2
checkpoint: function2 failed
function2
function3

CodePudding user response:

This is an example with trap: on error, the name of the failing function is saved using an ERR trap and $LINENO.

The logfile is removed if everything succeeds.

#!/bin/bash

trap 'clear_log' EXIT
trap 'log_checkpoint $LINENO' ERR

CHECKLOG=checkpoints.log

clear_log() {
    # remove the checkpoint file only when the script exits successfully
    if [ $? -eq 0 ] ; then
        if [ -f "$CHECKLOG" ]; then
            rm "$CHECKLOG"
        fi
    fi
}

log_checkpoint() {
  # The ERR trap is not inherited by functions (no `set -o errtrace`),
  # so it fires for the failing top-level call, not for the command
  # inside the function; $1 ($LINENO) is therefore the line containing
  # just the function name, which sed extracts from this script ($0).
  func=$(sed -n "${1}p" "$0")
  echo "Error on line $1: $func"
  echo "$func" > "$CHECKLOG"
  exit 1
}

retry(){
  # no checkpoint file: nothing failed before, so run the function
  [ ! -f "$CHECKLOG" ] && return 0
  # the checkpoint names this function: retry it; otherwise skip it
  if grep -q "$1" "$CHECKLOG"; then
          echo retry "$1"; rm "$CHECKLOG"; return 0
  else
          echo skip "$1"; return 1
  fi
}

func1(){
  retry ${FUNCNAME[0]} || return 0
  echo hello | grep hello
}

func2(){
  retry ${FUNCNAME[0]} || return 0
  echo hello | grep hello
}

func3(){
  retry ${FUNCNAME[0]} || return 0
  echo hello | grep foo
}

func1
func2
func3

exit 0
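
For reference (not part of the original answer), two consecutive runs might look roughly like this, assuming the script above is saved as checkpoints.sh; NN stands for whatever line number the top-level func3 call ends up on:

$ bash checkpoints.sh          # first run: func3 fails, its name is logged
hello
hello
Error on line NN: func3

$ bash checkpoints.sh          # second run: func1 and func2 are skipped, func3 is retried
skip func1
skip func2
retry func3
Error on line NN: func3

In this demo func3 fails again on the retry, because its grep can never match; once it succeeds, checkpoints.log is gone and the next run executes all functions from the beginning.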