2 Linux scripts nearly identical. Variables getting confused between two different scripts

Time:11-30

I have two scripts. The only difference between them is the log file name and the IP address of the device the data is fetched from. The problem is that the log file each script builds up keeps getting mixed up: it starts writing the contents of one device onto the log of the other. So one particular log file randomly switches from showing the data of one device to showing the data of the other device.

Here is a sample of what it gets from the curl call.

{"method":"uploadsn","mac":"04786364933C","version":"1.35","server":"HT","SN":"267074DE","Data":[7.2]}

I'm 99% sure the issue is with the log variable. One script runs every 30 minutes and one runs every 15 minutes, so I can tell by the date stamps that the issue is not fetching from the wrong device, but the concatenating of the files. It appears to concatenate the wrong file onto the new file.

Here is the code of both.

#!/bin/bash

log="/scripts/cellar.log"

if [ ! -f "$log" ]
then
  touch "$log"
fi

now=`date +%a,%m/%d/%Y@%I:%M%p`

json=$(curl -m 3 --user *****:***** "http://192.168.1.146/monitorjson" --silent --stderr -)
celsius=$(echo $json | cut -d "[" -f2 | cut -d "]" -f1)
temp=$(echo "scale=4; $celsius*1.8 + 32" | bc)
line=$(echo $now : $temp)

echo $line
echo $line | cat - $log > temp && mv temp $log | sed -n '1,192p' $log

and here is the second

#!/bin/bash

log="/scripts/gh.log"

if [ ! -f "$log" ]
then
  touch "$log"
fi

now=`date +%a,%m/%d/%Y@%I:%M%p`

json=$(curl -m 3 --user *****:***** "http://192.168.1.145/monitorjson" --silent --stderr -)
celsius=$(echo $json | cut -d "[" -f2 | cut -d "]" -f1)
temp=$(echo "scale=4; $celsius*1.8 + 32" | bc)
line=$(echo $now : $temp)

#echo $line
echo $line | cat - $log > temp && mv temp $log | sed -n '1,192p' $log

Example of a bad log file (it shows the contents of both devices when it should only contain one):

Mon,11/28/2022@03:30AM : 44.96
Mon,11/28/2022@03:00AM : 44.96
Mon,11/28/2022@02:30AM : 44.96
Tue,11/29/2022@02:15AM : 60.62
Tue,11/29/2022@02:00AM : 60.98
Tue,11/29/2022@01:45AM : 60.98

CodePudding user response:

The problem is that you use "temp" as the filename for a temporary file in both scripts. Since one script runs every 15 minutes and the other every 30, both fire at the same time on every half hour; when they run from the same working directory, one script's "mv temp $log" can pick up the temp file the other script just wrote, moving the other device's entries into the wrong log.
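If you want to keep the temp-file approach, one sketch of a fix is to let mktemp generate a unique temporary filename per run, so the two scripts can never clash. (The relative log path and the hard-coded line below are just for illustration; in the real scripts the line comes from date, curl, and bc.)

```shell
#!/bin/bash
# Sketch: same structure as the original script, but with a unique temp file.
# "cellar.log" stands in for /scripts/cellar.log from the question.
log="cellar.log"
line="Mon,11/28/2022@03:30AM : 44.96"

[ -f "$log" ] || touch "$log"

tmp=$(mktemp) || exit 1                     # unique name per invocation
echo "$line" | cat - "$log" > "$tmp" && mv "$tmp" "$log"
sed -n '1,192p' "$log"                      # show at most the first 192 lines
```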

I'm not good at sed, but as I read it, your command prints only the first 192 lines of the logfile. You don't need a temporary file for that.

First: logfiles are usually written from oldest to newest entry (top to bottom), so probably you want to view the 192 newest lines? Then you can make use of the >> output redirection to append your output to the file. Then use tail to get only the bottom of the file. And if necessary, you could reverse that final output.

That last line of your script would then be replaced by:

sed -i '1i '"$line"'
192,$d' $log
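As a quick check of that in-place edit (this requires GNU sed for -i; the file name and entries below are just for illustration), it prepends the new entry and drops everything from input line 192 onward, so the file never grows past 192 lines:

```shell
#!/bin/bash
# "gh.log" stands in for /scripts/gh.log from the question.
log="gh.log"
printf '%s\n' old1 old2 > "$log"          # pretend these are earlier entries
line="Tue,11/29/2022@02:15AM : 60.62"

# Insert the new entry before line 1 and delete input lines 192 to the end,
# editing the file in place -- no temporary file needed.
sed -i '1i '"$line"'
192,$d' "$log"

head -n 1 "$log"                          # newest entry is now on top
```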

Further possible improvements:

  • Use a single script that gets URL and log filename as parameters
  • Use the usual log file order (newest entries appended at the end)
  • Don't truncate log files inside the script, but use logrotate to not exceed a certain filesize
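The first suggestion could be sketched like this (the function name is illustrative, the --user credentials from the original are omitted, and the newest-last log order from the second suggestion is used):

```shell
#!/bin/bash
# One script for both devices; call as: fetch_temp <monitor-url> <logfile>
fetch_temp() {
  local url="$1" log="$2"
  [ -f "$log" ] || touch "$log"

  local now json celsius temp
  now=$(date +%a,%m/%d/%Y@%I:%M%p)
  # add back "--user user:pass" here as in the original scripts
  json=$(curl -m 3 --silent "$url")
  celsius=$(echo "$json" | cut -d '[' -f2 | cut -d ']' -f1)
  temp=$(echo "scale=4; $celsius*1.8 + 32" | bc)

  echo "$now : $temp" >> "$log"   # append newest entries at the end
  tail -n 192 "$log"              # view only the 192 most recent lines
}

# e.g.: fetch_temp "http://192.168.1.146/monitorjson" /scripts/cellar.log
```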