I've been trying to create a one-line command to convert a particular column in a CSV from an epoch timestamp to a human-readable date. The number of columns varies depending on the file; e.g. in this file the epoch is in column 3 of 3, while in another file it is in column 5 of 9.
[user@Dreadnaught log]$ cat FailedApps.log
Config, Unable to open file /var/config/snapshot.xml, 1658872997
Config, Unable to open file /var/config/snapshot.xml, 1661421388
Config, Unable to open file /var/config/snapshot.xml, 1661421414
The basic conversion works when done manually, so I know the values are valid epoch timestamps:
[user@Dreadnaught log]$ date -d@1658872997
Tue Jul 26 22:03:17 UTC 2022
But I'm having trouble passing field 3 to date correctly when I parse the file: the epoch value appears to be passed with a space in front of it, which breaks the date command. Here's my attempt, learnt from other posts on this forum:
[user@Dreadnaught log]$ cat FailedApps.log | awk -F, '{ OFS = FS; command="date -d@\""$3"\""; command |getline; close(command); print}'
date: invalid date '@ 1658872997'
Config, Unable to open file /var/config/snapshot.xml, 1658872997
date: invalid date '@ 1661421388'
Config, Unable to open file /var/config/snapshot.xml, 1661421388
date: invalid date '@ 1661421414'
Config, Unable to open file /var/config/snapshot.xml, 1661421414
How do I correctly pass $3 to the date command? I also tried sed, but that made it worse.
Note: strftime is not available in this build, and it's an embedded machine so I can't add it.
CodePudding user response:
Since you don't have GNU awk, there's no point involving awk at all; just use the shell:
$ while IFS=',' read -r a b c; do
    printf "%s,%s,%s\n" "$a" "$b" " $(date -d@"${c# }")"
  done < FailedApps.log
Config, Unable to open file /var/config/snapshot.xml, Tue Jul 26 17:03:17 CDT 2022
Config, Unable to open file /var/config/snapshot.xml, Thu Aug 25 04:56:28 CDT 2022
Config, Unable to open file /var/config/snapshot.xml, Thu Aug 25 04:56:54 CDT 2022
With the above we just have the shell calling date, i.e. shell { date }, whereas if you involved awk it would need to spawn a subshell to call date once per input line, giving shell { awk { system { subshell { date } } } }, which would be needlessly inefficient in execution time and resources.
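The ${c# } expansion is what fixes the leading-space problem from the question: it strips the single space that follows each comma before the value is handed to date. A quick illustration with a hypothetical variable (on a UTC machine, matching the manual test in the question):

c=' 1658872997'
date -d@"${c# }"      # leading space removed -> Tue Jul 26 22:03:17 UTC 2022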
With your other input file, where the epoch value is in field 5 of 9, you'd change the above to:
$ while IFS=',' read -r a b c d e rest; do
    printf "%s,%s,%s,%s,%s,%s\n" "$a" "$b" "$c" "$d" " $(date -d@"${e# }")" "$rest"
  done < FailedApps.log
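If you want one loop that works for any column position, a minimal sketch along the same lines is shown below; it is untested here, relies on bash arrays (read -a), which may not exist on a minimal embedded shell, and col is just an assumed variable naming the 1-based epoch column.

col=3
while IFS=',' read -ra f; do
    # strip the leading space from the epoch field, convert it,
    # and put the converted value back with the leading space restored
    f[col-1]=" $(date -d@"${f[col-1]# }")"
    # re-join the fields with commas in a subshell so IFS isn't changed globally
    (IFS=','; printf '%s\n' "${f[*]}")
done < FailedApps.log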
CodePudding user response:
Your fields are separated by ", " (a comma followed by a space) rather than just ","; inform your awk about that inside BEGIN and then date should start working:
awk 'BEGIN{FS=OFS=", "}{command="date -d@\""$3"\""; command |getline; close(command); print}' FailedApps.log
Also observe that you do not have to use cat, as awk can read files on its own.
That is reading the 3rd column and converting it correctly, but it's not printing the first two columns.
You might use a separate printf for that, as follows:
awk 'BEGIN{FS=OFS=", "}{printf "%s%s%s%s",$1,OFS,$2,OFS}{command="date -d@\""$3"\""; command |getline; close(command); print}' FailedApps.log
Be warned that the above code assumes the epoch is in the 3rd column. To make it work with a variable number of fields, you might use a for loop that prints each field but the last, followed by OFS, as sketched below.