I'm using an API with curl that provides the date fields in UNIX Epoch time instead of ISO8601, which makes it difficult to understand what's going on.
Sample Input:
{"message":"Domains list","list":[{"domain":"example.org","created":"1443042000","regtill":"1632430800"}]}
Is there a way to find, in this stream of data, anything that looks like a UNIX Epoch time representing recent dates -- any 10-digit number in quotes past 1000000000 (which is 2001-09-09, as per env TZ=GMT-3 date -r1000000000 +%Y-%m-%dT%H%M%S%z with BSD date(1)) -- and convert it all to ISO8601-like dates, with a small shell script snippet in perl or BSD awk?
Desired Output (with or without timezone offsets):
{"message":"Domains list","list":[
{"domain":"example.org","created":"2015-09-24T000000 0300","regtill":"2021-09-24T000000 0300"}]}
CodePudding user response:
You could use jq, a great tool for manipulating JSON.
jq '
( .. | select(type == "string") | select(try tonumber | . > 1000000000) ) |=
( tonumber | todateiso8601 )
'
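For example, with the sample input saved in a hypothetical input.json, and jq's -c flag to keep the compact one-line layout, it should print something like this (todateiso8601 emits UTC):
$ jq -c '
  ( .. | select(type == "string") | select(try tonumber | . > 1000000000) ) |=
  ( tonumber | todateiso8601 )
' input.json
{"message":"Domains list","list":[{"domain":"example.org","created":"2015-09-23T21:00:00Z","regtill":"2021-09-23T21:00:00Z"}]}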
CodePudding user response:
A Perl script that reads your JSON on standard input, so you can pipe curl output to it:
#!/usr/bin/env perl
use strict;
use warnings;
use feature qw/say/;
use experimental qw/postderef/;
use Time::Piece;
# Install through your OS package manager if provided or favorite
# CPAN client. Or use a different JSON module you like better; perl
# has quite a few available. Worst case if you can't install any
# extra modules is to use core JSON::PP
use JSON::MaybeXS;
my $json = JSON::MaybeXS->new->utf8;
my $rawdata = do { local $/; <STDIN> };
my $data = $json->decode($rawdata);
# Convert both epoch fields to ISO8601 in the local timezone
for my $domain ($data->{list}->@*) {
    $domain->{created} = localtime($domain->{created})->datetime;
    $domain->{regtill} = localtime($domain->{regtill})->datetime;
}
say $json->encode($data);
Example:
$ ./convtimes < input.json
{"message":"Domains list","list":[{"created":"2015-09-23T14:00:00","regtill":"2021-09-23T14:00:00","domain":"example.org"}]}
CodePudding user response:
This awk script is not great, but it does produce output close to what was asked for (with a few flaws, visible in the output below).
#!/usr/bin/env awk -f
BEGIN {
    # Split fields on ":" or ",", rejoin with ":", and treat "[" as the record separator
    FS=":|,"; OFS=":"; RS="["
}
NR==1 {
    # Re-append the "[" consumed by RS (rebuilding the record also replaces "," with OFS)
    $NF="["
}
NR>1 {
    # Strip the quotes, convert the 4th and 6th fields (the epoch values),
    # then re-insert the quoting around keys and values
    gsub(/"/,"");
    $4=strftime("%Y-%m-%dT%H%M%S%z",$4);
    $6=strftime("%Y-%m-%dT%H%M%S%z}]}",$6);
    gsub(/:/,"\":\"");
    sub(/{/,"{\"");
    sub(/}/,"}\"")
}
1
Or as a one-liner:
awk 'BEGIN {FS=":|,"; OFS=":"; RS="["} NR==1 {$NF="["} NR>1 {gsub(/"/,""); $4=strftime("%Y-%m-%dT%H%M%S%z",$4);$6=strftime("%Y-%m-%dT%H%M%S%z}]}",$6); gsub(/:/,"\":\""); sub(/{/,"{\""); sub(/}/,"}\"")}1' input_file
Output
{"message":"Domains list":"list":[
{"domain":"example.org":"created":"2015-09-23T220000 0100":"regtill":"2021-09-23T220000 0100}"]}
CodePudding user response:
The best snippet so far has been provided by simbabque in a comment:
$ perl -MTime::Piece -pe 's/(\d{10,})/{localtime($1)->datetime}/ge' <<<'{"message":"Domains list","list":[{"domain":"example.org","created":"1443042000","regtill":"1632430800"}]}'
-- you can pipe your curl output through this, as simbabque noted.
I've further adapted it as follows:
- modified the regular expression to use positive lookbehind and lookahead for :" and ", with (?<=:") and (?="), respectively, to avoid false positives;
- restricted the UNIX Epoch time match to exactly 10 digits, which should cover dates between 2001 and 2286:
env TZ=GMT perl -MTime::Piece -e 'print localtime(1000000000)->datetime, "Z/", localtime(9999999999)->datetime, "Z\n"'
2001-09-09T01:46:40Z/2286-11-20T17:46:39Z
- used a few extra regular expressions to insert line breaks for each entry of the domains list.
The simplest snippet:
curl ... \
| perl -MTime::Piece -pe's#(\d{10})#localtime($1)->datetime#ge'
Add lookbehind/lookahead for quotes:
curl ... \
| perl -MTime::Piece -pe's#(?<=")(\d{10})(?=")#localtime($1)->datetime#ge'
Specify the timezone of the input data, and also insert linebreaks for each domain in the list, to make it more user-readable:
curl ... \
| env TZ=GMT-3 perl -MTime::Piece \
-pe 's#:\[{#:[\n{#g;s#},{#},\n{#g;s#(?<=:")(\d{10})(?=")#localtime($1)->datetime#ge'
Sample test run of the final solution for the overall transformation -- it works on any stream of data, ignores anything that isn't JSON, and never errors out on invalid JSON:
% printf '{"list":[{"d":"abc","c":"1443042000"},{"d":"xyz","c":"1000000000"}]}\n' \
| env TZ=GMT-3 perl -MTime::Piece -pe \
's#:\[{#:[\n{#g;s#},{#},\n{#g;s#(?<=:")(\d{10})(?=")#localtime($1)->datetime#ge'
{"list":[
{"d":"abc","c":"2015-09-24T00:00:00"},
{"d":"xyz","c":"2001-09-09T04:46:40"}]}
%
This solution is more flexible than the others, which presume the input is valid JSON: it also works when a single curl command makes more than one request, since the combined output then isn't valid JSON (a JSON document can have only one root element). It doesn't even assume the input data is in any specific format -- anything with a 10-digit number between :" and " is converted from UNIX Epoch time to ISO8601, exactly as specified in the question.
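As an illustration with hypothetical input (not from the question), two back-to-back JSON documents -- which together are not valid JSON -- still get their timestamps converted:
% printf '{"a":"1443042000"}\n{"b":"1632430800"}\n' \
| env TZ=GMT-3 perl -MTime::Piece -pe \
's#(?<=:")(\d{10})(?=")#localtime($1)->datetime#ge'
{"a":"2015-09-24T00:00:00"}
{"b":"2021-09-24T00:00:00"}
%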
CodePudding user response:
Use the date command. This command:
date -d '@1443042000'
produces this output:
Wed 23 Sep 2015 03:00:00 PM MDT
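Note that -d '@...' is a GNU date extension. If ISO8601 output is wanted, as in the question (an addition on my part, not part of the original answer), GNU date can also format it directly; in the same Mountain Time zone as above this should print:
date -d '@1443042000' -Iseconds
2015-09-23T15:00:00-06:00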