Normally, in Docker/Kubernetes, the recommendation is to write logs directly to stdout. Then we can use `kubectl logs` or `docker logs` to view them, for example:

`time=123 action=write msg=hello world`

When stdout is a TTY, the output might be colorized for human friendliness.
However, if we want to export the logs to a log-processing stack such as EFK (Elasticsearch + Fluentd + Kibana), we need JSON-formatted logs, for example:

`{"time":123,"action":"write","msg":"hello world"}`
What I want: is there a logging method that can provide both human friendliness and JSON format?
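To make the goal concrete, here is a minimal sketch of the kind of dual-format logger I have in mind (illustrative only; the language, the log/slog package, and the LOG_FORMAT environment variable are my own assumptions, not part of any existing setup):

```go
package main

import (
	"log/slog"
	"os"
)

func main() {
	// Assumption for this sketch: a LOG_FORMAT env var selects the output style.
	var handler slog.Handler
	if os.Getenv("LOG_FORMAT") == "json" {
		// JSON lines, suitable for a pipeline such as EFK.
		handler = slog.NewJSONHandler(os.Stdout, nil)
	} else {
		// key=value text output, friendlier for `kubectl logs` / `docker logs`.
		handler = slog.NewTextHandler(os.Stdout, nil)
	}
	logger := slog.New(handler)

	// Text: time=... level=INFO msg="hello world" action=write
	// JSON: {"time":"...","level":"INFO","msg":"hello world","action":"write"}
	logger.Info("hello world", "action", "write")
}
```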
CodePudding user response:
Kubernetes has such an option of structured logging for its system components. The klog library allows the `--logging-format=json` flag, which changes the format of the logs to JSON output - more information about it here and here.
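For example (a sketch only, assuming a kubeadm-style cluster where the control-plane components run as static pods), the flag can be added to a component's command in its manifest, e.g. /etc/kubernetes/manifests/kube-apiserver.yaml:

```yaml
# Sketch: enable JSON-formatted logs for the API server.
# The same flag is available on other system components (kubelet, scheduler, controller-manager).
spec:
  containers:
  - name: kube-apiserver
    command:
    - kube-apiserver
    - --logging-format=json
    # ...keep the component's existing flags...
```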
CodePudding user response:
Yes, you can do that with Fluentd. Below are the basic action items you need to take to finalize this setup (a minimal config sketch follows at the end):

- Configure the Docker container to log to stdout (you can use any format you like).
- Configure Fluentd to tail the Docker log files from /var/lib/docker/containers/*/*-json.log.
- Parse the logs with Fluentd and change the format to JSON.
- Output the logs to Elasticsearch.

This article shows exactly how to do this setup, and this one explains how to parse key-value logs.
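A minimal Fluentd configuration sketch for these steps might look like this (illustrative only: the pos_file path, the Elasticsearch host, and the key=value regexp are assumptions based on the sample line in the question, and the output stage requires the fluent-plugin-elasticsearch plugin):

```
# Tail the Docker JSON log files; each line is JSON with the raw message in "log".
<source>
  @type tail
  path /var/lib/docker/containers/*/*-json.log
  pos_file /var/log/fluentd-docker.pos
  tag docker.*
  <parse>
    @type json
  </parse>
</source>

# Optional: re-parse the raw "log" field when the app writes key=value text.
# The regexp matches the sample "time=123 action=write msg=hello world".
<filter docker.**>
  @type parser
  key_name log
  reserve_data true
  <parse>
    @type regexp
    expression /^time=(?<time>\S+) action=(?<action>\S+) msg=(?<msg>.*)$/
  </parse>
</filter>

# Ship the structured events to Elasticsearch.
<match docker.**>
  @type elasticsearch
  host elasticsearch
  port 9200
  logstash_format true
</match>
```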