I have my app deployed to Kubernetes and it's producing some logs. I can see the logs by running kubectl logs -f <pod-id> -n staging
, but I can't find where the logs are physically located on the pod. The /var/log/
folder is empty, and I can't find the logs anywhere else on the pod either.
Why is this happening, and where should the logs be?
CodePudding user response:
As @Achraf Bentabib said:
Kubernetes creates a directory structure for Pod logs on each node, so you can find the container logs for each Pod running on a node under:
/var/log/pods/<namespace>_<pod_name>_<pod_id>/<container_name>/
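Once you know the namespace, Pod name, and Pod UID, that directory path can be assembled directly. A minimal sketch (the namespace, Pod name, and UID below are hypothetical placeholders; substitute your own, e.g. the UID from `kubectl get pod <pod-name> -o jsonpath='{.metadata.uid}'`):

```shell
# Hypothetical values -- replace with your own
NAMESPACE="staging"
POD_NAME="myapp-7d9c6b5f4-abcde"
POD_UID="9f2b1c3d-1111-2222-3333-444455556666"

# The kubelet keeps per-container logs under this directory on the node
LOG_DIR="/var/log/pods/${NAMESPACE}_${POD_NAME}_${POD_UID}"
echo "$LOG_DIR"

# On the node itself (not inside the container) you would then run:
# ls "$LOG_DIR"
```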
Identify the node on which the Pod is running:
kubectl get pod <pod-name> -o wide
SSH into that node. There you can check which logging driver the container runtime uses. If the runtime is Docker:

docker info | grep -i logging

If you cannot SSH into the node directly, you can open a shell on it through Kubernetes instead, e.g. `kubectl debug node/<node-name> -it --image=busybox` (note that `kubectl ssh node <node-name>` requires a third-party kubectl plugin; it is not a built-in command).
If the logging driver writes to a file, you can find the current log output for a specific Pod once you know its container ID. To get it, run this on a control-plane node (or any machine with kubectl access):

kubectl get pod <pod-name> -o jsonpath='{.status.containerStatuses[0].containerID}'
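The container ID comes back prefixed with the runtime scheme (`docker://` or `containerd://`), while the node-level log file names use only the bare ID. A small sketch with a hypothetical ID showing how to strip the prefix:

```shell
# Hypothetical value, shaped like what the jsonpath query returns
CONTAINER_ID="containerd://4f1d2a9be6c07e1a9b2c3d4e5f60718293a4b5c6d7e8f901234567890abcdef0"

# Strip everything up to and including "://" to get the bare container ID
BARE_ID="${CONTAINER_ID#*://}"
echo "$BARE_ID"
```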
Example:
/var/log/containers/<pod-name>_<namespace>_<container-name>-<container-id>.log -> /var/log/pods/<namespace>_<pod-name>_<pod-uid>/<container-name>/0.log
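The symlink name under /var/log/containers/ can be assembled the same way. A sketch with hypothetical names (on a real node you would then follow the file as root):

```shell
# Hypothetical values -- replace with your own
POD_NAME="myapp-7d9c6b5f4-abcde"
NAMESPACE="staging"
CONTAINER_NAME="app"
CONTAINER_ID="4f1d2a9be6c07e1a9b2c3d4e5f60718293a4b5c6d7e8f901234567890abcdef0"

# kubelet-created symlink pointing at the real log file under /var/log/pods/
LINK="/var/log/containers/${POD_NAME}_${NAMESPACE}_${CONTAINER_NAME}-${CONTAINER_ID}.log"
echo "$LINK"

# On the node (requires root, since the log files are owned by root):
# sudo tail -f "$LINK"
```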