I am working on a script that aggregates all the job traces from a pipeline's jobs. My goal is to:
- Send traces to Graylog server
- Save job traces locally so they remain accessible from the machine in case Graylog goes down.
My first thought was to access the logs from my GitLab CI jobs using docker logs
(or some other CLI tool) on my machine running Docker.
I know from this thread that it's possible to do this from Docker containers using, for example:
echo "My output" >> /proc/1/fd/1
But is it possible to do this from GitLab Runner containers? My .gitlab-ci.yml for testing looks like this:
image: python:latest

stages:
  - test

test:
  stage: test
  tags:
    - test
  script:
    - echo "My output" >> /proc/1/fd/1
Generally, I would like to be able to retrieve "My output" on the machine using the docker logs
command, but I am not sure how to do this. I use the Docker executor for my GitLab Runner.
I hope my explanation is understandable.
CodePudding user response:
You cannot do this with any of the official Docker-based GitLab executors. Job output is not emitted through the Docker logging driver by the runner or the containers it starts: all output from a job container is captured by the runner and transmitted to the GitLab server in real time, so it never reaches the logging driver. Therefore, you cannot use docker logs
or similar utilities to obtain job logs.
You can obtain job logs either (1) from the configured storage of the GitLab server or (2) through the jobs API.
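For example, with the jobs API you can list a pipeline's jobs and download each job's trace. Here is a minimal sketch using Python and the requests library; the GitLab URL, project ID, pipeline ID, and token are placeholders you would supply yourself:

import requests

GITLAB_URL = "https://gitlab.example.com"        # placeholder: your GitLab instance
PROJECT_ID = 123                                 # placeholder: your project ID
PIPELINE_ID = 456                                # placeholder: pipeline to collect
HEADERS = {"PRIVATE-TOKEN": "<your-api-token>"}  # token needs read_api scope

# List the pipeline's jobs, then fetch and save each job's trace.
jobs = requests.get(
    f"{GITLAB_URL}/api/v4/projects/{PROJECT_ID}/pipelines/{PIPELINE_ID}/jobs",
    headers=HEADERS,
).json()

for job in jobs:
    trace = requests.get(
        f"{GITLAB_URL}/api/v4/projects/{PROJECT_ID}/jobs/{job['id']}/trace",
        headers=HEADERS,
    )
    # Save locally so the trace stays available even if Graylog is down.
    with open(f"job_{job['id']}.log", "w") as f:
        f.write(trace.text)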
For example, you can run a log forwarder (like the Splunk Universal Forwarder, a Graylog collector, etc.) directly on a self-hosted GitLab instance to forward job traces to the respective external systems.
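Alternatively, since you are already fetching the traces with the API approach above, you can ship them to Graylog yourself over GELF. Here is a minimal sketch using the graypy library, assuming a GELF UDP input listening on graylog.example.com:12201 (both placeholders):

import logging
import graypy

# Standard logging handler that sends records to Graylog as GELF over UDP.
logger = logging.getLogger("gitlab_job_traces")
logger.setLevel(logging.INFO)
logger.addHandler(graypy.GELFUDPHandler("graylog.example.com", 12201))

def send_trace_to_graylog(job_id: int, trace_text: str) -> None:
    # One message per job; for very large traces you may want to chunk by line.
    # Fields passed via "extra" show up as additional GELF fields in Graylog.
    logger.info("job %s trace", job_id, extra={"job_id": job_id, "trace": trace_text})

This covers both of your goals from one script: each trace is written to disk locally and forwarded to Graylog, without needing access to the runner's containers at all.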