I have the following gitlab-ci.yaml
```yaml
image: docker:19.03.13

variables:
  DOCKER_TLS_CERTDIR: "/certs"

services:
  - docker:19.03.13-dind

build-django_and_fastapi:
  before_script:
    - echo "$DOCKER_REGISTRY_PASS" | docker login $DOCKER_REGISTRY --username $DOCKER_REGISTRY_USER --password-stdin
  stage: build
  script:
    - mkdir test
    - cd test
    - git clone https://gitlab-ci-token:${CI_JOB_TOKEN}@gitlab.com/xxxx/yyyy.git
    - cd ..
    - docker build ./test
```
I got `/bin/sh: eval: line xxx: git: not found`.
How do I add the git package to the docker:19.03.13 image?
CodePudding user response:
To summarize the comments: the goal is to build a Docker image out of the content of two other repositories. Therefore git and docker are needed in the same build stage - or at least that is the attempt. I provide here different options which can be used to achieve this.
Option 1: Migrating fetching logic into Dockerfile
Instead of messing around with a build image, I would migrate this logic into the Dockerfile. Handling such things within the Dockerfile is usually easier, and even if it adds extra layers for installing and removing git, it is still faster than fiddling with a build image that contains both docker and git.
Whether this works depends on your Dockerfile and the Docker base image you are using; with debian/alpine/etc. it is quite easy to achieve, since they ship their own package managers.
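A minimal sketch of this approach, assuming an Alpine-based image (the base image, repository URL, and target path are placeholders; for a private repository the job token is passed in as a build argument):

```dockerfile
# Sketch only - base image, URL and paths are placeholders.
FROM python:3.9-alpine

# Token passed in via: docker build --build-arg CI_JOB_TOKEN=$CI_JOB_TOKEN .
ARG CI_JOB_TOKEN

# Install git only for the clone, then remove it to keep the image small.
# All three commands share one RUN so the git package never lands in a layer.
RUN apk add --no-cache git \
 && git clone --depth 1 "https://gitlab-ci-token:${CI_JOB_TOKEN}@gitlab.com/xxxx/yyyy.git" /src/yyyy \
 && apk del git
```

Be aware that build arguments can remain visible in the image history, so for anything beyond a short-lived job token you may want a multi-stage build instead.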
Option 2: Building a docker image for building containing docker and git
This is the option I favor least. There is always some issue with properly setting up docker, or with installing git on top of it. But I will outline the process here too.
What you need in this case is your own Docker image, where you either:
- pick the docker image and install git
- pick a git image and install docker
- build a fresh image from the ground up
- (you can always try to figure out which package manager is used and install git in the script block)
But it adds complexity, is more effort than Option 1, and offers less safety than Option 3.
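For the first variant, such a build image can be as small as the following sketch (tags are examples; pin whatever versions you actually need):

```dockerfile
# Custom build image combining docker CLI and git.
FROM docker:19.03.13
RUN apk add --no-cache git
```

You would push this to a registry and reference it via `image:` in the job, instead of the plain `docker:19.03.13` image.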
Option 3: Use API instead of GIT (my recommended way)
Instead of using git to fetch the content, there is also the API: https://docs.gitlab.com/ee/api/repositories.html#get-file-archive
It allows you to download a specific ref as a zip/tar archive, which is easier to consume than a git checkout. It can also be combined with Option 1, as it allows easy fetching of the content via curl.
This option also has the benefit of not downloading the git history, just the current state, which might improve build times.
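A sketch of what the job script could look like with the archive endpoint (the project ID and ref are placeholders; the `docker` Alpine image may need `apk add --no-cache curl` first, or you can use the bundled `wget`):

```yaml
# Sketch - <project-id> and the ref are placeholders.
script:
  - apk add --no-cache curl
  - mkdir test
  - 'curl --header "JOB-TOKEN: $CI_JOB_TOKEN" --output yyyy.tar.gz "https://gitlab.com/api/v4/projects/<project-id>/repository/archive.tar.gz?sha=main"'
  - tar -xzf yyyy.tar.gz -C test --strip-components=1
  - docker build ./test
```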
Option 4: multiple build steps
Instead of trying to merge the docker build and the git checkout, you can split them into two jobs: the first one, with git available, fetches the repositories, and the second one performs the docker build.
Important to note here is the artifacts directive, with which you define which files are available at the next stage/build. Take a look at https://docs.gitlab.com/ee/ci/yaml/#artifacts, which is a good resource on that directive.
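A sketch of the two-job layout (stage names, image tags, and the repository URL are placeholders):

```yaml
# Sketch - split fetching and building into two stages.
stages:
  - fetch
  - build

fetch-sources:
  stage: fetch
  image: alpine/git          # any image with git works here
  script:
    - git clone --depth 1 https://gitlab-ci-token:${CI_JOB_TOKEN}@gitlab.com/xxxx/yyyy.git test/yyyy
  artifacts:
    paths:
      - test/                # handed over to the next stage

build-image:
  stage: build
  image: docker:19.03.13
  services:
    - docker:19.03.13-dind
  script:
    - docker build ./test    # test/ is restored from the artifacts
```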
Option 5: Using git Submodules
Instead of doing the checkout manually, the other repositories could also be added as git submodules, which can be seen as subdirectories that point to other git repositories. There is special checkout behaviour attached, but with a closer look at submodules you should figure this out quite easily.
Be aware that you also need to set GIT_SUBMODULE_STRATEGY so they are fetched: https://docs.gitlab.com/ee/ci/runners/configure_runners.html#git-submodule-strategy
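After adding the submodule once with `git submodule add https://gitlab.com/xxxx/yyyy.git test/yyyy` (URL and path are placeholders), the CI side only needs the variable:

```yaml
# Sketch - tell the runner to fetch submodules before the job runs.
variables:
  GIT_SUBMODULE_STRATEGY: recursive
```

With that in place, the job no longer needs git at all; the runner checks the submodules out for you.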
CodePudding user response:
I had to install git inside the docker image:
```yaml
image: docker:19.03.13

variables:
  DOCKER_TLS_CERTDIR: "/certs"

services:
  - docker:19.03.13-dind

build-django_and_fastapi:
  stage: build
  script:
    - docker login -u gitlab-ci-token -p $CI_JOB_TOKEN registry.gitlab.com
    - apk update
    - apk add git
    - mkdir test
    - cd test
    - git clone https://gitlab-ci-token:${CI_JOB_TOKEN}@gitlab.com/xxxx/yyyy.git
    - cd ..
    - docker build ./test
```
So in the script I have added apk add git.
Regarding the approach of building an image from the source code of multiple repos: I prefer to prepare the full context folder for the Dockerfile and then build it.
So in the script I do:
- make folders for the src code
- clone the src code into those folders
- build the image using docker build
The reason I do this is that we can take advantage of the cache during image building. So if 10 steps go well, the next time I build the image it uses the cached layers and starts from the 11th step.
Image building usually involves editing the Dockerfile until the image is right, so caching of the layers is very helpful.
If I git clone inside the Dockerfile, it may not take advantage of the cache.
Of course on my local PC I can use the cache mechanism, but in GitLab's docker-in-docker setup I am not sure how to use it.
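One common way to get layer caching with docker-in-docker is `--cache-from`: pull the previously pushed image and let docker reuse its layers. A sketch, assuming the image is pushed to the project's container registry (tag names are placeholders):

```yaml
# Sketch - reuse layers from the last pushed image as the build cache.
script:
  - docker login -u gitlab-ci-token -p $CI_JOB_TOKEN registry.gitlab.com
  - docker pull $CI_REGISTRY_IMAGE:latest || true   # ignore failure on the first run
  - docker build --cache-from $CI_REGISTRY_IMAGE:latest -t $CI_REGISTRY_IMAGE:latest ./test
  - docker push $CI_REGISTRY_IMAGE:latest
```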