How to run API endpoints tests with Docker-Compose in Gitlab CI/CD pipeline


I want to automate the testing process for my simple API with a GitLab CI/CD pipeline and docker-compose. I have tests that I want to run once the app container is built, but I cannot make the tests wait for the app service to be ready before they run against http://app:80.

Project structure:

project:
-- app
-- tests
-- docker-compose.yml
-- .gitlab-ci.yml

What I have:

docker-compose:

version: "3.0"

services:
  app:
    build:
      context: ./app
      dockerfile: Dockerfile
    ports:
      - "81:80"
    volumes:
      - ./app:/app/app

  tests:
    build:
      context: ./tests
      dockerfile: Dockerfile

  postgres:
    image: postgres:12-alpine
    ports:
      - "5432"
    environment:
      - POSTGRES_USER=${POSTGRES_USER}
      - POSTGRES_PASSWORD=${POSTGRES_PASS}
      - POSTGRES_DB=${POSTGRES_DB}
    volumes:
      - ./app/data:/var/lib/postgresql/data

The tests/ directory contains files like:

import os

import requests

# inside the compose network the app service is reachable by its service name
HOST = os.environ.get("HOST", "http://app")


def test_select():
    url = f"{HOST}/select/"
    response = requests.get(url)

    status_code = response.status_code
    result_len = len(response.json().get("result"))

    assert status_code == 200
    assert result_len != 0

.gitlab-ci.yml:

stages:
  - build

build:
  stage: build
  script:
     - sudo mkdir -p /home/app/
     - sudo cp -r $PWD/* /home/app/
     - cd /home/app/
     - docker-compose up -d

The end goal is to run the tests once the containers are up, and if a test fails, docker-compose should fail and so should the pipeline.

Is this possible? If there is another way to resolve this, I would be very grateful.

CodePudding user response:

Usually you do not run docker-compose.yml from within the pipeline. Your docker-compose.yml is useful for local development, but in the pipeline you have to take a different approach, using GitLab services: https://docs.gitlab.com/ee/ci/services/

But if you want to E2E- or load-test your API from a GitLab pipeline, you can use services to expose, for example, the postgres database:

test:e2e:
   image: ubuntu:20.04
   stage: test
   services:
      - name: postgres:12-alpine
        alias: postgres
   script:
      - apt-get update && apt-get install -y curl # curl is not in the base image
      - curl http://postgres:5432 || true # the connection reaches Postgres (it speaks its own protocol, not HTTP)

The next step is to start your API in the background. For example:

script:
   - python my-app.py &
   - sleep 30
   # your app should be up now and exposed on, say, localhost:81 according
   # to your compose file; you can safely run your API tests here
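A fixed `sleep 30` is fragile; polling until the API answers is a small improvement. A sketch, assuming `localhost:81` from the compose file's port mapping and `pytest` as the (hypothetical) test runner:

```yaml
script:
   - python my-app.py &
   # poll for up to 30s instead of sleeping blindly
   - for i in $(seq 1 30); do curl -sf http://localhost:81 && break; sleep 1; done
   - pytest tests/   # assumed test command
```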

Note that Python will not be available out of the box. You either have to install it in the pipeline or create a Docker image to use in the pipeline. Personally, I always use a custom Docker image in GitLab pipelines to avoid Docker Hub rate limits. I have a personal project that creates custom images and stores them in GitLab.
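A minimal sketch of building such a reusable image and pushing it to the project's GitLab Container Registry (the job name, Docker versions, and image tag are assumptions; the `$CI_REGISTRY_*` variables are GitLab's predefined CI variables):

```yaml
build-image:
  stage: build
  image: docker:20.10
  services:
    - docker:20.10-dind
  script:
    - docker login -u $CI_REGISTRY_USER -p $CI_REGISTRY_PASSWORD $CI_REGISTRY
    - docker build -t $CI_REGISTRY_IMAGE/test-runner:latest .
    - docker push $CI_REGISTRY_IMAGE/test-runner:latest
```

Later jobs can then use `image: $CI_REGISTRY_IMAGE/test-runner:latest` instead of pulling from Docker Hub.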

CodePudding user response:

There are a few solutions for this with varying levels of sophistication:

  1. Add a long enough wait to the start of your container
  2. Add retry logic (ideally with backoff) to the code running inside the container
  3. Depend on an intermediate container whose logic is responsible for ensuring the other dependency is fully available and functional
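Option 2 (retry with backoff) can be sketched in the tests' own language. `wait_for_http` is a hypothetical helper, not part of any library, and the timeout values are arbitrary:

```python
import time
import urllib.error
import urllib.request


def wait_for_http(url, timeout=30.0, initial_delay=0.5):
    """Poll `url` until it answers, backing off exponentially.

    Returns True as soon as the server responds (any HTTP status counts
    as "up"), False if `timeout` seconds pass without a connection.
    """
    deadline = time.monotonic() + timeout
    delay = initial_delay
    while time.monotonic() < deadline:
        try:
            urllib.request.urlopen(url, timeout=2)
            return True
        except urllib.error.HTTPError:
            return True  # got an HTTP response, so the server is up
        except OSError:  # connection refused, DNS failure, timeout, ...
            time.sleep(delay)
            delay = min(delay * 2, 5.0)  # exponential backoff, capped
    return False
```

The tests container would call something like `wait_for_http("http://app")` once before running the suite.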

Though, I think your issue is that you're simply missing a depends_on declaration in your docker-compose file. Also be sure your app image has a proper EXPOSE declaration, or add one in the compose file.

Also, since you're running your test inside the docker network, you don't need the port mapping. You can contact the service directly on its exposed port.

  app:
    ports:
      - "80"
  tests:
    depends_on: # IMPORTANT! starts app first (start order, not readiness)
      - app  # and makes sure you can talk to app on the network
  # ...

Then your tests should be able to reach http://app.

As a complete example using public projects:

version: "3"
services:
  app:
    image: strm/helloworld-http
    ports:
      - "80"  # not strictly needed since this image has EXPOSE 80
  tests:
    depends_on:
      - app
    image: curlimages/curl
    command: "curl http://app"

If you ran docker-compose up, you'd see output like the following:

Creating testproj_app_1 ... done
Creating testproj_tests_1 ... done
Attaching to testproj_app_1, testproj_tests_1
app_1    | 172.18.0.3 - - [12/Nov/2021 03:03:46] "GET / HTTP/1.1" 200 -
tests_1  |   % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
tests_1  |                                  Dload  Upload   Total   Spent    Left  Speed
tests_1  | 100   102  100   102    0     0   4706      0 --:--:-- --:--:-- --:--:--  4857
tests_1  | <html><head><title>HTTP Hello World</title></head><body><h1>Hello from d8f6894ccd1e</h1></body></html>
testproj_tests_1 exited with code 0
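To make the pipeline itself fail when a test fails (the original goal), let the tests container's exit code become the exit code of the compose command; `--exit-code-from` also implies `--abort-on-container-exit`. A sketch of the CI job:

```yaml
build:
  stage: build
  script:
    - docker-compose up --build --exit-code-from tests
```

Unlike `docker-compose up -d`, this blocks until the tests container exits and propagates its exit code, so a failing test fails the job.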

You could also opt to use GitLab's services:. However, if you already have a workflow for testing locally with docker-compose, that is less ideal: you would then test differently locally than in GitLab, and the test method would not be portable to other CI systems or to other developers' local environments.
