Keep postgres docker container in azure pipeline job running

Time:05-18

I'm rather new to Azure and currently playing around with the pipelines. My goal is to run a postgres alpine docker container in the background, so I can run tests against it from my Python backend.

This is my pipeline config:

trigger:
  - main
pool: 
  vmImage: ubuntu-latest
variables:
  POSTGRE_CONNECTION_STRING: postgresql+psycopg2://postgres:passw0rd@localhost/postgres
resources:
  containers:
    - container: postgres
      image: postgres:13.6-alpine
      trigger: true
      env:
        POSTGRES_PASSWORD: passw0rd
      ports:
        - 1433:1433
      options: --name postgres
stages:
  - stage: QA
    jobs:
      - job: test
        services:
          postgres: postgres
        steps:
          - task: UsePythonVersion@0
            inputs:
              versionSpec: $(PYTHON_VERSION)
          - task: Cache@2
            inputs:
              key: '"$(PYTHON_VERSION)" | "$(Agent.OS)" | requirements.txt'
              path: $(PYTHON_VENV)
              cacheHitVar: 'PYTHON_CACHE_RESTORED'
          - task: CmdLine@2
            displayName: Wait for db to start
            inputs:
              script: |
                sleep 5
          - script: |
              python -m venv .venv
            displayName: create virtual environment
            condition: eq(variables.PYTHON_CACHE_RESTORED, 'false')
          - script: |
              source .venv/bin/activate
              python -m pip install --upgrade pip
              pip install -r requirements.txt
            displayName: pip install
            condition: eq(variables.PYTHON_CACHE_RESTORED, 'false')
          - script: |
              source .venv/bin/activate
              python -m pytest --junitxml=test-results.xml --cov=app --cov-report=xml tests
            displayName: run pytest
          - task: PublishTestResults@2
            condition: succeededOrFailed()
            inputs:
              testResultsFormat: 'JUnit'
              testResultsFiles: 'test-results.xml'
              testRunTitle: 'Publish FastAPI test results'
          - task: PublishCodeCoverageResults@1
            inputs:
              codeCoverageTool: 'Cobertura'
              summaryFileLocation: 'coverage.xml'  

But the pipeline always fails at the step "Initialize Containers" with this error: Error response from daemon: Container <containerID> is not running — as if the container had shut down because there was nothing to do. That would seem plausible, but I don't know how to keep it running until my tests are done; the backend just runs pytest against the database. I also tried adding the resource via the container property instead, but then the pipeline crashes at the same step, saying the container ran for less than a second.

I'm thankful for any ideas!

CodePudding user response:

I doubt your container is stopping because "there is nothing to do": the postgres image is configured to act as a service and keeps running on its own. Your container is probably stopping because of an error.

One thing to fix: add the PGPORT environment variable to your container and set it to 1433, because 1433 is not the default port for the postgres docker image (that is 5432). Publishing that port with ports, as you are doing, only maps it; it does not change the port the server actually listens on.
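Assuming you want to keep port 1433, the container resource could look like this (a sketch; PGPORT makes the server itself listen on 1433):

```yaml
resources:
  containers:
    - container: postgres
      image: postgres:13.6-alpine
      env:
        POSTGRES_PASSWORD: passw0rd
        PGPORT: 1433        # make the server listen on 1433, not the default 5432
      ports:
        - 1433:1433
      options: --name postgres
```

Note that your connection string would then also need the port, e.g. `@localhost:1433/postgres`. Alternatively, drop PGPORT and use the default 5432 on both sides of the port mapping.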

Also, your trigger: true property means you expect your pipeline to run whenever a new image is pushed to the official postgres repository on Docker Hub. That probably does not make much sense here, so you should remove it, although this is a marginal issue from the perspective of your question.
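Separately, the fixed sleep 5 in your pipeline can be replaced with an active readiness check. A sketch using pg_isready inside the named container (this assumes the --name postgres option stays in place and the server listens on 1433):

```yaml
- task: CmdLine@2
  displayName: Wait for db to start
  inputs:
    script: |
      # Poll the server inside the service container until it accepts connections
      for i in $(seq 1 30); do
        docker exec postgres pg_isready -U postgres -p 1433 && break
        sleep 1
      done
```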
