Generate a JSON file within a Bitbucket pipeline


Running the dbt docs generate command produces a catalog.json file in the target folder. The process works well locally.

    feature/dbt-docs:
      - step:
          name: 'setup dbt and generate docs'
          image: fishtownanalytics/dbt:1.0.0
          script:
            - cd dbt_folder
            - dbt docs generate 
            - cp target/catalog.json ../catalog.json

After generating the catalog.json file, I want to upload it to S3 in the next step. I copy it from the target folder to the root folder and then upload it like this:

      - step:
          name: 'Upload to S3'
          image: python:3.7.2
          script:
            - aws s3 cp catalog.json s3://testunzipping/ 

However, I get the following error:

    aws s3 cp catalog.json s3://testunzipping/
    The user-provided path catalog.json does not exist.

Although the copy command works well locally, the file does not seem to be generated properly within the Bitbucket pipeline. Is there any other way to save the content of catalog.json in some variable in the first step and then upload it to S3 later?

CodePudding user response:

In Bitbucket Pipelines, each step runs in its own build environment. To share files between steps, you should use artifacts.

You may want to try the steps below.

    feature/dbt-docs:
      - step:
          name: 'setup dbt and generate docs'
          image: fishtownanalytics/dbt:1.0.0
          script:
            - cd dbt_folder
            - dbt docs generate 
            - cp target/catalog.json ../catalog.json
          artifacts:
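            # artifact paths are glob patterns relative to the clone directory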
            - catalog.json
      - step:
          name: 'Upload to S3'
          image: python:3.7.2
          script:
            - aws s3 cp catalog.json s3://testunzipping/ 
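
Note that the plain python:3.7.2 image does not ship with the AWS CLI, so the upload step also needs to install it and have credentials available. A minimal sketch of that step, assuming the credentials are stored as repository variables named AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY and AWS_DEFAULT_REGION (the environment variables the AWS CLI reads by default), could look like this:

      - step:
          name: 'Upload to S3'
          image: python:3.7.2
          script:
            # the AWS CLI is not preinstalled in the python image
            - pip install awscli
            # credentials are assumed to come from repository variables:
            # AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY and AWS_DEFAULT_REGION
            - aws s3 cp catalog.json s3://testunzipping/

Alternatively, an image that already bundles the AWS CLI (for example atlassian/pipelines-awscli) avoids the pip install.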

Reference: https://support.atlassian.com/bitbucket-cloud/docs/use-artifacts-in-steps/
