GitHub Actions: How to pass toJSON() result to shell commands

Time:07-13

So, I'm working with GitHub Actions on end-to-end testing. The setup: one job retrieves a list of URLs to be tested, and a second job creates a matrix from that list and tests them all. My problem is that my testing script has to be run from the command line, because I'm using Playwright, so I can't use my matrix object directly; I have to write it out to a JSON file. But when I echo the result of toJSON() into that file, what lands on disk is invalid JSON, which breaks my script. Here's my code:

name: <name>

on:
    push:
    workflow_dispatch:
    #on timer

jobs:
    fetch_strategic_urls:
        runs-on: ubuntu-latest

        outputs:
            urls: ${{ steps.req-urls.outputs.urls }}

        steps:
            - name: Request Urls
              id: req-urls
              run: |
                  export RESPONSE=$(curl -X GET -H "Accept: application/json" <api-endpoint>)
                  echo "urls=$RESPONSE" >> "$GITHUB_OUTPUT"

    run_tests:
        runs-on: ubuntu-latest

        strategy:
            matrix:
                url: ${{ fromJSON(needs.fetch_strategic_urls.outputs.urls) }}
        needs: fetch_strategic_urls

        env:
            BRANCH_NAME: ${{ github.head_ref || github.ref_name }}

        steps:
            - name: Checkout only E2E.Tests folder from branch
              run: |
                  REPO="https://${GITHUB_ACTOR}:${{ secrets.GITHUB_TOKEN }}@github.com/${GITHUB_REPOSITORY}.git"
                  git clone --single-branch -b ${{ env.BRANCH_NAME }} --filter=blob:none --no-checkout --depth 1  --sparse $REPO . 
                  git sparse-checkout init --cone
                  git sparse-checkout add "E2E.Tests"
                  git checkout

            - run: ls
            - uses: actions/setup-node@v3
              id: setup_node_id
              with:
                  node-version: 18
                  cache: 'npm'

            - uses: actions/cache@v3
              id: npm-cache-id
              with:
                  path: E2E.Tests/node_modules
                  key: npm-cache

            - name: Install node modules if no cache
              if: steps.npm-cache-id.outputs.cache-hit != 'true'
              working-directory: E2E.Tests
              run: npm install

            - uses: actions/cache@v3
              id: playwright-cache-id
              with:
                  path: ~/.cache/ms-playwright
                  key: playwright-deps-cache

            - name: Install playwright if no cache
              if: steps.playwright-cache-id.outputs.cache-hit != 'true'
              run: npx playwright install --with-deps
              working-directory: E2E.Tests

            - run: |
                  ls
                  echo '${{ toJSON(matrix.url) }}' >> props.json
                  cat props.json
                  npm test
              working-directory: E2E.Tests

            - name: Upload test results
              if: always()
              uses: actions/upload-artifact@v2
              with:
                  name: test-results
                  path: E2E.Tests/traces/

No matter which variation of echo ${{matrix.url}} >> props.json I've tried (a cat <<'EOF' > props.json heredoc, adding and removing quotes), it always produces a JSON file with no quotes, i.e. { url: string } instead of {"url": "string"}, which is invalid. This obviously breaks everything downstream. I've seen a lot of people online recommending jq, but I don't see how I would use it in this case, since I doubt jq can parse a GitHub-style JSON object, which I need to use when sharding my jobs. Any help is greatly appreciated!

CodePudding user response:

It's not easy to put a JSON document directly into a shell command line: the shell interprets the quotes and whitespace before your script ever sees them. Pass the document through an environment variable instead and quote the expansion:

- shell: bash
  env:
    JSON_DOC: ${{ toJSON(some.var) }}
  run: |
    printf '%s\n' "$JSON_DOC" > foo.json
    cat foo.json
    ...
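A minimal local sketch of why this matters (the URL and filenames are stand-ins): when the expression is interpolated straight into the run script, the JSON's double quotes become shell quoting and are removed by the shell's quote-removal step, while a value passed through an environment variable and a quoted expansion survives intact.

```shell
# What happens when ${{ toJSON(...) }} is interpolated directly into the
# run script: the shell treats the JSON's double quotes as quoting syntax
# and strips them during quote removal.
echo {"url": "https://example.com"} > broken.json
cat broken.json   # -> {url: https://example.com}  (invalid JSON)

# Passing the document through an environment variable, and quoting the
# expansion, preserves the quotes and newlines exactly as written.
JSON_DOC='{
  "url": "https://example.com"
}'
printf '%s\n' "$JSON_DOC" > props.json
cat props.json    # valid JSON, quotes intact
```

Applied to the workflow above, that means putting env: JSON_DOC: ${{ toJSON(matrix.url) }} on the test step and writing props.json with printf '%s\n' "$JSON_DOC" > props.json instead of echoing the expression. And since toJSON() emits standard JSON, jq (or any JSON parser) can read the result like any other file.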