Commit list from Git is always processed in full on every run instead of element by element


I have a problem with my bash script. For each JSON file pushed to Git, I run a procedure against an API via curl. In principle the procedure works, but the script runs the complete commit list against the API on every pass, so each file is processed multiple times.

So if I push two JSON files to Git, each JSON file is executed twice against the API instead of only once.

Example:

Git Push: file1.json file2.json file3.json file4.json file5.json

Execution: file1.json (5x), file2.json (5x), file3.json (5x), file4.json (5x), file5.json (5x); every file runs once per file in the commit.

The expectation would be that each file is executed only once.

I tried to solve the issue using arrays, but apparently it doesn't work as I thought.
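
A minimal sketch of the deduplication I was aiming for, using an associative array as a set (requires bash 4+; the file list here is just a placeholder):

#!/bin/bash

# Remember every file we have handled in an associative array
declare -A seen

for file in file1.json file2.json file1.json; do
    # Process each distinct file only once
    if [[ -n ${seen[$file]} ]]; then
        continue
    fi
    seen[$file]=1
    echo "processing $file"
done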

Here is the actual function from the code:

#!/bin/bash

# Create an empty array to store processed files
processed_files=()

# Login
endpoint=xxx
username=xxx
password=xxx

# Get list of files in commit
files=$(git diff --name-only HEAD~1 HEAD)

# Test each file that is a json
for file in $files
do
    if [[ $file == *.json ]] 
    then
        # Check if the file has already been processed
        if [[ ! " ${processed_files[*]} " =~ " ${file} " ]]
        then
            # Add the file to the array
            processed_files+=("$file")
            echo "Jobexecution"
            curl -k -s -H "Authorization: xxx" -X POST -F "definitionsFile=@../$file" "$endpoint/deploy"
            submit=$(curl -k -s -H "Authorization: xxx" -X POST -F "jobDefinitionsFile=@../$file" "$endpoint/run")
            runid=$(echo ${submit##*runId\" : \"} | cut -d '"' -f 1)
            # Check job status
            jobstatus=$(curl -k -s -H "Authorization: xxx" "$endpoint/run/status/$runid")
            status=$(echo ${jobstatus##*status\" : \"} | cut -d '"' -f 1)
            # Wait till jobs ended
            echo "Wait till jobs ended"
            until [[ $status == Ended* ]]; do
                sleep 10
                tmp=$(curl -k -s -H "Authorization: xxx" "$endpoint/run/status/$runid")
                echo "$tmp" | grep -q 'Not OK' && exit 2
                tmp2=$(echo ${tmp##*$'\"type\" : \"Folder\",\\n'})
                status=$(echo ${tmp2##*\"status\" : \"} | cut -d '"' -f 1)
            done
        else
            echo "Job was already executed. Ill skip this one."
        fi
    fi
done

# Logout
curl -k -s -H "Authorization: xxx" -X POST "$endpoint/session/logout"

# Exit
if [[ $status == *Not* ]]; then
    echo 'Job failed!'
    exit 1
else
    echo 'Success!'
    exit 0
fi
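
For completeness, the candidate list can also be narrowed directly in the git call, which would make the *.json check in the loop unnecessary. A minimal sketch using git's --diff-filter and a pathspec:

# List only JSON files added (A) or modified (M) in the last commit
files=$(git diff --name-only --diff-filter=AM HEAD~1 HEAD -- '*.json')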


CodePudding user response:

I solved the problem. The issue was that the Jenkins pipeline always passed the whole commit list for each JSON file, so the script ran once per file and every run processed the complete list.

To solve the problem, I now execute the bash script in the Jenkins pipeline with an argument, which is the current JSON file of the loop.
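
A minimal sketch of that approach (deploy.sh is an illustrative name; the login, $endpoint setup, and run/status polling are as in the original script and omitted here):

#!/bin/bash
# deploy.sh - handles exactly one JSON file, passed as the first argument
file=$1

# Refuse anything that is not a JSON file (also catches a missing argument)
if [[ $file != *.json ]]; then
    echo "Usage: $0 <file.json>" >&2
    exit 1
fi

echo "Jobexecution for $file"
curl -k -s -H "Authorization: xxx" -X POST -F "definitionsFile=@../$file" "$endpoint/deploy"

In the Jenkinsfile, the loop over the changed files then calls something like sh "./deploy.sh ${f}" once per file.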
