I have a test and a production environment, and I use GitHub branches for deployments: artifacts from the development branch should be deployed to the development environment, and production artifacts to the production environment.
One way to achieve this is to have a different azure-pipeline.yml file on develop and master.
- task: AzureFileCopy@4
  displayName: 'copy spark jobs'
  inputs:
    sourcePath: $(workingDirectory)/transformed
    azureSubscription: developSubscription
    destination: azureBlob
    storage: developStorageAccount
    containerName: sourcecode
    blobPrefix: myBlob
    resourceGroup: developmentSourceGroup
On the master branch I would replace the identifiers for azureSubscription, resourceGroup, and storage with the production values. But this would also mean that with every merge I risk overwriting the production settings with the development settings.
So I wonder: is there a way to define which settings to use depending on the branch that is being deployed?
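Conceptually, something like the following branch-dependent variable block is what I am after (just a sketch to illustrate the idea; the variable names are placeholders and I don't know whether Azure Pipelines supports this):

variables:
  # hypothetical: resolve environment-specific values based on the branch being built
  - ${{ if eq(variables['Build.SourceBranch'], 'refs/heads/master') }}:
      - name: deploySubscription
        value: productionSubscription
      - name: deployStorageAccount
        value: productionStorageAccount
      - name: deployResourceGroup
        value: productionSourceGroup
  - ${{ if ne(variables['Build.SourceBranch'], 'refs/heads/master') }}:
      - name: deploySubscription
        value: developSubscription
      - name: deployStorageAccount
        value: developStorageAccount
      - name: deployResourceGroup
        value: developmentSourceGroup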
CodePudding user response:
This is not a good approach. What you should try instead is to keep a single pipeline definition and use separate stages for the test and production environments. On the production stage, add a condition so that it only deploys from the master branch, while the test stage should deploy from both develop and master. This is because you need to test before going to production: imagine you don't notice an issue when merging code into master; deploying master to the test environment then gives you another chance to detect it.
It could look like this:
stages:
  - stage: Test
    displayName: Test
    condition: and(succeeded(), or(eq(variables['build.sourceBranch'], 'refs/heads/master'), eq(variables['build.sourceBranch'], 'refs/heads/develop'), eq(variables['Build.Reason'], 'PullRequest'), eq(variables['Build.Reason'], 'Manual')))
    jobs:
      - deployment: Test
        displayName: Test
        environment: Test
        workspace:
          clean: all
        strategy:
          runOnce:
            deploy:
              steps:
                - task: AzureFileCopy@4
                  displayName: 'copy spark jobs'
                  inputs:
                    sourcePath: $(workingDirectory)/transformed
                    azureSubscription: developSubscription
                    destination: azureBlob
                    storage: developStorageAccount
                    containerName: sourcecode
                    blobPrefix: myBlob
                    resourceGroup: developmentSourceGroup
  - stage: Production
    displayName: Production
    condition: and(succeeded(), eq(variables['build.sourceBranch'], 'refs/heads/master'))
    jobs:
      - deployment: Production
        displayName: Production
        environment: Production
        workspace:
          clean: all
        strategy:
          runOnce:
            deploy:
              steps:
                - task: AzureFileCopy@4
                  displayName: 'copy spark jobs'
                  inputs:
                    sourcePath: $(workingDirectory)/transformed
                    azureSubscription: productionSubscription
                    destination: azureBlob
                    storage: productionStorageAccount
                    containerName: sourcecode
                    blobPrefix: myBlob
                    resourceGroup: productionSourceGroup
Please notice that I used deployment jobs, which for instance give you approvals if you configure them on the environments.
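As a side note, to avoid repeating the AzureFileCopy inputs in both stages, you could also factor the step out into a steps template and pass the environment-specific values as parameters. A rough sketch (the template file name and parameter names are just examples, not from the question):

# templates/copy-spark-jobs.yml
parameters:
  - name: azureSubscription
    type: string
  - name: storage
    type: string
  - name: resourceGroup
    type: string

steps:
  - task: AzureFileCopy@4
    displayName: 'copy spark jobs'
    inputs:
      sourcePath: $(workingDirectory)/transformed
      azureSubscription: ${{ parameters.azureSubscription }}
      destination: azureBlob
      storage: ${{ parameters.storage }}
      containerName: sourcecode
      blobPrefix: myBlob
      resourceGroup: ${{ parameters.resourceGroup }}

Each stage's deploy steps then shrink to something like:

              steps:
                - template: templates/copy-spark-jobs.yml
                  parameters:
                    azureSubscription: productionSubscription
                    storage: productionStorageAccount
                    resourceGroup: productionSourceGroup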