azure-pipelines

Treat Warnings as Errors using SonarQube Analysis

Posted by 你离开我真会死。 on 2020-01-25 09:22:26
Question: I have been trying to make my solution fail to build on Visual Studio Team Services when there are warnings. I have enabled the option in the VS2017 project to treat warnings as errors, so that it won't build (screenshot: Treat warnings as errors set to All). There is also an MSBuild argument for the same purpose set to true on VSTS (screenshot: Treat warnings as errors set to true). This works: when there is a warning it is treated as an error (e.g. an unused int is a warning and becomes an error, failing the build).
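The MSBuild argument mentioned above can be sketched in pipeline YAML as follows. This is a minimal illustration, not the asker's actual pipeline; the solution path and configuration are assumptions.

```yaml
# Sketch: pass the TreatWarningsAsErrors MSBuild property through the
# VSBuild task so any compiler warning fails the build.
steps:
- task: VSBuild@1
  inputs:
    solution: '**/*.sln'            # path is an assumption
    configuration: 'Release'        # assumption
    msbuildArgs: '/p:TreatWarningsAsErrors=true'
```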

VSTS-Authentication with TFS Server failed. Please check your settings. TFS agent error

Posted by ╄→гoц情女王★ on 2020-01-25 09:09:52
Question: Two builds run successfully using triggers, but the third build fails on trigger with the error "Authentication with TFS Server failed. Please check your settings." Any solution? Answer 1: As the message just above the error says, you have to enable 'Allow Scripts to Access OAuth Token' in the build definition. It is also possible to use a personal access token (PAT) to enable access if you are triggering via a PowerShell script, or you can still rely on 'Allow Scripts to Access OAuth Token'. Answer 2: This…
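In YAML pipelines, the equivalent of the 'Allow Scripts to Access OAuth Token' checkbox is mapping System.AccessToken into the step's environment. A minimal sketch (the REST call shown is only an example of consuming the token):

```yaml
# Sketch: expose the pipeline's OAuth token to a script step.
steps:
- powershell: |
    $headers = @{ Authorization = "Bearer $env:SYSTEM_ACCESSTOKEN" }
    Invoke-RestMethod -Uri "$(System.CollectionUri)$(System.TeamProject)/_apis/build/builds?api-version=5.1" -Headers $headers
  env:
    SYSTEM_ACCESSTOKEN: $(System.AccessToken)   # must be mapped explicitly
```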

upload .pfx certificate through azure devops pipeline

Posted by 无人久伴 on 2020-01-25 08:17:27
Question: I want to upload a .pfx certificate for my app service through an Azure DevOps task. Can someone please help me with how to upload a certificate through an ARM template? Answer 1: You can follow the steps below to upload a certificate with ARM. 1. Go to Secure files under Pipelines > Library and upload your certificate. 2. Add a Download secure file task to download your certificate into your pipeline; you can reference it by the path $(<mySecureFile>.secureFilePath) or $(Agent.TempDirectory). Check here for…
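Steps 1–2 of the answer can be sketched in YAML as follows. The task reference name `myCert` and the file name `mycert.pfx` are hypothetical placeholders.

```yaml
# Sketch: download a secure file uploaded under Pipelines > Library.
steps:
- task: DownloadSecureFile@1
  name: myCert                     # reference name, chosen here for illustration
  inputs:
    secureFile: 'mycert.pfx'       # must match the name in Secure files
- script: echo "Certificate downloaded to $(myCert.secureFilePath)"
```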

using for-loop in azure pipeline jobs

Posted by 若如初见. on 2020-01-25 07:26:06
Question: I want a for-loop that scans the files (values-f1.yaml, values-f2.yaml, ...) in a folder, each time uses a filename as a variable, and runs an Azure Pipelines job to deploy the Helm chart based on that values file. The folder is located in the GitHub repository. So I'm thinking of something like this:

pipeline.yaml

    stages:
    - stage: Deploy
      variables:
        azureResourceGroup: ''
        kubernetesCluster: ''
        subdomain: ''
      jobs:
        ${{ each filename in /myfolder/*.yaml }}:
          valueFile: $filename
…
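`${{ each }}` cannot glob files in the repository; template expressions run before the repo is checked out. One common workaround, sketched below under the assumption that the list of values files can be maintained as a pipeline parameter, is to loop over an object parameter instead (the chart and release names are hypothetical):

```yaml
parameters:
- name: valueFiles
  type: object
  default: [values-f1.yaml, values-f2.yaml]

jobs:
- ${{ each f in parameters.valueFiles }}:
  # Job names must be unique and cannot contain dots or dashes.
  - job: Deploy_${{ replace(replace(f, '.', '_'), '-', '_') }}
    steps:
    - script: helm upgrade --install myrelease ./mychart -f myfolder/${{ f }}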

Azure Pipelines second job does not find results of first job

Posted by 怎甘沉沦 on 2020-01-24 23:59:28
Question: I am getting started with azure-pipelines.yml. I wanted to have two jobs within the same stage: one to build a solution and the other to run unit tests. The problem is that the second job executes a script step and does not find the Release folder that the previous job should have created:

    trigger:
    - master

    pool:
      vmImage: 'ubuntu-18.04'

    stages:
    - stage: CI
      jobs:
      - job: Build
        steps:
        - task: NuGetAuthenticate@0
        - script: dotnet restore --no-cache --force
        - script: dotnet build --configuration…
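Each job may run on a fresh agent, so files produced by one job are not visible to the next; the build output has to travel as a pipeline artifact. A minimal sketch (the output path is an assumption):

```yaml
jobs:
- job: Build
  steps:
  - script: dotnet build --configuration Release
  - publish: $(System.DefaultWorkingDirectory)/bin/Release   # path is an assumption
    artifact: drop
- job: Test
  dependsOn: Build
  steps:
  - download: current          # download the 'drop' artifact from this run
    artifact: drop
  - script: ls $(Pipeline.Workspace)/drop
```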

Can we publish artifacts in release pipeline - Azure devOps?

Posted by 百般思念 on 2020-01-24 23:57:11
Question: I have a Java application and am trying to use Azure DevOps to build and deploy it. I am able to build and publish the artifact in the build pipeline. In the release pipeline I have stages (dev/train/prod); in each stage I have a Maven task to detokenize the build for that environment, which works, but I also want to publish the result as an artifact, similar to the build pipeline. Is there a task to do that, or any alternate approach? Answer 1: Can we publish artifacts in release…
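Classic release stages do not support the build pipeline's publish-artifact tasks. One hedged alternative, assuming an Azure Artifacts feed is available, is to push the detokenized output as a Universal Package from the release stage (the feed, package name, and directory below are placeholders):

```yaml
# Sketch: persist release-stage output to an Azure Artifacts feed.
- task: UniversalPackages@0
  inputs:
    command: publish
    publishDirectory: '$(System.DefaultWorkingDirectory)/detokenized'  # assumption
    vstsFeedPublish: 'MyFeed'                # hypothetical feed
    vstsFeedPackagePublish: 'detokenized-config'
    versionOption: patch
```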

How to create a Build Completion Trigger after creating YAML

Posted by 浪子不回头ぞ on 2020-01-24 21:48:05
Question: Has anyone figured out how to actually 'use the classic editor' once you've gone through the effort of creating a YAML pipeline? I have two YAML pipelines I'm trying to link so that pipeline A kicks off after a successful completion of pipeline B. According to Microsoft's own documentation: "Build completion triggers are not yet supported in YAML syntax. After you create your YAML build pipeline, you can use the classic editor to specify a build completion trigger." If it were a snake, it probably would have…
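The limitation quoted above was later lifted: pipeline-completion triggers can now be declared in YAML through a `resources.pipelines` block. A minimal sketch, where the alias and source name are assumptions:

```yaml
# Sketch: run this pipeline (A) when pipeline B completes successfully.
resources:
  pipelines:
  - pipeline: pipeB          # alias used within this pipeline; hypothetical
    source: 'Pipe-B'         # name of the triggering pipeline; assumption
    trigger: true            # fire on successful completion of any branch
```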

Share variables across stages in Azure DevOps Pipelines

Posted by 被刻印的时光 ゝ on 2020-01-24 19:45:05
Question: I am trying to figure out how to share custom variables across stages in my Azure DevOps pipeline. Below is my script with two stages. I am setting curProjVersion as an output variable and trying to access it from a different stage. Am I doing it right?

    stages:
    - stage: Build
      displayName: Build stage
      jobs:
      - job: VersionCheck
        pool:
          vmImage: 'ubuntu-latest'
        displayName: Version Check
        continueOnError: false
        steps:
        - script: |
            echo "##vso[task.setvariable variable=curProjVersion;isOutput=true]1.4.5"
…
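One missing piece in the snippet above is that the script step must be named before a later stage can reference its output. A minimal sketch, with the step name `setVersion` added for illustration:

```yaml
stages:
- stage: Build
  jobs:
  - job: VersionCheck
    steps:
    - script: echo "##vso[task.setvariable variable=curProjVersion;isOutput=true]1.4.5"
      name: setVersion           # the step must be named to be referenced
- stage: Deploy
  dependsOn: Build
  variables:
    # stage.job.outputs['step.variable']
    curProjVersion: $[ stageDependencies.Build.VersionCheck.outputs['setVersion.curProjVersion'] ]
  jobs:
  - job: UseVersion
    steps:
    - script: echo "Version is $(curProjVersion)"
```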

Generate job matrix from all possible combinations of input parameters

Posted by 对着背影说爱祢 on 2020-01-24 19:24:26
Question: I would like to generate jobs in Azure Pipelines using the matrix strategy, but without explicitly listing all possible combinations. Instead of:

    matrix:
      core211:
        module: core
        scala: 2.11
      python211:
        module: python
        scala: 2.11
      libraries211:
        module: libraries
        scala: 2.11
      core212:
        module: core
        scala: 2.12
      python212:
        module: python
        scala: 2.12
      libraries212:
        module: libraries
        scala: 2.12

I want to do:

    matrix:
      combinations:
        module: ["libraries", "python", "core"]
        scala: ["2.11", "2.12"]

to…
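The `matrix` keyword itself does not compute cross-products, but nested `${{ each }}` loops over object parameters can emit one job per combination, which achieves the same fan-out. A sketch under that assumption:

```yaml
parameters:
- name: modules
  type: object
  default: [libraries, python, core]
- name: scalaVersions
  type: object
  default: ['2.11', '2.12']

jobs:
- ${{ each m in parameters.modules }}:
  - ${{ each s in parameters.scalaVersions }}:
    # Job names cannot contain dots, so strip them from the Scala version.
    - job: ${{ m }}_${{ replace(s, '.', '') }}
      steps:
      - script: echo Building ${{ m }} for Scala ${{ s }}
```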
