jenkins-pipeline

Pass Jenkins build parameters to pipeline nodes

大兔子大兔子 submitted on 2019-12-01 02:00:28
Question: I created a new Jenkins pipeline. The pipeline is (currently) parametrized with a single boolean option named VAR_A. My pipeline script is:

    node('windows') {
        echo "$VAR_A"
        bat 'env'
    }

When I manually build the project with VAR_A checked, "true" is echoed, as expected. The list of environment variables, however, does not show VAR_A=true. I am able to get env to show VAR_A if I wrap the call in a withEnv block:

    node('windows') {
        echo "$VAR_A"
        withEnv(["VAR_A=$VAR_A"]) {
            bat 'env'
        }
    }

I will…
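A minimal sketch of that workaround, assuming the job defines a boolean build parameter named VAR_A: reading the parameter through params and exporting it explicitly with withEnv makes it visible to batch/shell steps as a real environment variable.

    // Sketch: making a build parameter visible to child processes.
    // Assumes the job has a boolean build parameter named VAR_A.
    node('windows') {
        // params.VAR_A is the canonical way to read a build parameter
        withEnv(["VAR_A=${params.VAR_A}"]) {
            bat 'echo VAR_A is %VAR_A%'   // the variable is now in the process environment
        }
    }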

Jenkins Pipeline Syntax for “p4sync”

只谈情不闲聊 submitted on 2019-12-01 01:49:07
Question: I'm trying to sync to Perforce in my pipeline script, but from the documentation I don't see a way to set the "workspace behavior", even though the plugin itself seems to have that capability. I want the "workspace" to be equivalent to the "Manual (custom view)" setting I can configure in the UI as described here. What parameters do I need to pass to the p4sync task to achieve that?

Answer 1: You will need to use the full checkout DSL; the p4sync DSL is only basic. The easiest way is to use the…
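For illustration, a sketch of the full checkout form the answer points at, with a manual workspace view. The class and parameter names follow what the P4 plugin's Snippet Generator typically emits and may differ by plugin version; the credential ID and depot paths are hypothetical.

    // Sketch only: generate the exact form with the Snippet Generator for your plugin version.
    checkout([
        $class: 'PerforceScm',
        credential: 'my-p4-credential',                  // hypothetical credential ID
        populate: [$class: 'AutoCleanImpl', quiet: true],
        workspace: [
            $class: 'ManualWorkspaceImpl',               // "Manual (custom view)" behaviour
            name: 'jenkins-${NODE_NAME}-${JOB_NAME}',
            spec: [
                $class: 'WorkspaceSpec',
                view: '//depot/project/... //jenkins-${NODE_NAME}-${JOB_NAME}/...'
            ]
        ]
    ])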

Jenkins Multi-branch pipeline doesn't schedule tag jobs

自古美人都是妖i submitted on 2019-12-01 01:29:40
Question: I'm trying to get Jenkins' multibranch pipeline job to build tags in a similar manner to branches. In Jenkins 2.73 (not sure when the functionality was added), multibranch projects can be configured to retrieve both branches and tags from the source repository. Initially I thought this would be perfect for my needs (my Jenkinsfile can now build development or production builds from the same place in Jenkins). [Screenshot: multibranch job with tags discovery configured] I have the build process itself up…
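Worth noting: tag discovery alone only indexes the tags; out of the box they are not scheduled automatically, and the fix usually cited is a build-strategy plugin such as basic-branch-build-strategies. Once a tag build does run, declarative pipeline can gate release-only work on it; a minimal sketch:

    pipeline {
        agent any
        stages {
            stage('Release') {
                when { buildingTag() }   // only runs when this build was created from a tag
                steps {
                    echo "Releasing ${env.TAG_NAME}"
                }
            }
        }
    }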

Jenkins-pipeline Extract and Set variables from properties file in groovy

浪子不回头ぞ submitted on 2019-11-30 23:28:45
To begin, I'm writing the pipeline entirely as Groovy to be checked in to Git. Please do not provide any GUI-based solutions. My problem statement is: extract a variable from a file and set it equal to a Groovy object. What I've tried:

    def SERVICE_MAJOR_VERSION
    node {
        runGitClone(GIT_REPO_URL, GIT_HASH)
        def conf = readFile("gradle.properties")
        echo conf
        //THE BELOW COMMENT DOESN'T WORK
        //SERVICE_MAJOR_VERSION = loadEnvFromFile("SERVICE_VERSION_MAJOR", "gradle.properties", true, SERVICE_VERSION_MAJOR)
    }
    def runGitClone(git_repo_url, git_hash) {
        checkout changelog: false, poll: false, scm: […
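If the Pipeline Utility Steps plugin is available, its readProperties step covers this case directly; a minimal sketch, reusing the SERVICE_VERSION_MAJOR key from the snippet above:

    def SERVICE_MAJOR_VERSION
    node {
        // readProperties comes from the Pipeline Utility Steps plugin
        def props = readProperties file: 'gradle.properties'
        SERVICE_MAJOR_VERSION = props['SERVICE_VERSION_MAJOR']
        echo "Major version: ${SERVICE_MAJOR_VERSION}"
    }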

Jenkins continue pipeline on failed stage

守給你的承諾、 submitted on 2019-11-30 22:55:50
I have a Jenkins setup with a bunch of pipelines. I wrote a new pipeline which can start all pipelines at once. I would like the other stages to build even if one of them fails. The script currently looks like this:

    stage 'CentOS6'
    build 'centos6.testing'
    stage 'CentOS7'
    build 'centos7.testing'
    stage 'Debian7'
    build 'debian7-x64.testing'
    stage 'Debian8'
    build 'debian8-x64.testing'

The build scripts themselves contain the node they should run on. How can the script continue with the following stages even if one of them fails? Cheers

If you use the parallel step, this should work as you expect by…
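A sketch of that approach: parallel runs the downstream jobs side by side, and propagate: false (a documented option of the build step) keeps one failure from aborting the rest; the overall result is then set from the individual outcomes.

    node {
        def jobs = ['centos6.testing', 'centos7.testing',
                    'debian7-x64.testing', 'debian8-x64.testing']
        def branches = [:]
        jobs.each { jobName ->
            branches[jobName] = {
                // propagate: false -> a failing downstream build does not abort this pipeline
                def result = build(job: jobName, propagate: false).result
                if (result != 'SUCCESS') {
                    currentBuild.result = 'UNSTABLE'   // record the failure without stopping
                }
            }
        }
        parallel branches
    }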

Can I use a Closure to define a stage in a Jenkins Declarative Pipeline?

不问归期 submitted on 2019-11-30 22:19:43
I'm trying to do something like this:

    def makeStage = {
        stage('a') {
            steps {
                echo 'Hello World'
            }
        }
    }
    pipeline {
        agent none
        stages {
            makeStage()
        }
    }

But it gives me this exception:

    WorkflowScript: 11: Expected a stage @ line 11, column 5.
       makeStage()
       ^

Is it possible to define a stage as an external closure, and if so, how?

You can't define stages outside the declarative pipeline. The main purpose of declarative pipeline is to provide a simplified and opinionated syntax so you can focus on what should be done (by using some of the available steps) and not how to do it. If you are interested in…
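By contrast, scripted pipeline has no such restriction, since it is plain Groovy; a minimal sketch of the same idea in scripted form:

    // Scripted pipeline: stages are ordinary steps, so closures work
    def makeStage = {
        stage('a') {
            echo 'Hello World'
        }
    }
    node {
        makeStage()
    }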

How to get svn version number from checkout for use in dsl

只谈情不闲聊 submitted on 2019-11-30 22:06:04
I created a pipeline job and would like to get the SVN revision number to enable further downstream processing in a call to a shell script. I am using a pipeline script similar to the following:

    node {
        // Mark the code checkout 'stage'....
        stage 'Checkout'
        // Get some code from an SVN repository
        checkout([
            $class: 'SubversionSCM',
            additionalCredentials: [],
            excludedCommitMessages: '',
            excludedRegions: '',
            excludedRevprop: '',
            excludedUsers: '',
            filterChangelog: false,
            ignoreDirPropChanges: false,
            includedRegions: '',
            locations: [[ ... ]],
            workspaceUpdater: [$class: 'UpdateUpdater']
        ])
        def…
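One hedged approach: with reasonably recent versions of the Pipeline SCM step, checkout returns a map of the SCM-provided environment variables, which for Subversion includes SVN_REVISION. A sketch (repository URL and downstream script name are hypothetical):

    node {
        stage('Checkout') {
            // checkout returns the SCM environment variables as a map
            def scmVars = checkout([
                $class: 'SubversionSCM',
                locations: [[remote: 'https://example.com/svn/project/trunk']],  // hypothetical URL
                workspaceUpdater: [$class: 'UpdateUpdater']
            ])
            echo "Checked out revision ${scmVars.SVN_REVISION}"
            sh "./process.sh ${scmVars.SVN_REVISION}"   // hypothetical downstream script
        }
    }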

Docker in Docker - volumes not working: Full of files in 1st level container, empty in 2nd tier

梦想与她 submitted on 2019-11-30 21:32:29
I am running Docker in Docker (specifically to run Jenkins, which then runs Docker builder containers to build project images, then runs those along with the test containers). This is how the Jenkins image is built and started:

    docker build --tag bb/ci-jenkins .
    mkdir $PWD/volumes/
    docker run -d --network=host \
        -v /var/run/docker.sock:/var/run/docker.sock \
        -v /usr/bin/docker:/usr/bin/docker \
        -v $PWD/volumes/jenkins_home:/var/jenkins_home \
        --name ci-jenkins bb/ci-jenkins

Jenkins works fine. But then there is a Jenkinsfile-based job, which runs this:

    docker run -i --rm -v /var/jenkins_home…
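Worth noting for this setup: mounting the host's docker.sock means the Docker CLI inside Jenkins talks to the host daemon, so any -v path in a nested docker run is resolved on the host, not inside the Jenkins container. A sketch of one common workaround in a Jenkinsfile: pass a path the host daemon can actually see (the host path below is hypothetical).

    node {
        // /var/jenkins_home inside this container is $PWD/volumes/jenkins_home on the HOST,
        // and the host daemon only understands host paths.
        def hostJenkinsHome = '/home/ci/volumes/jenkins_home'   // hypothetical host-side path
        sh "docker run -i --rm -v ${hostJenkinsHome}/workspace:/workspace busybox ls /workspace"
    }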

How to use @Library in an imported groovy script in Jenkins declarative pipeline?

孤者浪人 submitted on 2019-11-30 21:20:53
What I have is the following: a global shared library created as described here. Nothing special: one script in the vars folder called deleteFile.groovy; tried it, it works. The library is called myOneLib. And a pipeline script called firstPipe.groovy:

    @Library('myOneLib') _

    def execute(String zCmakeListsPath) {
        stage('some kind of stage 2') {
            echo "Hello from stage 1 with " + zCmakeListsPath
            echo "var attempt ${env.mySrcDir}"
        }
        stage('second stage') {
            echo "and one from stage 2"
            echo "param was " + zCmakeListsPath
            echo "var attempt ${env.myBuildDir}"
            //call function from global lib
            deleteFile 'for 3rd party…
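For scripts pulled in with load, the @Library annotation form often does not take effect; the dynamic library step is the usual alternative. A minimal sketch, keeping the names from the question:

    // firstPipe.groovy, loaded from the main Jenkinsfile via load()
    def execute(String zCmakeListsPath) {
        library 'myOneLib'             // dynamic equivalent of @Library, usable outside the Jenkinsfile
        stage('cleanup') {
            deleteFile zCmakeListsPath // step provided by vars/deleteFile.groovy
        }
    }
    return this                        // required so load() can call execute()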

Jenkins multi-branch pipeline and specifying upstream projects

一世执手 submitted on 2019-11-30 20:50:23
We currently generate a lot of Jenkins jobs on a per-Git-branch basis using the Jenkins Job DSL; the multibranch pipeline plugin looks like an interesting way to potentially get first-class job-generation support using Jenkinsfiles and reduce the amount of Job DSL we maintain. For example, we have libwidget-server and widget-server develop-branch projects. When the libwidget-server build finishes, the widget-server job is triggered (for the develop branch). This applies to other branches too. This makes use of the "Build after other projects are built" option to trigger upon completion of an upstream…
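In a multibranch setup, the same effect is usually expressed per branch from inside the Jenkinsfile with the upstream trigger; a sketch assuming the job names from the question (branch names containing / may need URL-encoding in the job path):

    pipeline {
        agent any
        triggers {
            // Trigger this branch's job when the same branch of libwidget-server succeeds
            upstream(upstreamProjects: "libwidget-server/${env.BRANCH_NAME}",
                     threshold: hudson.model.Result.SUCCESS)
        }
        stages {
            stage('Build') {
                steps {
                    echo 'Building widget-server'
                }
            }
        }
    }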