jenkins-pipeline

Define global environment variables from inside a stage

三世轮回 · Submitted on 2019-12-08 04:37:56
Question: I have env vars defined in my environment directive at the top of the pipeline: environment { var1 = 'sdfsdfdsf' var2 = 'sssssss' } But there are some that I need to set dynamically, or override, in the stages, and if I use an environment {} directive in a stage the vars won't be accessible to other stages. Initially I thought I could define them all with default values in the top environment directive and overwrite them in the pipeline, but this is the behavior I observed: Define var in …
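A minimal sketch of the usual workaround (stage names and values are made up): assigning to the global env map inside a script block makes a value visible to every later stage, unlike a per-stage environment {} directive. Whether a name already declared in the top-level environment {} can be overwritten this way varies by Jenkins version, so a fresh name is the safer path.

```groovy
pipeline {
    agent any
    environment {
        VAR1 = 'static-default'   // fixed at pipeline start
    }
    stages {
        stage('Set') {
            steps {
                script {
                    // env assignments here persist into later stages
                    env.DYNAMIC_VAR = sh(script: 'echo computed-value',
                                         returnStdout: true).trim()
                }
            }
        }
        stage('Use') {
            steps {
                // both the static and the dynamically set variable are visible
                echo "VAR1=${env.VAR1}, DYNAMIC_VAR=${env.DYNAMIC_VAR}"
            }
        }
    }
}
```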

Jenkins sending notifications to the wrong commit id

有些话、适合烂在心里 · Submitted on 2019-12-08 03:58:27
I have several Jenkins pipelines, all importing a shared library from Bitbucket for some utility methods, and I want to send build status notifications to each project's own Bitbucket repo. I installed the Bitbucket build status notifier plugin, but I'm seeing some odd behavior: when bitbucketStatusNotify is called in my pipeline, this happens: Sending build status INPROGRESS for commit <sha> to BitBucket is done! That would be fine, but <sha> is the commit id of the last commit on the shared library, not of the actual project being built, so build status notifications are …
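A hedged sketch of one way around this: the bitbucketStatusNotify step accepts explicit coordinates, so passing the built project's own commit id keeps the shared library's checkout from being reported instead. The repoSlug value below is a placeholder, and exact parameter support may vary by plugin version.

```groovy
node {
    checkout scm
    // capture the commit of the project checkout, not the library's
    def projectSha = sh(script: 'git rev-parse HEAD', returnStdout: true).trim()

    bitbucketStatusNotify(buildState: 'INPROGRESS',
                          repoSlug: 'my-project',   // hypothetical slug
                          commitId: projectSha)

    // ... build steps ...

    bitbucketStatusNotify(buildState: 'SUCCESSFUL',
                          repoSlug: 'my-project',
                          commitId: projectSha)
}
```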

how to fix - stageResult set to FAILURE but still get success in jenkins

落爺英雄遲暮 · Submitted on 2019-12-08 03:36:57
Question: I'm trying to create a very simple pipeline; it has one stage and one step. It uses the job 'build' I created as a freestyle job (which works), but I added an error: the project name parameter has a wrong value, 'test3' instead of 'test'. When I ran it, it stayed green and reported "success" even though it failed. If I open the log I see this: Running in Durability level: MAX_SURVIVABILITY [Pipeline] Start of Pipeline [Pipeline] node Running on Jenkins in C:\Program Files (x86)\Jenkins\workspace …
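A sketch of the relevant behavior (only the job name 'build' comes from the question): the build step's propagate option, which defaults to true, is what makes a downstream failure fail the calling pipeline; if the downstream job is invoked with propagate: false, or its failure is swallowed elsewhere, the caller stays green.

```groovy
pipeline {
    agent any
    stages {
        stage('Downstream') {
            steps {
                // propagate: true (the default) surfaces a downstream
                // failure as a failure of this stage and build
                build job: 'build', propagate: true, wait: true
            }
        }
    }
}
```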

How to configure order in trigger parameterized build on other projects jenkins

非 Y 不嫁゛ · Submitted on 2019-12-08 03:34:55
Question: I have a job {A} which has "Trigger parameterized build on other projects" in its Post-build Actions, where I have set two jobs, {B} and {C}. I want to configure it so that, if {A} is stable, {B} is triggered first and {C} is triggered afterwards, but I do not want to make {C} a child of {B}. Answer 1: In job {A} you need to wait while job {B} is running before you run job {C}. For example: JOB {A} //do something build job: "JOB {B}", quietPeriod: 0, wait: true //"quietPeriod: 0" without pause, "wait: true" waiting …
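The answer's sequential trigger can be sketched as follows (job names B and C are placeholders for the real job names): wait: true blocks until B completes, so C only starts afterwards, without C ever becoming a downstream child of B.

```groovy
node {
    // ... A's own build steps ...
    build job: 'B', quietPeriod: 0, wait: true   // block until B finishes
    build job: 'C', quietPeriod: 0, wait: false  // then fire C and move on
}
```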

Jenkins builds being triggered despite “Don't trigger a build on commit notifications”

社会主义新天地 · Submitted on 2019-12-08 03:02:48
Question: I have a Pipeline job that checks out a git repository (let's call it "repoA") and passes it to some other downstream jobs for further processing. The upstream job's script is stored in a different git repo (let's call it "repoB"). This job is configured with the "Poll SCM" option so that any changes to repoA will trigger it. In the pipeline section, I have selected the "Pipeline script from SCM" option and configured it to get the pipeline script from the master branch of repoB. I have also …
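A hedged sketch of one related technique: when a pipeline script performs its own explicit checkouts, marking them with poll: false and changelog: false keeps "Poll SCM" from treating those repositories as trigger sources. The repository URL below is a placeholder; whether this fully suppresses triggering from the script repo itself depends on the job's SCM configuration.

```groovy
node {
    // this checkout will NOT contribute to SCM polling or the changelog
    checkout changelog: false, poll: false,
             scm: [$class: 'GitSCM',
                   branches: [[name: '*/master']],
                   userRemoteConfigs: [[url: 'https://example.com/repoB.git']]]
}
```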

Perform a git fetch in pipeline 2.0 groovy script

♀尐吖头ヾ · Submitted on 2019-12-08 02:52:12
Question: There is an open bug in Jenkins 2.0 pipeline scripts relating to "included regions" in git, which means that for a large mono-repo, as in my case, each check-in to master will kick off multiple pipelines, which is not the desired behavior. To visualize: top-level: -> application folder 1 -> application folder 2. What I want to do is a git fetch first, so that I can then do a git diff to see whether anything in a particular folder has changed, and if it has, run the pipeline for that …
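The fetch-then-diff guard described above could be sketched like this (the folder path and diff range are assumptions; comparing against the previous successful build's commit would be more robust than HEAD~1):

```groovy
node {
    checkout scm
    sh 'git fetch origin master'
    // list the files changed in the most recent commit
    def changed = sh(script: 'git diff --name-only HEAD~1 HEAD',
                     returnStdout: true).trim()
    if (changed.split('\n').any { it.startsWith('application-folder-1/') }) {
        echo 'Changes detected in application folder 1; running its pipeline'
        // ... build steps for that folder ...
    } else {
        echo 'No relevant changes; skipping this folder'
    }
}
```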

Jenkins pipeline stages - passing whole file

邮差的信 · Submitted on 2019-12-08 02:14:48
Question: Running a Jenkins pipeline (based on Groovy) with stages containing many nodes, I need to pass a list from a file on NodeA in stageA to NodeB in stageB. In stageA on NodeA I run DEVenv = readFile 'somefile.txt'. In stageB I run println DEVenv. So far so good; I get the output in the console. Now, how do I pass the output of that println DEVenv to a file? println DEVenv > otherfile.txt doesn't do the trick. I'm sure it's not such a big deal, but I've been churning the internet for a couple of …
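A sketch of the standard approach (node labels and file names follow the question; shell-style redirection is not Groovy syntax, hence the failure): writeFile is the Pipeline counterpart of readFile, and stash/unstash moves the file itself between nodes.

```groovy
node('NodeA') {
    def DEVenv = readFile 'somefile.txt'
    // write the content out with the writeFile step, not '>' redirection
    writeFile file: 'otherfile.txt', text: DEVenv
    stash name: 'devenv', includes: 'otherfile.txt'
}
node('NodeB') {
    unstash 'devenv'               // materialise the file on this node
    echo readFile('otherfile.txt')
}
```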

Jenkins Pipeline job isn't triggered on GitHub push

心已入冬 · Submitted on 2019-12-08 02:03:59
Question: I've created a Jenkins Pipeline job and want it to be triggered on a push event to my GitHub repo. I've added the repo URL to the job config and checked the "trigger on push" option. I've also added a GitHub token with the needed rights to the GitHub section of the Jenkins configuration. In the GitHub repo I've enabled a webhook for my Jenkins server. And after all these steps, still nothing is triggered after a push to my GitHub repo. Does anyone have any idea what's going on and why Jenkins doesn't trigger the configured pipeline job? Answer 1: Solution …
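One hedged detail worth checking in this situation: for a scripted (non-multibranch) Pipeline job, the GitHub push trigger can also be declared from the script itself, and in either case the job typically has to run once after the trigger is configured before push events start registering.

```groovy
// GitHub plugin's hook-based push trigger, declared in the script;
// a first manual run is usually needed for Jenkins to register it
properties([
    pipelineTriggers([githubPush()])
])

node {
    checkout scm
    // ... build steps ...
}
```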

Jenkins Multibranch Pipelines - Configuring properties in branches?

ぐ巨炮叔叔 · Submitted on 2019-12-08 02:01:14
Question: We have successfully set up a build pipeline using the Jenkins Multibranch Pipeline plugin, which works great most of the time, but one problem nags us: the Jenkinsfile contains a set of properties, and these also show up in the UI, but how can we set up default values for individual branches? This is what the properties definitions look like in our Jenkinsfile: properties([ parameters([ string(defaultValue: 'somevalue', description: 'Some description', name: 'SOME_VALUE'), …
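A sketch of one way to get per-branch defaults (branch names and values are made up): since the Jenkinsfile is ordinary Groovy before the properties() call, the default can be computed from the BRANCH_NAME that the multibranch plugin injects.

```groovy
// choose a branch-specific default before declaring the parameter
def someDefault = (env.BRANCH_NAME == 'master') ? 'prod-value' : 'dev-value'

properties([
    parameters([
        string(defaultValue: someDefault,
               description: 'Some description',
               name: 'SOME_VALUE')
    ])
])
```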

Jenkinsfile - How to pass parameters for all the stages

帅比萌擦擦* · Submitted on 2019-12-08 00:29:29
Question: To explain the issue, consider that I have two Jenkins jobs. Job 1, PARAM_TEST1, accepts a parameterized value called 'MYPARAM'. Job 2, PARAM_TEST2, also accepts a parameterized value called 'MYPARAM'. Sometimes I need to run these two jobs in sequence, so I created a separate pipeline job as shown below, and it works just fine. It also accepts a parameterized value called 'MYPARAM', simply to pass it to the build job steps. pipeline { agent any stages { stage("PARAM 1") { steps { build …
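A sketch of how the truncated pipeline above plausibly continues (job names PARAM_TEST1 and PARAM_TEST2 come from the question; stage names are assumed): the caller's MYPARAM, available as params.MYPARAM in every stage, is forwarded to each downstream job explicitly.

```groovy
pipeline {
    agent any
    parameters {
        string(name: 'MYPARAM', defaultValue: '', description: 'Shared parameter')
    }
    stages {
        stage('PARAM 1') {
            steps {
                build job: 'PARAM_TEST1',
                      parameters: [string(name: 'MYPARAM', value: params.MYPARAM)]
            }
        }
        stage('PARAM 2') {
            steps {
                build job: 'PARAM_TEST2',
                      parameters: [string(name: 'MYPARAM', value: params.MYPARAM)]
            }
        }
    }
}
```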