jenkins-pipeline

Parsing an XML file within a Jenkins pipeline

旧城冷巷雨未停 submitted on 2019-12-05 01:36:21
Question: I have an XML file which I'd like to use as input for a pipeline script. The problem is that the XmlParser isn't serializable, so I put it in a @NonCPS function, but because of that I lose the Node object. This is the pipeline script:

    def buildPlanPath = 'C:\\buildPlan_test.xml'

    @NonCPS
    groovy.util.Node getBuildPlan(path) {
        new XmlParser().parseText(readFile(path))
    }

    node {
        //def buildPlan = new XmlParser().parseText(readFile(buildPlanPath))
        groovy.util.Node buildPlan = getBuildPlan(buildPlanPath)
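A commonly suggested workaround, shown here only as a sketch (the XML structure and extracted fields are hypothetical, not the poster's actual build plan): call readFile in CPS code, do the parsing inside the @NonCPS method, and return only serializable data (strings, maps, lists) rather than the groovy.util.Node tree, so nothing non-serializable is ever held in a pipeline-local variable.

    // Sketch: parse in @NonCPS, return plain strings that survive pipeline serialization.
    @NonCPS
    def extractTaskNames(String xmlText) {
        def root = new XmlParser().parseText(xmlText)
        root.children().findAll { it instanceof groovy.util.Node }
                       .collect { it.name().toString() }
    }

    node {
        def xmlText = readFile('buildPlan_test.xml')   // readFile stays in CPS code
        def taskNames = extractTaskNames(xmlText)
        echo "Tasks in build plan: ${taskNames}"
    }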

Ideas to implement dynamic parallel builds using the Jenkins pipeline plugin

不问归期 submitted on 2019-12-04 23:51:16
Question: I have a requirement to run a set of tasks for a build in parallel. The tasks for a build are dynamic and may change, and I need some help with the implementation; the details are below. The task details for a build will be generated dynamically in an XML file, which will contain information about which tasks have to be executed in parallel and which in serial. Example: say there is a build A with the tasks and execution order below: first task1 has to be executed, then task2 and task3 will be executed
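A minimal sketch of the usual approach for dynamic parallelism in a scripted pipeline, assuming the task names have already been extracted from the XML (the list below is hypothetical): build a map of branch names to closures and hand it to the parallel step.

    node {
        // Hypothetical list; in the poster's case these would come from the parsed XML.
        def parallelTaskNames = ['task2', 'task3']

        stage('task1') {
            echo 'running task1 first'
        }

        def branches = [:]
        for (name in parallelTaskNames) {
            def taskName = name            // capture the current value for the closure
            branches[taskName] = {
                echo "running ${taskName}"
            }
        }

        stage('parallel tasks') {
            parallel branches
        }
    }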

Jenkins pipeline: multiline shell commands with a pipe

☆樱花仙子☆ submitted on 2019-12-04 22:40:12
I am trying to create a Jenkins pipeline where I need to execute multiple shell commands and use the result of one command in the next. I found that wrapping the commands in a pair of triple single quotes (''') can accomplish this. However, I am facing issues when using a pipe to feed the output of one command into another. For example:

    stage('Test') {
        sh '''
            echo "Executing Tests"
            URL=`curl -s "http://localhost:4040/api/tunnels/command_line" | jq -r '.public_url'`
            echo $URL
            RESULT=`curl -sPOST "https://api.ghostinspector.com/v1/suites/[redacted]/execute/?apiKey=[redacted]
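One alternative worth noting, as a sketch only (the URLs are the poster's, the variable names are hypothetical): capture the piped command's output on the Groovy side with returnStdout instead of a shell variable, so the value can be reused in later steps without quoting gymnastics inside the triple-quoted block.

    stage('Test') {
        // The pipe runs entirely inside one sh invocation; Groovy gets the trimmed result.
        def url = sh(
            returnStdout: true,
            script: 'curl -s "http://localhost:4040/api/tunnels/command_line" | jq -r ".public_url"'
        ).trim()
        echo "Tunnel URL: ${url}"
    }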

Jenkins 2 Multibranch Pipelines - How can I limit the visibility/execution of branches using the Role Strategy Plugin?

非 Y 不嫁゛ submitted on 2019-12-04 22:18:46
Question: I am using multibranch pipelines in projects with two branches: develop and master. This creates two subprojects, one for each branch:

    App_Pipeline
    |---master
    |---develop

I have set up the Role Strategy plugin to control the authorization (visibility) of the jobs/pipelines depending on the assigned role. Project roles:

    manager   : uses the regexp App_.*
    developer : uses the regexp App_.*

With my current roles, both types of users see the superproject (App_Pipeline) and can execute both
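A hedged sketch of how the patterns could be split, under the assumption that the Role Strategy plugin matches the full item name and that multibranch branch jobs are named App_Pipeline/master and App_Pipeline/develop:

    manager   : App_.*                      (folder plus every branch job)
    developer : App_[^/]*(/develop.*)?      (the folder itself plus only the develop branch job)

The developer pattern keeps the folder visible via the first part and makes the branch suffix optional, so only the develop branch job matches underneath it; whether this is sufficient depends on the plugin version and how permissions inherit from the folder.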

Docker command not found in local Jenkins multibranch pipeline

夙愿已清 submitted on 2019-12-04 21:00:37
Question: I have a BookStore Spring Boot project that needs to be deployed through Jenkins. Docker is installed on my local machine (macOS) and the Jenkinsfile is created as follows:

    pipeline {
        agent {
            docker {
                image 'maven:3-alpine'
                // This exposes the application through port 8081 to the outside world
                args '-u root -p 8081:8081 -v /var/run/docker.sock:/var/run/docker.sock '
            }
        }
        stages {
            stage('Build') {
                steps {
                    sh 'mvn -B -DskipTests clean package'
                }
            }
            stage('Test') {
                steps {
                    //sh 'mvn test'
                    sh 'echo "test"'
                }
                post {
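A likely cause, offered as an assumption rather than a confirmed diagnosis: mounting /var/run/docker.sock gives the container access to the host's Docker daemon, but maven:3-alpine does not ship a docker client binary, so "docker: command not found" is expected inside it. One sketch of a workaround is to run the docker-related stage on an image that bundles the CLI (the image tag below is hypothetical):

    stage('Docker Build') {
        agent {
            docker {
                // hypothetical image/tag; the point is that it includes the docker CLI
                image 'docker:19.03'
                args '-v /var/run/docker.sock:/var/run/docker.sock'
            }
        }
        steps {
            sh 'docker build -t bookstore:latest .'
        }
    }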

How to set Jenkins Declarative Pipeline environments with Global Variables?

三世轮回 submitted on 2019-12-04 20:41:33
I am trying to do this:

    pipeline {
        agent any
        environment {
            LOCAL_BUILD_PATH = env.WORKSPACE + '/build/'
        }
        stages {
            stage('Stuff') {
                steps {
                    echo LOCAL_BUILD_PATH
                }
            }
        }
    }

Result: null/build/

How can I use global environment variables to create my environments?

I think you should use:

    steps {
        echo "${env.LOCAL_BUILD_PATH}"
    }

because in the environment block you are defining environment variables which are later accessible as env.your-variable-name.

So this is the method that I ended up using:

    pipeline {
        agent { label 'master' }
        stages {
            stage ("Setting Variables") {
                steps {
                    script {
                        LOCAL_BUILD_PATH = "$env.WORKSPACE/build"
                    }
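Another commonly suggested fix, shown as a sketch (behavior can vary by Jenkins and plugin version, so treat this as an assumption to verify): reference WORKSPACE with string interpolation inside the environment block instead of concatenating env.WORKSPACE, which evaluated to null above.

    pipeline {
        agent any
        environment {
            // Interpolated at use time, so WORKSPACE is resolved on the allocated node.
            LOCAL_BUILD_PATH = "${WORKSPACE}/build/"
        }
        stages {
            stage('Stuff') {
                steps {
                    echo "${env.LOCAL_BUILD_PATH}"
                }
            }
        }
    }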

Halt a Jenkins pipeline job early

[亡魂溺海] submitted on 2019-12-04 17:45:59
Question: In our Jenkins Pipeline job we have a couple of stages, and what I would like is that if any of the stages fail, the build stops and does not continue on to the later stages. Here's an example of one of the stages:

    stage('Building') {
        def result = sh returnStatus: true, script: './build.sh'
        if (result != 0) {
            echo '[FAILURE] Failed to build'
            currentBuild.result = 'FAILURE'
        }
    }

The script will fail, and the build result will update, but the job continues on to the next stages. How can I
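A minimal sketch of one way to stop early, assuming an aborted build is acceptable: the error step throws an exception, so execution halts immediately instead of only marking the result FAILURE.

    stage('Building') {
        def result = sh returnStatus: true, script: './build.sh'
        if (result != 0) {
            // error throws, so no later stages run
            error '[FAILURE] Failed to build'
        }
    }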

Why am I not able to run a batch file in a Jenkins pipeline running on Windows 10?

时光怂恿深爱的人放手 submitted on 2019-12-04 16:51:12
I'm trying to run a batch script that is present inside the Jenkins workspace. I have written a Groovy script as below to do this:

    stage('batchscript') {
        steps {
            bat 'start cmd.exe /c C:\\Program Files (x86)\\Jenkins\\workspace\\jenkins Project\\batchfile.bat'
        }
    }

When I build the job it should open a new command window and run my batch file in a new command prompt, executing all the bat commands. The build is successful but no command window opens up. Any suggestion will be helpful.

Jenkins is meant to execute shell commands in background mode, not in interactive mode. Single line: If you need to
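A sketch of the non-interactive variant (using the poster's path, which is assumed to be correct): since Jenkins will not pop up a visible console window anyway, drop "start", quote the whole path because it contains spaces, and read the batch file's output from the step's console log instead.

    stage('batchscript') {
        steps {
            // Output appears in the build's console log, not in a separate window.
            bat '"C:\\Program Files (x86)\\Jenkins\\workspace\\jenkins Project\\batchfile.bat"'
        }
    }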

Read interactive input in Jenkins pipeline to a variable

百般思念 submitted on 2019-12-04 15:36:57
In a Jenkins pipeline, I want to provide an option for the user to give interactive input at run time. I want to understand how we can read the user input in the Groovy script; a sample would help. I'm referring to the following documentation: https://jenkins.io/doc/pipeline/steps/pipeline-input-step/

EDIT-1: After some trials I've got this working:

    pipeline {
        agent any
        stages {
            stage("Interactive_Input") {
                steps {
                    script {
                        def userInput = input(
                            id: 'userInput',
                            message: 'Enter path of test reports:?',
                            parameters: [
                                [$class: 'TextParameterDefinition', defaultValue: 'None',
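For reference, a minimal sketch of reading the input into a variable (the parameter name REPORT_PATH is hypothetical): when the input step has a single parameter, it returns that parameter's value directly, which can then be used in later steps.

    script {
        def reportPath = input(
            message: 'Enter path of test reports:',
            parameters: [string(name: 'REPORT_PATH', defaultValue: 'None')]
        )
        echo "Using report path: ${reportPath}"
    }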

Sync two git repositories Jenkins Pipeline

空扰寡人 submitted on 2019-12-04 15:25:01
I want to synchronize two repositories each time a build is made. I have seen this script, but I don't know how to set the remote with credentials as well.

    # clone the repository
    git clone --bare $ORIGIN_URL

    # add a remote repository
    cd $REPO_NAME
    git remote add --mirror=fetch repo1 $REPO1_URL

    # update the local copy from the first repository
    git fetch origin --tags

    # update the local copy with the second repository
    git fetch repo1 --tags

    # sync back the 2 repositories
    git push origin --all
    git push origin --tags
    git push repo1 --all
    git push repo1 --tags

Pipeline:

    node('centos-small') {
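One way to supply credentials to the push, as a sketch only (the credential ID and repository URLs below are hypothetical placeholders, not the poster's): wrap the shell block in withCredentials from the Credentials Binding plugin and let the shell expand the username/token in the remote URL, so the secret never appears in the Groovy script.

    node('centos-small') {
        withCredentials([usernamePassword(credentialsId: 'repo1-creds',
                                          usernameVariable: 'GIT_USER',
                                          passwordVariable: 'GIT_TOKEN')]) {
            sh '''
                git clone --bare https://example.com/origin/repo.git repo
                cd repo
                git remote add --mirror=fetch repo1 "https://${GIT_USER}:${GIT_TOKEN}@example.com/mirror/repo.git"
                git fetch origin --tags
                git fetch repo1 --tags
                git push repo1 --all
                git push repo1 --tags
            '''
        }
    }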