jenkins-pipeline

Windows START command not working from Jenkins Pipeline

喜欢而已 submitted on 2019-12-06 05:28:59
I have the following code in my script:

echo Trying to kill all node processes.
taskkill /f /im node.exe
echo Running the application...
start npm run prod
echo Success...

The script runs fine if I open a command prompt and run it from there, but it doesn't start the npm process when I run it from a Jenkins pipeline. The strange thing is that the build still succeeds. Can anyone help me solve this riddle? Thanks.

Update 1: this is the output in Jenkins:

up to date in 23.58s
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Deployment)
[Pipeline] bat
[ABC Pipeline] Running batch script *******
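A frequent cause of this behavior is Jenkins' ProcessTreeKiller, which kills every process spawned by a build step once the step finishes, so the detached npm process dies even though the build reports success. A commonly suggested workaround is to change the JENKINS_NODE_COOKIE environment variable (BUILD_ID on older Jenkins versions) before starting the long-running process. A sketch of the batch step, with the stage name taken from the log above:

```groovy
stage('Deployment') {
    // Changing JENKINS_NODE_COOKIE hides the started process from
    // Jenkins' ProcessTreeKiller, so it survives the end of the build.
    bat '''
        echo Trying to kill all node processes.
        taskkill /f /im node.exe
        echo Running the application...
        set JENKINS_NODE_COOKIE=dontKillMe
        start "npm" npm run prod
        echo Success...
    '''
}
```

Note that `start "npm" npm run prod` passes an explicit window title; without one, `start` can treat its first quoted argument as the title rather than the command.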

Jenkins job DSL plugin - hidden parameter

你离开我真会死。 submitted on 2019-12-06 04:54:08
I am using the Jenkins Hidden Parameter plugin, but I can't find the syntax to write it in DSL the way I do with other parameters. For example: https://jenkinsci.github.io/job-dsl-plugin/#method/javaposse.jobdsl.dsl.helpers.BuildParametersContext.activeChoiceParam Is there any way to express a hidden parameter in DSL?

Job DSL has no built-in support for the Hidden Parameter plugin, so it is not mentioned in the API viewer. But it is supported by the Automatically Generated DSL:

job('example') {
    parameters {
        wHideParameterDefinition {
            name('FOO')
            defaultValue('bar')
            description('lorem ipsum')
        }
    }
}

Store the console output of a build step execution in Jenkins pipeline

て烟熏妆下的殇ゞ submitted on 2019-12-06 04:49:46
In my Jenkins pipeline I use "Execute shell command" to run my Gradle build script. Now I want to check whether the build has failed, in which case I would like to read the console output, store it in a string, and publish it to a Slack channel. The code that I have tried goes as follows:

try {
    for (int i = 0; i < noOfComponents; i++) {
        component = compileProjectsWithPriority[i]
        node {
            out = sh script: "cd /home/jenkins/projects/${component} && ${gradleHome}/bin/gradle build", returnStdout:
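One complication with `returnStdout: true` is that when the script fails, the `sh` step throws before returning, so the captured output is lost. A sketch that instead redirects the build log to a file, checks the exit status, and posts the log with the Slack Notification plugin's `slackSend` step (the channel name is a placeholder; `component` and `gradleHome` are assumed to be defined as in the question):

```groovy
node {
    // Mirror the build output into a file so it is available even on failure.
    def status = sh(
        script: "cd /home/jenkins/projects/${component} && " +
                "${gradleHome}/bin/gradle build > build.log 2>&1",
        returnStatus: true
    )
    if (status != 0) {
        def log = readFile('build.log')
        // Requires the Slack Notification plugin; '#builds' is illustrative.
        slackSend(channel: '#builds', color: 'danger',
                  message: "Build of ${component} failed:\n${log}")
        error("Build of ${component} failed")
    }
}
```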

Docker pipeline's “inside” not working in Jenkins slave running within Docker container

假装没事ソ submitted on 2019-12-06 04:46:42
I'm having issues getting a Jenkins pipeline script to work that uses the Docker Pipeline plugin to run parts of the build within a Docker container. Both the Jenkins server and the slave run within Docker containers themselves.

Setup:
- Jenkins server running in a Docker container
- Jenkins slave based on a custom image (https://github.com/simulogics/protokube-jenkins-slave), also running in a Docker container
- Docker daemon container based on the docker:1.12-dind image

Slave started like so: docker run -
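For reference, this is the shape of what the Docker Pipeline plugin's `inside` is expected to do once the slave's `docker` client can reach a daemon (for example via a DOCKER_HOST pointing at the dind container); the image name, label, and command below are illustrative:

```groovy
node('docker-slave') {
    checkout scm
    // `inside` starts the container, mounts the workspace into it, and
    // runs the body's steps there. This requires the slave's docker
    // client to reach a working daemon, which is the crux of
    // docker-in-docker setups like the one described above.
    docker.image('maven:3-jdk-8').inside {
        sh 'mvn -B clean verify'
    }
}
```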

Perform a git fetch in pipeline 2.0 groovy script

99封情书 submitted on 2019-12-06 04:45:17
There is an open bug in Jenkins 2.0 pipeline scripts relating to included regions in Git. It means that for a large mono-repo, as in my case, each check-in to master will kick off multiple pipelines, which is not the desired behavior. So to visualize:

top-level:
-> application folder 1
-> application folder 2

What I want to do is a git fetch first, so I can then do a git diff to see whether anything in a particular folder has changed; if it has, run the pipeline for that folder, and do nothing if nothing changed. The code I have is below:

node {
    git credentialsId: 'cred
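A sketch of the fetch-and-diff idea: after checkout, fetch the remote, then ask `git diff --name-only` for changes restricted to one folder; if it prints anything, that folder changed. The folder name and the comparison range are illustrative, not taken from the question:

```groovy
node {
    checkout scm
    sh 'git fetch origin master'
    // List files changed in the last commit under one application folder.
    def changed = sh(
        script: "git diff --name-only HEAD~1 HEAD -- 'application-folder-1/'",
        returnStdout: true
    ).trim()
    if (changed) {
        echo "Changes detected:\n${changed}"
        // ... run this folder's pipeline stages here
    } else {
        echo 'Nothing changed in application-folder-1, skipping.'
    }
}
```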

Jenkins Pipelines: Re-use workspace when loading an external Jenkins pipeline script

左心房为你撑大大i submitted on 2019-12-06 03:57:09
I have the following use case:

- Check out/pull a certain Git revision using a written pipeline script (I need this because I retrieve the revision dynamically).
- From that revision, load a Jenkins pipeline file located among the previously checked-out files.
- This file relies on files from the same checked-out revision (and thus from the same workspace).

Problem: the loaded Jenkins pipeline file gets executed in a new workspace, which is empty. I need that file to be executed in the same old workspace. I thought perhaps it's because of the surrounding node, because the node keyword creates
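A common remedy is to run `checkout` and `load` inside the same `node` block, so the loaded script executes in the workspace that already contains the checked-out files instead of allocating a fresh one. A sketch (the repository URL, `revision` variable, and file name are illustrative):

```groovy
node {
    // Check out the dynamically determined revision...
    checkout([$class: 'GitSCM',
              branches: [[name: revision]],
              userRemoteConfigs: [[url: 'https://example.com/repo.git']]])
    // ...and load the pipeline file from the SAME workspace. Because we
    // are still inside this node block, the loaded code sees these files.
    def pipeline = load 'ci/Jenkinsfile.groovy'
    pipeline.run()
}
```

The loaded file should avoid opening its own `node` block, since a nested `node` may be allocated a different (empty) workspace.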

With Git feature branch workflow, when do you update the master branch?

爱⌒轻易说出口 submitted on 2019-12-06 03:56:33
I'm fairly new to Git and Jenkins. We want to use Jenkins and follow the feature-branch-workflow concept, which I believe is similar to the GitHub flow. I'm aware that the master branch should always be what's currently deployed in production, but then when should the master branch be updated? It seems like there are two choices:

BEFORE deploying to production: a pull request gets approved, and the successful merge with master triggers the build, deployment to a staging environment, and QA testing; then someone pushes a button to deploy to production.

AFTER deploying to production: something

Mocking jenkins pipeline steps

强颜欢笑 submitted on 2019-12-06 02:43:52
I have a class that I use in my Jenkinsfile; a simplified version of it:

class TestBuild {
    def build(jenkins) {
        jenkins.script {
            jenkins.sh(returnStdout: true, script: "echo build")
        }
    }
}

I supply this as a Jenkins parameter when using it in the Jenkinsfile. What would be the best way to mock the jenkins object here, which has script and sh? Thanks for your help.

I had similar problems the other week; I came up with this:

import org.jenkinsci.plugins.workflow.cps.CpsScript

def mockCpsScript() {
    return [
        'sh': { arg ->
            def script
            def returnStdout
            // depending on how sh is called, arg is either a
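Filling in the idea the truncated answer is getting at, a self-contained map-based fake might look like this; the String-or-Map handling of `sh`'s argument is the key part, and all names here are illustrative:

```groovy
// A fake "jenkins" object exposing only the steps TestBuild touches.
def mockJenkins() {
    def executed = []
    return [
        script  : { Closure body -> body() },   // just run the nested block
        sh      : { arg ->
            // sh may receive a plain String or a Map like
            // [returnStdout: true, script: 'echo build']
            def cmd = (arg instanceof Map) ? arg.script : arg
            executed << cmd
            return "stubbed output of: ${cmd}"
        },
        executed: executed
    ]
}

def jenkins = mockJenkins()
new TestBuild().build(jenkins)
assert jenkins.executed == ['echo build']
```

Because the fake is a plain Groovy map, this runs in an ordinary unit test without any Jenkins classes on the classpath.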

Publish Multiple Robot Test Results From Jenkins Pipeline

三世轮回 submitted on 2019-12-06 02:35:39
I have a Jenkins 2.0 pipeline script that runs two separate suites of Robot tests. The script tries to publish both test suite results; however, the publisher overwrites the first publish with the last one.

node('robot') {
    ...
    publishTestResults('journey')
    publishTestResults('regression')
}

void publishTestResults(String type) {
    step([
        $class           : 'hudson.plugins.robot.RobotPublisher',
        outputPath       : 'portfolio-app\\target\\robot-output\\' + type,
        passThreshold    : 100,
        unstableThreshold: 100,
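One workaround is to merge the two suites' results with Robot Framework's own `rebot` tool and publish the combined output once, so there is only a single publish and nothing to overwrite. A sketch reusing the paths from the question:

```groovy
node('robot') {
    // Combine both suites into one output.xml. Note that rebot's exit
    // code is the number of failing tests, so capture the status instead
    // of letting a non-zero exit fail the bat step.
    def rc = bat(returnStatus: true, script:
        'rebot --outputdir portfolio-app\\target\\robot-output\\combined ' +
        '--output output.xml ' +
        'portfolio-app\\target\\robot-output\\journey\\output.xml ' +
        'portfolio-app\\target\\robot-output\\regression\\output.xml')
    publishTestResults('combined')
}
```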

How to pip install in a Docker image with a Jenkins pipeline step?

喜夏-厌秋 submitted on 2019-12-06 02:02:45
I have this Dockerfile:

FROM python:3.7
CMD ["/bin/bash"]

and this Jenkinsfile:

pipeline {
    agent {
        dockerfile {
            filename 'Dockerfile'
        }
    }
    stages {
        stage('Install') {
            steps {
                sh 'pip install --upgrade pip'
            }
        }
    }
}

This causes the following error:

The directory '/.cache/pip/http' or its parent directory is not owned by the current user and the cache has been disabled. Please check the permissions and owner of that directory. If executing pip with sudo, you may want sudo's -H flag. The directory '/.cache/pip' or its parent directory is not owned by the current user and caching wheels has been
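The error happens because Jenkins runs the container as its own non-root user, whose home directory does not exist inside the image, so pip falls back to caching under `/`. Two common fixes are pointing HOME at the workspace and installing with `--user`, or running the container as root via the agent's `args` option. A sketch of the first:

```groovy
pipeline {
    agent {
        dockerfile { filename 'Dockerfile' }
    }
    environment {
        // Give pip a writable home; Jenkins mounts the workspace into
        // the container, so this directory exists and is writable.
        HOME = "${env.WORKSPACE}"
    }
    stages {
        stage('Install') {
            steps {
                sh 'pip install --user --upgrade pip'
            }
        }
    }
}
```

Alternatively, `dockerfile { filename 'Dockerfile'; args '-u root' }` runs the build as root inside the container, at the cost of root-owned files appearing in the workspace.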