I have a GitHub repo that's big and contains several independently buildable bits. If I configure Jenkins with a job (or two) for each of these, I end up having to pull the whole large repo separately for every job, which wastes disk space and bandwidth.
I had the same experience.
I have one job to pull from the real remote repo, which is on GitHub.
Each of the other jobs (there are many) has a "Repository URL" like this:
file:///C:/Program Files (x86)/Jenkins/jobs/webtest-local-repo/workspace/.git
It clones fine, but subsequent fetches don't notice any changes.
The same issue shows up in Git Bash, so I guess this is a Git issue, not a Jenkins issue.
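For reference, this is roughly how the problem reproduces in Git Bash (the target directory name and the master branch are just examples):

    # clone from the intermediate Jenkins workspace repo
    git clone "file:///C:/Program Files (x86)/Jenkins/jobs/webtest-local-repo/workspace/.git" webtest-check
    cd webtest-check
    # later, after new commits have reached the intermediate repo:
    git fetch origin                       # reports nothing new
    git log --oneline -3 origin/master     # still shows the old tip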
My horrific workaround was to make the dependent jobs delete their workspaces when finished building, so that every Git operation is a fresh clone. It's ridiculous, but maybe less ridiculous than having a zillion jobs banging away at the same GitHub repo.
ZOMG! That didn't work either: while Git could successfully clone the repo, Jenkins would remember the previous revision and build the very same one again. Perhaps it's related to this issue; I dunno, I'm pretty fed up. We gave up, and now all the jobs poll GitHub again. Maybe I'll get a hook working instead.
Have a look at the Clone Workspace plugin. You can either use that or configure a job to update a local repository from GitHub and then have all the other jobs pull from that local repo.
This won't help with the fact that each workspace still needs its own disk space, but as far as I know there's no simple solution for that. You could have the build steps work in a shared directory outside the workspace, but that's hacky and might break other things. Alternatively, you could use a filesystem that provides deduplication.
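To sketch the second suggestion above (one job keeps a local mirror current, the other jobs pull from it), the mirror job's build step could look something like this; the mirror path and repository URL are placeholders, and you'd adjust the path style for Windows nodes:

    MIRROR=/var/lib/jenkins/mirrors/bigrepo.git
    # create the bare mirror once, then just refresh it on every build
    if [ ! -d "$MIRROR" ]; then
        git clone --mirror git@github.com:yourorg/bigrepo.git "$MIRROR"
    fi
    git --git-dir="$MIRROR" remote update --prune
    # the other jobs can then use file://$MIRROR as their Repository URL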
If you open the job configuration and click on the Advanced button of the git SCM configuration, you will see a place to specify "Path of the reference repo to use during clone (optional)".
If you have a local clone of your repository, add the path to the reference repo field.
Git will then use the local clone, sharing most of the Git objects on disk and pulling from GitHub only what is missing from the local clone, resulting in lightning-fast clones and saved disk space.
Or is this exactly how you have configured your job, and it is not picking up the latest commits? If that is so, please provide more details. Consider publishing your job configuration.
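Under the hood this is the same trick as Git's own --reference option; a rough command-line equivalent, with placeholder paths and URL, would be:

    # borrow objects from a local clone instead of downloading them again
    git clone --reference "C:/git-cache/bigrepo" https://github.com/yourorg/bigrepo.git workspace
    # only objects missing from C:/git-cache/bigrepo are fetched from GitHub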
In the reference repo field, use $WORKSPACE/../ReferenceRepo/label/$NODE_NAME, where ReferenceRepo is the name of the job you created above. You will need to re-clone each of your jobs on each machine for this to take effect. They should now use the data from the reference repo for everything they can, saving you disk space and bandwidth on your build nodes.
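A quick way to check that a re-cloned job is actually borrowing from the reference repo is to look at the alternates file in its workspace; the node name and paths below are just examples:

    # in the downstream job's workspace, on a node named win-builder-01
    cat .git/objects/info/alternates
    # should print something like:
    # C:/Jenkins/workspace/ReferenceRepo/label/win-builder-01/.git/objects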