Bitbucket Pipelines - How to use the same Docker container for multiple steps?

Submitted by 穿精又带淫゛_ on 2021-01-26 09:28:59

Question


I have set up Continuous Deployment for my web application using the configuration below (bitbucket-pipelines.yml).

pipelines:
  branches:
    master:
        - step:
            name: Deploy to production
            trigger: manual
            deployment: production
            caches:
              - node
            script:
              # Install dependencies
              - yarn install
              - yarn global add gulp-cli

              # Run tests
              - yarn test:unit
              - yarn test:integration

              # Build app
              - yarn run build

              # Deploy to production
              - yarn run deploy

Although this works, I would like to increase the build speed by running the unit and integration test steps in parallel.

What I've tried

pipelines:
  branches:
    master:
        - step:
            name: Install dependencies
            script:
              - yarn install
              - yarn global add gulp-cli

        - parallel:
            - step:
                name: Run unit tests
                script:
                  - yarn test:unit
            - step:
                name: Run integration tests
                script:
                  - yarn test:integration

        - step:
            name: Build app
            script:
              - yarn run build

        - step:
            name: Deploy to production
            trigger: manual
            deployment: production
            script:
              - yarn run deploy

This also has the advantage of seeing the different steps in Bitbucket including the execution time per step.

The problem

This does not work because a clean Docker container is created for each step, so the dependencies installed in the first step are no longer available in the testing steps.

I know that I can share files between steps using artifacts, but that would still require multiple containers to be created which increases the total execution time.

So my question is...

How can I share the same Docker container between multiple steps?


Answer 1:


I had the same issue a while ago and found a way to do it, which I'm using successfully right now.

You can do this using Docker's save and load along with Bitbucket's artifacts. You just need to make sure that your image isn't too large, because Bitbucket's artifact limit is 1GB; you can easily stay under that using multi-stage builds and other tricks.

- step:
    name: Build app
    script:
      - yarn run build
      - docker save --output <backup-file-name>.tar <images-you-want-to-export>
    artifacts:
      - <backup-file-name>.tar

- step:
    name: Deploy to production
    trigger: manual
    deployment: production
    script:
      - docker load --input <backup-file-name>.tar
      - yarn run deploy

You might also like to use Bitbucket's caches, which can make building Docker images much faster. For example, you can arrange for NPM packages to be reinstalled only when the package.json and yarn.lock files change.
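As a sketch of that idea, a custom cache can be keyed on the dependency manifests in bitbucket-pipelines.yml (the cache name here is illustrative; file-keyed caches are defined under `definitions`):

```yaml
definitions:
  caches:
    # Custom cache: invalidated only when package.json or yarn.lock change
    yarn-deps:
      key:
        files:
          - package.json
          - yarn.lock
      path: node_modules

pipelines:
  branches:
    master:
      - step:
          name: Install dependencies
          caches:
            - yarn-deps   # restored before the script runs, saved afterwards
          script:
            - yarn install
```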

Further Reading

  • docker save (Docker 17): https://devdocs.io/docker~17/engine/reference/commandline/save/index
  • docker load (Docker 17): https://devdocs.io/docker~17/engine/reference/commandline/load/index
  • BitBucket Artifacts: https://confluence.atlassian.com/bitbucket/using-artifacts-in-steps-935389074.html
  • BitBucket Pipelines Caches: https://confluence.atlassian.com/bitbucket/caching-dependencies-895552876.html



Answer 2:


Each step runs in its own Docker container, and its own volume. So you cannot have two steps running on the same build container.

Diving deeper into your problem

Are you trying to optimize for build minute consumption, or how long it takes for your build to complete?

If you're optimizing for build minutes, stick with what you have now, since the overhead of using multiple steps and artifacts will add some build minutes; you lose out on the flexibility these features provide, though. Additionally, try to use a small Docker image for your build environment, as it will be pulled faster.

If you're optimizing for pipeline completion time, I'd recommend going with your idea of using artifacts and parallel steps. While the total execution time is expected to be higher, you will wait less time to see the result of your pipeline.
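To make that concrete, here is a sketch of the parallel variant, based on the pipeline from the question, where node_modules is carried forward as an artifact:

```yaml
- step:
    name: Install dependencies
    script:
      - yarn install
    artifacts:
      - node_modules/**   # made available to every later step
- parallel:
    - step:
        name: Run unit tests
        script:
          - yarn test:unit
    - step:
        name: Run integration tests
        script:
          - yarn test:integration
```

Note that anything installed with `yarn global add` (such as gulp-cli in the question) lives outside node_modules and would not be carried over by this artifact; it would need to be reinstalled in the steps that use it, or baked into the build image.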




Answer 3:


A possible solution, which I recommend:

  - step:
        name: Install dependencies
        script:
          - yarn install
          - yarn global add gulp-cli

Your first step above should be baked into a pre-built Docker image, which you host on Docker Hub and use as the build environment via image: username/deployment-docker:latest.

Both test steps can then run in this image, with the dependencies already installed.
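A sketch of what the pipeline might look like with such an image; the image name is the illustrative one from above, and it is assumed to have yarn, gulp-cli, and the other build tools preinstalled:

```yaml
# Assumes username/deployment-docker:latest is a pre-built image
# with gulp-cli and other build tooling already installed
image: username/deployment-docker:latest

pipelines:
  branches:
    master:
      - parallel:
          - step:
              name: Run unit tests
              script:
                - yarn install   # project dependencies still installed per step
                - yarn test:unit
          - step:
              name: Run integration tests
              script:
                - yarn install
                - yarn test:integration
```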



Source: https://stackoverflow.com/questions/51262595/bitbucket-pipelines-how-to-use-the-same-docker-container-for-multiple-steps
