Using Bitbucket Pipelines to Deploy onto a VPS via SSH Access

Submitted by 情到浓时终转凉″ on 2020-08-20 18:07:34

Question


I have been trying to wrap my head around how to use Bitbucket Pipelines to auto-deploy my (Laravel) application onto a Vultr server instance.

I have the following steps I do manually, which I am trying to replicate autonomously:

  • I commit my changes and push to my Bitbucket repo
  • I log into my server using Terminal: ssh root@ipaddress
  • I cd to the correct directory: cd /var/www/html/app/
  • I then pull from my Bitbucket repo: git pull origin master
  • I then run some commands: composer install, php artisan migrate, etc.
  • I then log out: exit

My understanding is that you can use Pipelines to automate this. Is that true?

So far, I have set up an SSH key pair for Pipelines and my server, so my server's authorized_keys file contains the public key from Bitbucket Pipelines.
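In other words, the server's authorized_keys file ends up with a single line like this sketch (key body truncated here; the trailing comment is arbitrary):

# /root/.ssh/authorized_keys (path assuming you log in as root)
ssh-rsa AAAAB3NzaC1yc2EAAAADAQAB...truncated... pipelines@bitbucket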

My pipelines file bitbucket-pipelines.yml is as follows:

image: atlassian/default-image:latest

pipelines:
  default:
    - step:
        deployment: staging
        caches:
          - composer
        script:
          - ssh root@ipaddress
          - cd /var/www/html/app/
          - git pull origin master
          - php artisan down
          - composer install --no-dev --prefer-dist
          - php artisan cache:clear
          - php artisan config:cache
          - php artisan route:cache
          - php artisan migrate
          - php artisan up
          - echo 'Deploy finished.'

When the pipeline executes, I get the error: bash: cd: /var/www/html/app/: No such file or directory.

I read that each script step is run in its own container:

Each step in your pipeline will start a separate Docker container to run the commands configured in the script

The error I get makes sense if the cd /var/www/html/app/ is not actually being executed on the VPS after logging into it over SSH.

Could someone point me in the right direction?

Thanks


Answer 1:


The commands you are defining under script are run inside a Docker container, not on your VPS.
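If it were only one or two commands, you could pass them to ssh as an argument so they run on the VPS rather than in the container; a minimal sketch using the host placeholder and path from the question:

ssh root@ipaddress 'cd /var/www/html/app && git pull origin master'

For a whole deploy, though, that gets unwieldy.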

Instead, put all your commands in a bash file on your server.

1 - Create a bash file pull.sh on your VPS, to do all your deployment tasks

#!/bin/sh
# /var/www/html/pull.sh (lives on the VPS)
php artisan down                          # put the app into maintenance mode
git pull origin master                    # fetch the latest code
composer install --no-dev --prefer-dist  # install production dependencies
php artisan cache:clear
php artisan config:cache
php artisan route:cache
php artisan migrate                       # run any pending database migrations
php artisan up                            # bring the app back online
echo 'Deploy finished.'

2 - Create a script deploy.sh in your repository, like so

echo "Deploy script started"
cd /var/www/html
sh pull.sh
echo "Deploy script finished execution"

3 - Finally update your bitbucket-pipelines.yml file

image: atlassian/default-image:latest

pipelines:
  default:
    - step:
        deployment: staging
        script:
          - cat ./deploy.sh | ssh <user>@<host>
          - echo "Deploy step finished"

Piping deploy.sh into ssh means its contents are read from the pipeline's checkout but executed by the shell on the VPS. I would recommend already having your repo cloned on your VPS in /var/www/html, and testing your pull.sh file manually first.
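That manual test is essentially (same placeholders as above):

ssh <user>@<host> 'cd /var/www/html && sh pull.sh'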




Answer 2:


The problem with the answer marked as the solution is that the sh process won't exit if any of the commands inside it fails.

The command php artisan route:cache, for instance, can easily fail, not to mention the pull!

Even worse, the sh script will keep executing the remaining commands even if one of them fails.

I can't use any Docker commands, because the CI process stops after each one and I can't figure out how to keep those commands from exiting the CI process. I'm sticking with the sh script, but I'll start adding conditionals based on the exit code of the previous command, so we know if anything went wrong during the deploy.
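A minimal sketch of what such an exit-code conditional could look like in pull.sh (my own illustration, not part of the accepted answer; set -e, commented out below, is the blunter alternative):

#!/bin/sh
# set -e              # alternative: abort the whole script on the first failing command

git pull origin master
if [ $? -ne 0 ]; then
    echo 'git pull failed; aborting deploy'
    exit 1
fi
php artisan migrate
if [ $? -ne 0 ]; then
    echo 'migration failed; aborting deploy'
    exit 1
fi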



Source: https://stackoverflow.com/questions/50053687/using-bitbucket-pipelines-to-deploy-onto-vps-via-ssh-access
