Using Robocopy to deploy sites


What we use is the following:

  • first build the website with msbuild in cruisecontrol.net to build the binaries
  • archive the currently deployed files under a timestamped folder to avoid losing data in case of a problem

    C:\DevTools\Robocopy\robocopy.exe /R:1 /W:10 /mir "D:\WebSite\Files" "D:\Webarchive\ArchivedFiles\Documents.%date:~0,-8%.%date:~3,-5%.%date:~6%.%time:~0,-9%.%time:~3,-6%.%time:~6,-3%" /XF *.scc

  • stop the website

  • deploy the website by copying everything except the files we archived (/XD is eXclude Directory)

    C:\DevTools\Robocopy\robocopy.exe /R:1 /W:10 /mir "c:\dev\site" "D:\WebSite" /XF *.scc /XD "D:\WebSite\Files"

  • copy and rename (with xcopy, this time) a release.config with the correct settings to D:\WebSite\web.config (in fact, that's what we used to do; now we have a homebrew transformation engine that changes parts of the dev web.config on the fly).

  • restart the website
  • (optional) delete the archive you made at step two

In your case, you'll have to add /XD flags for any directory you want to ignore, such as the users' uploads folder. And unless the production web.config file is complicated, I'd really recommend simply copying a release.config that you maintain as part of the project, side by side with the web.config.
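Put together, the sequence above can live in a single batch script. A minimal sketch, assuming the site runs under IIS 7 or later (so appcmd is available) and using "MySite" and the paths above purely as placeholders:

    @echo off
    rem Placeholders only: site name, paths and the robocopy location come from the steps above.
    set ROBO=C:\DevTools\Robocopy\robocopy.exe
    set STAMP=%date:~0,-8%.%date:~3,-5%.%date:~6%.%time:~0,-9%.%time:~3,-6%.%time:~6,-3%

    rem 1. Archive the currently deployed user files under a timestamped folder
    %ROBO% /R:1 /W:10 /MIR "D:\WebSite\Files" "D:\Webarchive\ArchivedFiles\Documents.%STAMP%" /XF *.scc

    rem 2. Stop the website (appcmd ships with IIS 7 and later)
    %windir%\system32\inetsrv\appcmd stop site /site.name:"MySite"

    rem 3. Mirror the new build over the site, leaving the archived Files folder alone
    %ROBO% /R:1 /W:10 /MIR "c:\dev\site" "D:\WebSite" /XF *.scc /XD "D:\WebSite\Files"

    rem 4. Drop the environment-specific config in place (plain copy avoids xcopy's file/directory prompt)
    copy /Y "c:\dev\site\release.config" "D:\WebSite\web.config"

    rem 5. Restart the website
    %windir%\system32\inetsrv\appcmd start site /site.name:"MySite"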

Is Robocopy a hard requirement? Why not use MSBuild? Everything you have listed can painlessly be done in MSBuild.

<!-- Attempt to build new code -->
<MSBuild Projects="$(BuildRootPath)\ThePhotoProject.sln" Properties="Configuration=$(Environment);WebProjectOutputDir=$(OutputFolder);OutDir=$(WebProjectOutputDir)\" />

<!-- Get temp file references -->
<PropertyGroup>
  <TempConfigFile>$([System.IO.Path]::GetTempFileName())</TempConfigFile>
  <TempEnvironmentFile>$([System.IO.Path]::GetTempFileName())</TempEnvironmentFile>
</PropertyGroup>

<!-- Copy current web configs to temp files -->
<Copy SourceFiles="$(OutputFolder)\web.config" DestinationFiles="$(TempConfigFile)"></Copy>
<Copy SourceFiles="$(OutputFolder)\web.$(Environment).config" DestinationFiles="$(TempEnvironmentFile)"></Copy>
<ItemGroup>
  <DeleteConfigs Include="$(OutputFolder)\*.config" />
</ItemGroup>

<Delete Files="@(DeleteConfigs)" />

...

<!-- Copy app_offline file -->
<Copy SourceFiles="$(CCNetWorkingDirectory)\Builder\app_offline.htm"  DestinationFiles="$(DeployPath)\app_offline.htm"  Condition="Exists('$(CCNetWorkingDirectory)\Builder\app_offline.htm')"  />

<ItemGroup>
  <DeleteExisting Include="$(DeployPath)\**\*.*" Exclude="$(DeployPath)\app_offline.htm" />      
</ItemGroup>

<!-- Delete Existing files from site -->
<Delete Files="@(DeleteExisting)"  />
<ItemGroup>
  <DeployFiles Include="$(OutputFolder)\**\*.*" />
</ItemGroup>

<!-- Deploy new files to deployment folder. -->
<Copy SourceFiles="@(DeployFiles)"  DestinationFiles="@(DeployFiles->'$(DeployPath)\%(RecursiveDir)%(Filename)%(Extension)')"  />

<!-- Delete app_offline file -->
<Delete Files="$(DeployPath)\app_offline.htm" Condition="Exists('$(DeployPath)\app_offline.htm')"  />
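For reference, a script like the one above would typically be kicked off by the CruiseControl.NET msbuild task or run by hand; the project file name, target name and property values here are only illustrative:

    msbuild Deploy.proj /t:Deploy /p:Environment=Release /p:OutputFolder=D:\Builds\Site /p:DeployPath=D:\WebSite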

On *nix-based servers I would use rsync, and I understand that on Windows you can use DeltaCopy, which is a port of rsync and is open source (I've never used DeltaCopy, so please check it carefully). Anyway, assuming it works like rsync, it is fast and only updates files that have changed.

You can use various configuration options to delete files on the target that have been deleted on the source, and you can also add an exclude file listing the files or directories (the local config, for example) that you do not want copied.
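As a concrete point of reference, an rsync invocation covering those points (delete removed files on the target, exclude the local config and the uploads directory) would look roughly like this; host, paths and exclude patterns are placeholders, and if DeltaCopy behaves like rsync its options should map onto these:

    rsync -avz --delete --exclude 'web.config' --exclude 'Files/' /srv/staging/site/ deploy@webserver:/var/www/site/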

You should be able to fold it all into one script to run when required, which means you can test and time it so you know what is happening.

Check out these links to see if they help:

You'll find that robocopy.exe /? is extremely helpful. In particular you'll want the /XF switch for excluding files, and /XD for excluding folders.

You will need to write a script (e.g. bat, powershell, cscript) to take care of the web.config issues though.
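One thing such a script should do is translate robocopy's exit codes, which don't follow the usual zero-for-success convention: values 0-7 are informational (files copied, extra files found, and so on) and 8 or higher means at least one failure. A minimal batch sketch with placeholder paths:

    robocopy "c:\dev\site" "D:\WebSite" /MIR /R:1 /W:10 /XF *.scc /XD "D:\WebSite\Files"
    if errorlevel 8 (
        echo Robocopy reported a failure, aborting deployment.
        exit /b 1
    )
    rem Exit codes below 8 count as success; now handle web.config separately
    copy /Y "c:\dev\site\release.config" "D:\WebSite\web.config"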

Microsoft themselves use robocopy to deploy updates to some sites.

I don't know if you have multiple servers, but our deployment script went something like: 1) stop IIS (which would take the server out of load-balancer rotation), 2) RoboCopy /MIR from \\STAGING\path\to\webroot to \\WEB##\path\to\webroot, where ## is the number of the server, 3) start IIS. This was done after the site was smoke-tested on the staging server.

That doesn't much help with your config problem, but our staging and production config files were the same.
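A rough sketch of that sequence as a batch loop, assuming two web servers; the server names, share paths and the use of iisreset against a remote machine are placeholders to adapt:

    for %%N in (01 02) do (
        rem Stopping IIS takes the box out of load-balancer rotation
        iisreset WEB%%N /stop
        robocopy "\\STAGING\webroot" "\\WEB%%N\webroot" /MIR /R:1 /W:10
        if errorlevel 8 (
            echo Deployment to WEB%%N failed
            exit /b 1
        )
        iisreset WEB%%N /start
    )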

What you need (and I need) is a synchronization program with the ability to create a backup of the files on the server and to copy files quickly over FTP, either by copying them to a temporary directory first or by doing partial updates.

This is one program that I found : http://www.superflexible.com/ftp.htm

WebDeploy is a much better way to handle deploys (see Scott Hanselman's post: http://www.hanselman.com/blog/WebDeploymentMadeAwesomeIfYoureUsingXCopyYoureDoingItWrong.aspx).

But Robocopy is a great low-cost deploy tool that I still use on some sites (I haven't found the time to change them to WebDeploy). Robocopy is like xcopy but with a much richer set of options. So you would need two Robocopy commands (one for backup and one for deploy). I normally run the backup command when the files are staged.

Managing config files is always tricky (and a big reason to use WebDeploy). One approach is to keep a copy of the config files for each environment checked into your source control (e.g. web.dev.config, web.uat.config, web.prod.config, etc.). The staging (or deploy) script would grab and rename the necessary config file.
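A sketch of that pattern, with the environment name passed in as an argument so the script grabs the matching config; the folder layout here is a placeholder:

    @echo off
    rem Usage: deploy.bat prod   (expects web.dev.config, web.uat.config, web.prod.config in the staged build)
    set ENV=%1

    rem Backup command, run while the new build still sits in staging
    robocopy "D:\WebSite" "D:\Backup\WebSite" /MIR /R:1 /W:10

    rem Deploy command
    robocopy "D:\Staging\WebSite" "D:\WebSite" /MIR /R:1 /W:10
    if errorlevel 8 exit /b 1

    rem Grab and rename the config for this environment
    copy /Y "D:\WebSite\web.%ENV%.config" "D:\WebSite\web.config"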

You would probably need to use a combination of tools.

I would have a look at DFSR (File Server role) with a read-only folder on your live site (so it's one-way replication).

It is very easy to configure, has a nice GUI, can exclude files based on location and/or masks, and with Volume Shadow Copy enabled you can have it run on a schedule you set, updating only the files that have changed (or run it manually). The beauty of this is that once it is configured, you don't have to touch it again.

Once you have the bulk of your files replicating, you could then look at automating the merge of web.config, assuming you want that automated.

Doug H

MSBuild is great, except for one minor (or major, depending on your point of view) flaw: it rebuilds the binaries every time you run a build. This means that for deploying from TEST to PRODUCTION, or STAGE to PRODUCTION (or whatever your pre-production environment is called), if you use MSBuild you are not promoting existing binaries from one environment to the next, you are rebuilding them. It also means you are relying on the assumption that NOTHING has changed in the source code repository since you did an MSBuild to your pre-production environment. Allowing even the slightest chance of a change to anything, major or minor, means you will not be promoting a fully tested product into your production environment. In the places I work, that is not an acceptable risk.

Enter Robocopy. With Robocopy, you are copying a (hopefully) fully tested product to your production environment. You would then either need to manually modify your web.config/app.config to reflect the production environment, OR use a transformation tool to do that. I have been using the "Configuration Transformation Tool" available on SourceForge for that purpose - it works just like the MSBuild web/app.config transformations.
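For completeness, a promotion step along those lines might look like the sketch below; the paths are placeholders, and the ctt.exe argument names are quoted from memory, so treat them as an assumption and check the tool's documentation:

    rem Promote the binaries that were already tested on STAGE - no rebuild
    robocopy "\\STAGE\webroot" "\\PROD\webroot" /MIR /R:1 /W:10 /XF web.config

    rem Apply the production transform (argument names are an assumption - see the ctt docs)
    ctt.exe source:"\\STAGE\webroot\web.config" transform:"\\STAGE\webroot\web.prod.config" destination:"\\PROD\webroot\web.config"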
