I have a collection of Mercurial repositories on a network share. To enable offline work, I want a local copy of this collection on my laptop, and an easy way to synchronize the two when I'm online.
For this, I wrote a quick script that automatically synchronizes each local repository with the corresponding remote repository (push and pull), but it's missing a couple of desirable features:
- automatic cloning of new repositories from the local to the remote collection (and vice versa)
- the ability to organize (move/rename) a local repository and have the change being applied on the remote side as well, the next time I synchronize
- the ability to synchronize hg strip and other commands that rewrite repository history
- the ability to synchronize against a hgwebdir collection or even Bitbucket
Are there any existing solutions that provide some (or all) of these features?
To my knowledge nothing like this exists. The safest way to move changesets back and forth between repositories is always hg push
and hg pull
and neither of those commands operates on more than one source or destination repository.
For backup purposes I've done something like this before:
for thedir in $(find . -type d -name .hg) ; do
    repopath=$(dirname "$thedir")
    hg -R "$repopath" push ssh://mybackupserver//path/to/backups/$(basename "$repopath")
done
which pushes all local repos to off-site backups. In theory you could do both push and pull, and if necessary an init/clone, but you'll start running into edge cases pretty quickly.
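As a rough illustration of that push-and-pull idea, here's a minimal sketch. The LOCAL and REMOTE paths are hypothetical placeholders, and it deliberately ignores the exit status of each hg command (hg exits nonzero when there are no changes to transfer), so treat it as a starting point rather than a robust sync tool:

```shell
#!/bin/sh
# Sketch: two-way sync of a local collection of Mercurial repositories
# against a remote collection. LOCAL and REMOTE are assumed paths --
# adjust them for your own setup.
LOCAL="${LOCAL:-$HOME/repos}"
REMOTE="${REMOTE:-ssh://mybackupserver//path/to/backups}"

sync_all() {
    for thedir in $(find "$LOCAL" -type d -name .hg 2>/dev/null); do
        repopath=$(dirname "$thedir")
        reponame=$(basename "$repopath")
        # Pull first, then push. hg returns a nonzero status when there
        # is nothing to transfer, so don't let that abort the loop.
        hg -R "$repopath" pull "$REMOTE/$reponame" || true
        hg -R "$repopath" push "$REMOTE/$reponame" || true
    done
}

sync_all
```

This still doesn't handle cloning repositories that exist on only one side, or detecting moves/renames; those are exactly the edge cases where a naive loop like this breaks down.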
Source: https://stackoverflow.com/questions/11378106/synchronizing-a-collection-of-mercurial-repositories