We have several Python 2.6 applications running on Linux. Some of them are Pylons web applications, others are simply long-running processes that we run from the command line.
I would use rsync to synchronize outwards from your production "prime" server to the others, and from your "beta test" platform to that prime server.
rsync has the benefit of copying only the files that changed, transferring only the changed portions of partially modified files, and verifying at the end that the content is identical on all machines. An update that is interrupted part way through can easily be resumed later, which makes your deployment more robust.
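Since your tooling is already Python, here is a minimal Python 2.6-compatible sketch of what that rsync step could look like, driven from the prime server. The host names, directory, and exact rsync flags are placeholders you would adapt, and it assumes rsync and SSH keys are already set up:

    # Minimal sketch, assuming rsync and passwordless SSH are configured.
    # Host names and the application directory are placeholders.
    import subprocess

    APP_DIR = "/srv/myapp/"                              # hypothetical app directory
    TARGETS = ["app2.example.com", "app3.example.com"]   # hypothetical production hosts

    for host in TARGETS:
        # -a preserves permissions and timestamps, -z compresses in transit,
        # --delete removes files on the target that no longer exist on the source.
        subprocess.check_call(
            ["rsync", "-az", "--delete", APP_DIR, "%s:%s" % (host, APP_DIR)]
        )

Adding -n (--dry-run) to the rsync arguments the first time lets you preview what would change before touching the production hosts.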
Subversion or Mercurial would not be a bad idea in this case either. Mercurial has the advantage of letting you "pull" or "push" between repositories instead of only updating from one central source, and you may find cases where that decentralized model works better; a rough sketch of the push-based flow follows.
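As a rough illustration of the push-based Mercurial flow, again driven from the prime server in Python: it assumes each node already has a clone of the repository, and the host names and repository path are placeholders.

    # Minimal sketch, assuming each target host already has a clone at REPO
    # and SSH access is configured; names below are placeholders.
    import subprocess

    REPO = "/srv/myapp"                                  # hypothetical repository path
    TARGETS = ["app2.example.com", "app3.example.com"]   # hypothetical production hosts

    for host in TARGETS:
        # Push tested changesets to each node; the double slash that results
        # from joining host and REPO makes the path absolute in an hg ssh:// URL.
        # "hg push" exits non-zero when there is nothing new to push, so plain
        # call() is used here instead of check_call().
        subprocess.call(["hg", "push", "-R", REPO, "ssh://%s/%s" % (host, REPO)])
        # ... then update the node's working copy to the new tip.
        subprocess.check_call(["ssh", host, "hg", "update", "-R", REPO])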