Abuse of version control

Submitted by 喜夏-厌秋 on 2019-12-05 17:20:54

Your issue is a release-management one, which includes:

  • building: how, and how fast, are you able to regenerate some or all of the delivery content?
  • packaging: how many files are present in that delivery?
    If your content includes too many files, it will simply not be easy to deploy (i.e. copy or rsync) to any remote environment, not so much because of the total size, but because of the number of transactions needed.
  • publishing: where do you store your delivery, and how do you link it back to the initial development environment that produced it?

I would argue that such a massive delivery is not meant to be published in a VCS, but rather stored in a filesystem-based repository, with a proper name (or a version.txt) so you can identify its version and link it back to the development content (stored and tagged in Subversion).
Maven is an example of such a repository.

I would also point out that content made to be delivered should include a limited number of files, which means you should:

  • compress lots of related files together into one archive
  • run a script which does not just rsync the archive, but also uncompresses those files on the target
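The compress-then-deploy steps above can be sketched as a small shell script. The paths, version number, and target host here are hypothetical placeholders, not values from the original question:

```shell
#!/bin/sh
set -e

# Hypothetical values -- adjust to your environment.
BUILD_DIR=build/output          # directory holding the many delivery files
VERSION=1.4.2                   # release identifier, matching a Subversion tag
ARCHIVE="delivery-${VERSION}.tar.gz"

mkdir -p "${BUILD_DIR}"         # ensure the directory exists for this sketch

# 1. Record the version inside the delivery so the archive can be traced
#    back to the tagged development content in Subversion.
echo "${VERSION}" > "${BUILD_DIR}/version.txt"

# 2. Compress the many related files into one archive: one file to
#    transfer means one transaction instead of thousands.
tar -czf "${ARCHIVE}" -C "${BUILD_DIR}" .

# 3. Copy the single archive to the target and unpack it there
#    (commented out here because the host is hypothetical).
# rsync -av "${ARCHIVE}" deploy@target:/opt/releases/
# ssh deploy@target "tar -xzf /opt/releases/${ARCHIVE} -C /opt/app"
```

The point of step 2 is that rsync and scp pay a per-file cost, so one big archive deploys much faster than thousands of small files of the same total size.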

Subversion only sends the differences over the line when doing updates, not the entire files. However, the initial checkout does require a download of all the files, which basically means downloading 10GB. Also, binary files are a nightmare to merge, so as long as you work in a master/slave environment where only one person can commit and the others only update the files, this will work very well. Otherwise you're likely to end up with conflict after conflict.

Is it not possible to split the 10GB over multiple repositories? Do they really need to be versioned as a whole?
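If you do split the content, Subversion itself can stitch the pieces back together with the svn:externals property: each asset category lives in its own repository, and a parent working copy pulls them all in on update. A sketch, with hypothetical repository URLs:

```shell
# Set the externals property on the parent working-copy directory
# (LOCALPATH URL format; the URLs below are made up for illustration).
svn propset svn:externals 'textures https://svn.example.com/assets-textures/trunk
audio    https://svn.example.com/assets-audio/trunk
models   https://svn.example.com/assets-models/trunk' .

svn update   # checks out each external into its own subdirectory
```

This keeps each repository's checkout small while still letting everyone fetch the whole tree with a single command; the trade-off is that the pieces are no longer versioned atomically as one whole.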

The short answer is yes.

We used Subversion for a relatively large (40GB checkout) game development project. I will say it handled binaries surprisingly well. The downside is that, for now, you will only get text information about changes, e.g. "Changed texture to fit updated main character model." But even this little bit of information can save you when you're looking for performance issues, and it makes it simple to verify that everyone is using the same binary files for development. Patching, as far as I know, would require the full file.

You might want to look at a dedicated asset-management system, instead of trying to violently bend a source versioning system to your needs. The only one I've heard of (but have no experience nor affiliation with) is http://www.alienbrain.com/ - and it co$t$.
