Abuse of version control


Question


Is version control suited to a project whose content is essentially binary data files? I am thinking of a package that weighs something like 10 GB, with a lot of BMP and TGA files.

Can Subversion handle something like this? Is it possible to generate some kind of binary patch that would let users download only what was modified? Rsync could be an option, but then there is no going back, and I would really like to be able to return to an earlier version easily.

I looked at this question too, but was not satisfied with the answer


Answer 1:


Your issue is a release-management one, which includes:

  • building: how, and how fast, are you able to regenerate some or all of the delivered content?
  • packaging: how many files are present in that delivery?
    If your content includes too many files, it will simply not be easy to deploy (i.e. copy or rsync) to any remote environment, not so much because of the total size, but because of the number of transfers needed.
  • publishing: where do you store your delivery, and how do you link it back to the initial development environment that produced it?

I would argue that such a massive delivery is not meant to be published in a VCS, but rather stored in a filesystem-based repository, with a proper name (or a version.txt) so you can identify its version and link it back to the development content (stored and tagged in Subversion).
Maven is an example of such a repository.

I would also point out that content meant to be delivered should include a limited number of files, which means you should:

  • compress lots of related files together into one archive
  • run a deployment script which does not just rsync, but also uncompresses those files (see the sketch after this list)
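
A minimal sketch of that package-and-publish step, assuming a hypothetical build_output/ directory, a releases/ filesystem repository, and a version string taken from your Subversion tag (all three names are illustrative, not from the answer):

```python
import tarfile
from pathlib import Path

# Hypothetical layout, not from the answer: adjust to your own build/publish setup.
BUILD_DIR = Path("build_output")   # the many BMP/TGA files produced by the build
REPO_DIR = Path("releases")        # the filesystem-based publish repository
VERSION = "1.4.2"                  # e.g. the tag set in Subversion

def package_release(version: str) -> Path:
    """Bundle the whole delivery into one archive so deployment is a single
    large transfer instead of thousands of small ones."""
    dest = REPO_DIR / version
    dest.mkdir(parents=True, exist_ok=True)

    archive = dest / f"delivery-{version}.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(str(BUILD_DIR), arcname="delivery")

    # version.txt links the published artifact back to the tagged sources.
    (dest / "version.txt").write_text(version + "\n")
    return archive

if __name__ == "__main__":
    print(f"published {package_release(VERSION)}")
```

On the receiving side, the deploy script would then rsync just the archive plus version.txt and untar it in place, keeping the number of transfers small.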



Answer 2:


Subversion uses xdelta for binary files.

http://subversion.tigris.org/faq.html#binary-files

BTW, related question: How good is Subversion at storing lots of binary files?
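
For readers who want to see what such a binary delta looks like outside Subversion, here is a small sketch driving the standalone xdelta3 command-line tool from Python. The file names are placeholders, and this only mirrors the idea; it is not how Subversion invokes xdelta internally:

```python
import subprocess

# Placeholder file names; any two versions of a large binary asset will do.
OLD, NEW = "texture_v1.tga", "texture_v2.tga"
PATCH, REBUILT = "texture.vcdiff", "texture_rebuilt.tga"

# Encode: produce a small patch holding only the differences between versions.
subprocess.run(["xdelta3", "-e", "-s", OLD, NEW, PATCH], check=True)

# Decode: users who already have OLD download only PATCH and rebuild NEW locally.
subprocess.run(["xdelta3", "-d", "-s", OLD, PATCH, REBUILT], check=True)
```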




Answer 3:


Subversion only sends the differences over the wire when doing updates, not the entire files. However, the initial checkout DOES require downloading all the files, which here basically means downloading 10 GB. Also, binary files are a nightmare to merge, so this will work very well as long as you use a master/slave setup in which only one person commits and the others only update; otherwise you are likely to end up with conflict after conflict.
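
Since binary files cannot be merged, Subversion's locking support is the usual way to enforce that one-writer workflow. A minimal sketch, assuming the svn client is on PATH and using a placeholder asset path:

```python
import subprocess

ASSET = "textures/main_character.tga"  # placeholder path to a binary asset

def svn(*args):
    subprocess.run(["svn", *args], check=True)

# Mark the asset so working copies keep it read-only until someone locks it.
svn("propset", "svn:needs-lock", "*", ASSET)
svn("commit", "-m", "Require a lock before editing binary asset", ASSET)

# An artist takes the exclusive lock, edits the file, then commits
# (committing releases the lock by default).
svn("lock", "-m", "Editing main character texture", ASSET)
svn("commit", "-m", "Changed texture to fit updated model", ASSET)
```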

Is it not possible to split the 10 GB over multiple repositories? Do they really need to be versioned as a whole?




Answer 4:


The short answer is yes.

We used Subversion for a relatively large (40 GB checkout) game development project, and I will say it handled binaries surprisingly well. The downside is that, for now, you only get textual information about changes, e.g.: "Changed texture to fit updated main character model." But even that little bit of information can save you when you are hunting performance issues, and it ensures that everyone is using the same binary files for development. Patching, as far as I know, would require the full file.




Answer 5:


You might want to look at a dedicated asset management system instead of trying to violently bend a source versioning system to your needs. The only one I have heard of (though I have no experience with it, nor any affiliation) is http://www.alienbrain.com/ - and it co$t$.



Source: https://stackoverflow.com/questions/704897/abuse-of-version-control
