Git lfs - “this exceeds GitHub's file size limit of 100.00 MB”

野趣味 2020-11-29 07:16

I have some CSV files that are larger than GitHub's file size limit of 100.00 MB. I have been trying to use the Git Large File Storage extension.

https://git-lfs.gi

8 answers
  •  一整个雨季
    2020-11-29 07:38

    I hit the same problem yesterday and cracked it. I was unable to push, and it appeared that none of my big files were in LFS.

    There is probably a better way, but this worked for me. I have a large repo with 2.5 GB of data.

    I set up a new repo and then set up LFS in it with git lfs init.

    I then configured my various file types with git lfs track "*.pdb" and git lfs track "*.dll", committed my changes, and pushed.
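    A minimal command sketch of that setup, assuming a reasonably current Git LFS client (newer versions use git lfs install where older ones used git lfs init) and with the same example patterns as above:

        git lfs install                    # one-time setup per repo (older clients: git lfs init)
        git lfs track "*.pdb"              # writes the pattern to .gitattributes
        git lfs track "*.dll"
        git add .gitattributes
        git commit -m "Track binaries with Git LFS"
        git push origin main               # branch name is just an example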

    I then added my big files. I used Sourcetree, and in its output it stated, for the big files matching my wildcards, that it was committing a tiny text file (the LFS pointer) instead. (Sorry, I didn't record these messages, but it should be obvious.)

    Then I pushed, and I saw 'skipping files', and the push succeeded quickly.
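    If you want to confirm that the big files really went through LFS rather than into regular Git objects, these checks work on any repo (the file path below is only a placeholder):

        git lfs ls-files                   # lists files currently stored as LFS objects
        git lfs status                     # shows LFS files staged for the next commit
        git show HEAD:Windows/bin/myBigFile.dll   # an LFS-tracked file prints a small pointer text, not binary data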

    So the problem is probably trying to add files to LFS that are already in your history. Tracking only applies to files added after the pattern is in place, so you can only add new files this way. You can probably clean your repo of these files.
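    One way to clean files that are already in history is git lfs migrate, which rewrites commits so matching files become LFS pointers. A sketch, assuming you are able to force-push rewritten history:

        git lfs migrate import --include="*.pdb,*.dll" --everything   # rewrite all refs so matching files become LFS objects
        git push --force-with-lease origin main                       # rewritten history must be force-pushed; branch name is an example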

    Note: I did find that quite a few files that matched my wildcards were not picked up by LFS. Similar files in different folders were picked up, but not all of them. I tried explicitly adding these files using the full path, e.g. git lfs track "Windows/bin/myBigFile.dll", but that didn't help either. In the end I gave up due to time constraints.
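    When a file that seems to match a pattern is not picked up, it can help to check what Git thinks applies to it; git lfs track with no arguments and git check-attr show the effective patterns. A sketch using the path from the note above; keep in mind that a file already committed as a normal blob is only converted once it is re-staged:

        git lfs track                                                  # with no arguments, lists the patterns currently tracked
        git check-attr filter diff merge -- Windows/bin/myBigFile.dll  # should report "filter: lfs" if the pattern matches
        git rm --cached Windows/bin/myBigFile.dll                      # re-stage an already-committed file so the LFS filter applies
        git add Windows/bin/myBigFile.dll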

    You should also check your storage limit with GitHub. I purchased the extra 50 GB data pack to cover my requirements.

    Cloning the repo now downloads the files separately and everything is finally working well.
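    Clone behaviour can also be tuned: setting GIT_LFS_SKIP_SMUDGE skips the large downloads during clone, and git lfs pull fetches them later. A sketch with a placeholder URL:

        GIT_LFS_SKIP_SMUDGE=1 git clone https://github.com/example/repo.git   # clone without downloading LFS content
        cd repo
        git lfs pull                                                           # fetch the LFS objects when needed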
