git-gc

After a git reset, unreachable commit not removed

╄→尐↘猪︶ㄣ posted on 2021-02-07 19:17:54
Question: I have a small repo that has a couple of commits:

* a0fc4f8 (HEAD -> testbranch) added file.txt
* e6e6a8b (master) hello world now
* f308f53 Made it echo
* f705657 Added hello
* 08a2de3 (tag: initial) initial

Also:

$ git status
On branch testbranch
nothing to commit, working directory clean

I cannot understand the following behavior. In this state I run:

$ git reset initial

I now see:

* e6e6a8b (master) hello world now
* f308f53 Made it echo
* f705657 Added hello
* 08a2de3 (HEAD ->
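The usual explanation: git reset only moves the branch tip; the abandoned commit generally stays alive because the reflog still references it, so git gc will not remove it right away. A minimal sketch in a throwaway repository (the temp path and the inline a@b identity are just for the demo):

```shell
set -e
tmp=$(mktemp -d); cd "$tmp"
git init -q .
git -c user.email=a@b -c user.name=a commit -q --allow-empty -m initial
git tag initial
git -c user.email=a@b -c user.name=a commit -q --allow-empty -m extra
extra=$(git rev-parse HEAD)

git reset -q initial                   # "extra" is now unreachable from any branch

git gc --quiet --prune=now             # ...but the reflog still pins it
git cat-file -e "$extra" && echo "still present"

git reflog expire --expire=now --all   # drop the reflog entries first
git gc --quiet --prune=now             # now the commit can actually be pruned
git cat-file -e "$extra" 2>/dev/null || echo "now gone"
```

Expiring reflogs this aggressively is only sensible in a demo; in a real repository the reflog is exactly the safety net you want after an accidental reset.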

Why git operations become slow as the repo gets bigger

六眼飞鱼酱① posted on 2020-01-24 11:53:29
Question: I know git gets slow when the repo gets bigger. But why? Since git stores files as separate directories and files under .git, I cannot work out why the operations get slower. Let's have a look at the commit operation. Recently, I cloned the webkit repo, branched from master, then committed a 2k file to the branch. But it feels slower than the same operation on my small repo. Because I have not read through the git source code, I guess the commit operation comprises storing the file to the
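For context (my reading, not from the excerpt's author): a commit only writes the new blob, tree, and commit objects, so the size of the history barely matters to the write itself; what grows with a large checkout like WebKit is the index refresh and the working-tree scan. You can watch what a commit actually writes in a throwaway repo:

```shell
tmp=$(mktemp -d); cd "$tmp"
git init -q .
printf 'hello\n' > file.txt
git add file.txt
git -c user.email=a@b -c user.name=a commit -q -m "add file"

# Exactly three new loose objects: one blob, one tree, one commit.
git count-objects -v
```

Committing the same 2k file into a huge repo writes the same three objects; the extra time is spent elsewhere (stat-ing the worktree, rewriting a large index).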

Forcing Remote Repo to Compress (GC) with Git

ぃ、小莉子 posted on 2019-12-18 04:06:09
Question: I'm using Git to version a series of binary files. They compress pretty well, but my central repos do not seem to be compressing when I push to them. They're eating up a decent amount of my quota, so I was looking to see if there is a way to force the remote repo to do a GC. Is this possible? I'm working on Project Locker, so I don't believe I have SSH access to go in and GC the repo myself. Any ideas? Thanks.

Answer 1: If you can't run git gc yourself, you're going to have to trick it into
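For anyone who does get shell access to the remote, the direct route is a one-off git gc there. A sketch using a local bare repository as a stand-in for the hosted remote (all paths here are made up for the demo):

```shell
tmp=$(mktemp -d); cd "$tmp"
git init -q --bare central.git          # stand-in for the hosted remote
git init -q work
cd work
printf 'binary-ish payload\n' > blob.bin
git add blob.bin
git -c user.email=a@b -c user.name=a commit -q -m "add payload"
git push -q ../central.git HEAD:refs/heads/main
cd ..

git -C central.git gc --quiet --prune=now   # repack + prune on the "remote"
ls central.git/objects/pack/                # the pack file(s) now hold the data
```

On a真 hosted service without shell access, the options are narrower: ask support to run gc, or check whether the host exposes a housekeeping button.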
How to skip “Loose Object” popup when running 'git gui'

假如想象 posted on 2019-12-17 10:13:21
Question: When I run 'git gui' I get a popup that says:

This repository currently has approximately 1500 loose objects.

It then suggests compressing the database. I've done this before, and it reduces the loose objects to about 250, but that doesn't suppress the popup. Compressing again doesn't change the number of loose objects. Our current workflow requires significant use of 'rebase', as we are transitioning from Perforce, and Perforce is still the canonical SCM. Once Git is the canonical SCM, we will
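The popup is git-gui's own check, and it has a dedicated switch: gui.gcwarning. Setting it to false suppresses the prompt without changing gc behaviour at all. Sketch in a throwaway repo:

```shell
tmp=$(mktemp -d); cd "$tmp"
git init -q .
git config gui.gcwarning false   # git-gui stops showing the loose-object prompt
git config gui.gcwarning         # prints: false
```

Use `git config --global gui.gcwarning false` to apply it to every repository instead of just one.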

List of all commands that cause git gc --auto

淺唱寂寞╮ posted on 2019-12-17 07:40:08
Question: Is there a definitive list of commands anywhere that cause git gc --auto to run? The git-gc(1) man page simply states:

--auto
    With this option, git gc checks whether any housekeeping is required; if not, it exits without performing any work. Some git commands run git gc --auto after performing operations that could create many loose objects.

(emphasis added) I'm in the process of organising a large migration from SVN to Git. The overwhelming majority of users will be on Windows PCs, and a not
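Any definitive list is version-dependent (to the best of my knowledge, fetch, merge, am, rebase, and receive-pack all invoke it in recent Git versions; the source is the only authoritative answer). For a migration you can sidestep the question entirely: gc.auto set to 0 disables the automatic check in every command. Sketch:

```shell
tmp=$(mktemp -d); cd "$tmp"
git init -q .
git config gc.auto 0    # 0 disables the `git gc --auto` check everywhere
git gc --auto           # now exits immediately without doing any work
git config gc.auto      # prints: 0
```

If you disable it, schedule `git gc` yourself (cron, CI, or a maintenance window), or loose objects will accumulate unchecked.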

Is there a way to limit the amount of memory that “git gc” uses?

◇◆丶佛笑我妖孽 posted on 2019-12-17 07:12:54
Question: I'm hosting a git repo on a shared host. My repo necessarily has a couple of very large files in it, and every time I try to run "git gc" on the repo now, my process gets killed by the shared hosting provider for using too much memory. Is there a way to limit the amount of memory that git gc can consume? My hope is that it can trade memory usage for speed and just take a little longer to do its work.

Answer 1: Yes, have a look at the help page for git config and look at the pack.* options,
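The truncated answer points at the pack.* keys; to the best of my knowledge the ones most relevant to memory are below. The 100m values are illustrative, not recommendations:

```shell
tmp=$(mktemp -d); cd "$tmp"
git init -q .
git config pack.windowMemory  100m   # cap memory used by the delta window
git config pack.packSizeLimit 100m   # split output into packs of <= 100m
git config pack.threads       1      # one thread => only one window in memory
git config pack.windowMemory         # prints: 100m
```

These settings make the repack phase of git gc slower but keep its peak memory bounded, which is exactly the trade-off the question asks for.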

How do I fix these Git GC problems?

纵饮孤独 posted on 2019-12-07 07:41:38
Question: I have a recurring issue where my git repo (I think?) decides it needs to garbage collect. This process takes well over half an hour, and then triggers on every pull/push operation. Running git gc manually takes half an hour, but doesn't seem to fix the issue. The only solution I have found is to delete my repo and clone fresh, which is suboptimal for any number of reasons. My git gc operations may be slow because I have given git some memory limits to stop it from crashing out on git GC
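One common cause of this loop (an assumption on my part, since the excerpt is truncated): when a background `git gc --auto` fails, it leaves a .git/gc.log behind, and every later automatic run just replays the complaint instead of collecting. Clearing the log and doing one full manual collection usually breaks the cycle. Sketch, with the stale log simulated by hand:

```shell
tmp=$(mktemp -d); cd "$tmp"
git init -q .
echo "simulated stale complaint" > .git/gc.log   # stand-in for a failed background gc

rm -f .git/gc.log            # clear the stale log so gc --auto stops repeating it
git gc --quiet --prune=now   # one full manual collection
git gc --auto                # exits quickly now; nothing left to complain about
```

If the manual gc is what crashes (e.g. under the memory limits mentioned above), loosening pack.windowMemory/pack.threads for that one run may be necessary before the loop can be broken.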

Is there any difference between `git gc` and `git repack -ad; git prune`?

前提是你 posted on 2019-12-06 05:27:44
Question: Is there any difference between git gc and git repack -ad; git prune? If yes, what additional steps are done by git gc (or vice versa)? Which one is better to use with regard to space optimization or safety?

Answer 1: Is there any difference between git gc and git repack -ad; git prune? The difference is that by default git gc is very conservative about which housekeeping tasks are needed. For example, it won't run git repack unless the number of loose objects in the repository is above a
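Beyond the threshold behaviour, git gc bundles several maintenance tasks of which repack and prune are only two. As a rough, version-dependent sketch (exact flags and ordering vary with Git version and configuration):

```shell
tmp=$(mktemp -d); cd "$tmp"
git init -q .
git -c user.email=a@b -c user.name=a commit -q --allow-empty -m one

# Approximately what `git gc` runs under the hood:
git reflog expire --all            # expire old reflog entries (per gc.reflogExpire)
git repack -d -l                   # consolidate loose objects into packs
git prune --expire 2.weeks.ago     # drop unreachable objects past the grace period
git worktree prune                 # clean up stale worktree bookkeeping
git rerere gc                      # expire old recorded conflict resolutions
git pack-refs --all --prune        # pack loose refs into .git/packed-refs
```

This is also why git gc is the safer choice day to day: the two-week prune grace period and the reflog expiry policy protect recently orphaned objects, whereas a bare `git repack -ad; git prune` skips the ref and reflog housekeeping entirely.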