How to quickly find all git repos under a directory

暖寄归人 2020-12-12 13:14

The following bash script is slow when scanning for .git directories because it looks at every directory. If I have a collection of large repositories, it takes a long time.

8 Answers
  • 2020-12-12 13:49

    This answer combines the partial answer provided by @Greg Barrett with my optimized answer above.

    #!/bin/bash
    
    # Update all git directories below current directory or specified directory
    # Skips directories that contain a file called .ignore
    
    HIGHLIGHT="\e[01;34m"
    NORMAL='\e[00m'
    
    export PATH=${PATH/':./:'/:}
    export PATH=${PATH/':./bin:'/:}
    #echo "$PATH"
    
    echo -e "${HIGHLIGHT}Scanning ${PWD}${NORMAL}"
    
    # Use -print0 and a null-delimited read so directory names
    # containing spaces or newlines are handled correctly.
    find "$@" -type d \( -execdir test -e {}/.ignore \; -prune \) \
              -o \( -execdir test -d {}/.git \; -prune -print0 \) |
    while IFS= read -r -d '' d; do
      cd "$d" > /dev/null
      echo -e "\n${HIGHLIGHT}Updating ${PWD}${NORMAL}"
      git pull
      cd - > /dev/null
    done
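
    To see which directories that `find` expression actually selects, here is a minimal sketch against a throwaway tree (the directory names `repo`, `ignored`, and `plain` are made up for illustration; `-execdir` with `{}` embedded in an argument assumes GNU find):

    ```shell
    #!/bin/bash
    # Throwaway tree: one repo, one repo marked with .ignore, one plain dir.
    tmp="$(mktemp -d)"
    mkdir -p "$tmp/repo/.git" "$tmp/ignored/.git" "$tmp/plain"
    touch "$tmp/ignored/.ignore"

    # Same expression as the script above: prune trees containing .ignore,
    # prune and print directories that contain a .git directory.
    found="$(find "$tmp" -type d \( -execdir test -e {}/.ignore \; -prune \) \
                         -o \( -execdir test -d {}/.git \; -prune -print \))"
    echo "$found"

    rm -rf "$tmp"
    ```

    Only `$tmp/repo` should be printed: `ignored` is pruned by its `.ignore` file, and `plain` matches neither test.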
    
  • 2020-12-12 13:52

    Check out Dennis' answer in this post about find's -prune option:

    How to use '-prune' option of 'find' in sh?

    find . -name .git -type d -prune
    

    That will speed things up a bit, since find won't descend into .git directories. But it still descends into git repositories themselves, looking for other .git folders, and that could be a costly operation.

    What would be cool is if there were some sort of find lookahead pruning mechanism, where if a folder has a subfolder called .git, find would prune that whole folder...

    That said, I'm betting your bottleneck is the network operation 'git pull', not the find command, as others have noted in the comments.
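
    That "lookahead" idea can in fact be expressed with find itself: prune (and print) any directory that contains a `.git` subdirectory, so find never descends into the repository's contents. A minimal sketch (the `work`, `repo`, and `misc` names are illustrative; substituting `{}` inside a larger argument assumes GNU find, and the `test` runs once per directory, which costs a fork each time):

    ```shell
    #!/bin/bash
    # Throwaway tree: a repo nested under work/, plus an unrelated dir.
    tmp="$(mktemp -d)"
    mkdir -p "$tmp/work/repo/.git/objects" "$tmp/work/repo/src" "$tmp/misc"

    # Prune at the repo root itself: find stops before entering repo/src
    # or repo/.git, so large working trees are never traversed.
    repos="$(find "$tmp" -type d -exec test -d '{}/.git' \; -prune -print)"
    echo "$repos"

    rm -rf "$tmp"
    ```

    Here only `$tmp/work/repo` should be printed; neither `repo/src` nor `repo/.git` is ever visited.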
