GZip every file separately

Asked by 囚心锁ツ on 2020-12-13 17:40 · front-end · 6 answers · 1567 views

How can we GZip every file separately?

I don't want to have all of the files in a big tar.

6 Answers
  • 2020-12-13 17:49

    Try a loop:

    $ for file in *; do gzip "$file"; done
    
  • 2020-12-13 17:56

    If you want to gzip every file recursively, you could use find piped to xargs:

    $ find . -type f -print0 | xargs -0r gzip
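
    One detail worth noting: on a second run, gzip will complain about files that already end in `.gz`. Excluding them keeps the pipeline safe to re-run (a sketch based on the same find/xargs approach):

    ```shell
    # Recursively gzip regular files, skipping ones already compressed.
    # -print0 / -0 handle names containing spaces or newlines; xargs -r
    # avoids running gzip at all when find matches nothing.
    find . -type f ! -name '*.gz' -print0 | xargs -0r gzip
    ```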
    
  • 2020-12-13 18:05

    You can use `gzip *`


    Note:

    • This will gzip each file individually and DELETE the original.
    • Use the -k (--keep) option to keep the original files.
    • This may fail if you have a huge number of files, because of the shell's argument-list limit.
    • To run gzip in parallel, see @MarkSetchell's answer below.
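
    For example, a minimal sketch (run in a scratch directory) that keeps the originals alongside the compressed copies; note that -k requires gzip 1.6 or later:

    ```shell
    # Work in a throwaway directory so nothing else is touched.
    cd "$(mktemp -d)"
    printf 'hello\n' > a.txt
    printf 'world\n' > b.txt

    # Compress each file individually; -k keeps the originals,
    # -v prints each file name as it is processed.
    gzip -kv *
    ```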
  • 2020-12-13 18:05

    Or, if you have pigz (a gzip utility that parallelizes compression across multiple processors and cores):

    pigz *
    
  • 2020-12-13 18:10

    After seven years, this highly upvoted comment still doesn't have its own full-fledged answer, so I'm promoting it now:

    gzip -r .

    This has two advantages over the currently accepted answer: it works recursively if there are any subdirectories, and it won't fail with "Argument list too long" if the number of files is very large.
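
    A quick sketch of the recursive behaviour in a throwaway directory:

    ```shell
    # Build a small tree with a nested file.
    dir=$(mktemp -d)
    mkdir "$dir/sub"
    printf 'top\n'    > "$dir/top.txt"
    printf 'nested\n' > "$dir/sub/nested.txt"

    # -r walks the directory tree and gzips every regular file in place.
    gzip -r "$dir"

    # Both files, including the nested one, are now .gz files.
    find "$dir" -type f -name '*.gz'
    ```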

  • 2020-12-13 18:12

    Easy and very fast answer that will use all your CPU cores in parallel:

    parallel gzip ::: *
    

    GNU Parallel is a fantastic tool that should be used far more in a world where CPUs are getting more cores rather than more speed. Its documentation has loads of worked examples that are well worth 10 minutes of reading.
