Parallel tar with split for large folders
Question: I have a really huge folder that I would like to gzip and split for archiving:

    #!/bin/bash
    dir=$1
    name=$2
    size=32000m
    tar -czf /dev/stdout ${dir} | split -a 5 -d -b $size - ${name}

Is there a way to speed this up with GNU parallel? Thanks.

Answer 1: The best tool for parallel gzip compression seems to be pigz. See the comparisons. With it you can use a command like this:

    tar -c "${dir}" | pigz -c | split -a 5 -d -b "${size}" - "${name}"

With its -p option you can also specify the number of compression threads to use.
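A minimal sketch of the full script with pigz, assuming the same dir, name, and size variables as the question and an (assumed) cap of 8 threads; adjust -p to your core count, or omit it to let pigz default to the number of online processors:

    #!/bin/bash
    dir=$1
    name=$2
    size=32000m

    # Archive the folder, compress with up to 8 parallel pigz threads,
    # then split the stream into numbered chunks of $size each
    tar -c "${dir}" | pigz -p 8 -c | split -a 5 -d -b "${size}" - "${name}"

To restore later, the chunks can be concatenated back into one stream and extracted (gzip decompression itself cannot be parallelized, so this side gains little from pigz):

    # Reassemble the numbered chunks and extract the archive
    cat "${name}"* | pigz -dc | tar -x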