Parallel tar with split for large folders

Posted by 徘徊边缘 on 2019-12-11 02:35:57

Question


I have some really huge folders that I would like to gzip and split for archiving:

#!/bin/bash
dir=$1
name=$2
size=32000m
# stream a gzipped tar of the directory and cut it into ${size} pieces
tar -czf - "${dir}" | split -a 5 -d -b "${size}" - "${name}"

Is there a way to speed this up with GNU parallel? Thanks.
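(For reference, the pieces produced by this pipeline should later be recombinable with something like the following, assuming the fixed-width numeric suffixes keep them in lexical order:

cat "${name}"* | tar -xzf -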


Answer 1:


It seems the best tool for parallel gzip compression is pigz. See the comparisons.

With it you can have a command like this:

tar -c "${dir}" | pigz -c | split -a 5 -d -b "${size}" - "${name}"

With its -p option you can also specify the number of threads to use (the default is the number of online processors, or 8 if unknown). See pigz --help or man pigz for more information.
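For instance, the same pipeline with the thread count set explicitly might look like this (the value 8 is only illustrative; choose what suits your machine):

tar -c "${dir}" | pigz -c -p 8 | split -a 5 -d -b "${size}" - "${name}"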

UPDATE

Using GNU parallel you could do something like this:

contents=("$dir"/*)     # top-level entries of the input directory
outdir=/somewhere
# create one archive per entry, compressing several entries in parallel
parallel tar -cvpzf "${outdir}/{}.tar.gz" "$dir/{}" ::: "${contents[@]##*/}"
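Note that this approach produces one compressed archive per top-level entry of $dir rather than a single split stream. If you want to cap how many tar jobs run at once, GNU parallel's -j option does that; a hypothetical limit of 4 concurrent jobs would look like:

parallel -j 4 tar -cvpzf "${outdir}/{}.tar.gz" "$dir/{}" ::: "${contents[@]##*/}"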


Source: https://stackoverflow.com/questions/18557195/parallel-tar-with-split-for-large-folders
