Doing parallel processing in bash?

Asked by 清歌不尽 on 2020-12-13 07:24 · 3 answers · 1894 views

I've thousands of PNG files which I'd like to make smaller with pngcrush. I have a simple find .. -exec job, but it's sequential. My machine has qui…
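The sequential starting point described might look something like this (the path, file pattern, and temp-file naming are illustrative stand-ins, not from the question):

```shell
# Sequential baseline: one pngcrush at a time, one file per -exec invocation.
# /path and the *.tmp naming are hypothetical.
find /path -name '*.png' -exec sh -c 'pngcrush "$1" "$1.tmp" && mv "$1.tmp" "$1"' sh {} \;
```

Each file waits for the previous one to finish, so only one core is ever busy; the answers below parallelize exactly this loop.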

3 Answers
  • 2020-12-13 07:40

    You can use custom find/xargs solutions (see Bart Sas' answer), but when things become more complex you have at least two powerful options:

    1. parallel (from package moreutils)
    2. GNU parallel
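    The two tools share a name but take different syntax. A rough sketch of each (my illustration, not from the answer; it assumes pngcrush's -ow overwrite-in-place option and a worker count of 4):

    ```shell
    # moreutils parallel: fixed "parallel [opts] command -- args" form;
    # runs the command once per argument, -j sets the worker count.
    parallel -j 4 pngcrush -ow -- *.png

    # GNU parallel: reads arguments from stdin (or after :::) and
    # supports replacement strings such as {}.
    find . -name '*.png' | parallel -j 4 pngcrush -ow {}
    ```

    GNU parallel is generally the more featureful of the two (replacement strings, job logs, remote execution), which is why the answers below use it.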
  • 2020-12-13 07:42

    You can use xargs to run multiple processes in parallel:

    find /path -print0 | xargs -0 -n 1 -P <nr_procs> sh -c 'pngcrush "$1" temp.$$ && mv temp.$$ "$1"' sh
    

    xargs reads the NUL-separated (-0) list of files produced by find and runs the provided command (sh -c '...' sh, where the trailing sh becomes $0) with one filename at a time (-n 1). xargs keeps up to <nr_procs> processes (-P <nr_procs>) running in parallel.
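    A common refinement (my addition, not part of the answer) is to size the worker pool from the machine itself rather than hard-coding <nr_procs>; "$(nproc)" is GNU coreutils:

    ```shell
    # Same pattern, one worker per CPU core; quoting "$1" guards against
    # spaces in filenames. Because -n 1 starts a fresh sh per file,
    # temp.$$ (the inner shell's PID) is unique to each worker.
    find /path -name '*.png' -print0 |
      xargs -0 -n 1 -P "$(nproc)" sh -c 'pngcrush "$1" temp.$$ && mv temp.$$ "$1"' sh
    ```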

  • 2020-12-13 07:45

    With GNU Parallel http://www.gnu.org/software/parallel/ it can be done like this:

    find /path -print0 | parallel -0 pngcrush {} {.}.temp '&&' mv {.}.temp {} 
    
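    Here {} is the input filename and {.} is the filename with its extension stripped (both are standard GNU parallel replacement strings), and the quoted '&&' keeps the outer shell from splitting the pipeline, so parallel runs both commands per file. You can preview the generated commands with --dry-run:

    ```shell
    # Preview what GNU parallel would run, without executing anything:
    parallel --dry-run pngcrush {} {.}.temp ::: photo.png
    # prints: pngcrush photo.png photo.temp
    ```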

    Learn more:

    • Watch the intro video for a quick introduction: https://www.youtube.com/playlist?list=PL284C9FF2488BC6D1
    • Walk through the tutorial (man parallel_tutorial). Your command line will love you for it.