bash: start multiple chained commands in background

Tags: 前端 · Status: unresolved · 15 answers · 911 views
Asked by 生来不讨喜 on 2020-12-07 22:47

I'm trying to run some commands in parallel, in the background, using bash. Here's what I'm trying to do:

forloop {
  //this part is actually written in perl
15 answers
  • 2020-12-07 23:32
    for command in $commands
    do
        "$command" &
    done
    wait
    

    The ampersand at the end of the command runs it in the background, and the wait waits until the background task is completed.
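    The loop above backgrounds one simple command per iteration. When each job is itself a chain of commands, as in the question, the whole chain can be grouped in a subshell and the group backgrounded as a unit. A minimal sketch of that idea, using a temporary directory and `echo` stand-ins instead of the question's real files:

```shell
#!/bin/sh
# Each ( ... ) group runs its chained commands sequentially,
# but the groups themselves run in parallel in the background.
tmpdir=$(mktemp -d)

( touch "$tmpdir/.file1.lock" && echo data1 > "$tmpdir/bigfile1" && rm "$tmpdir/.file1.lock" ) &
( touch "$tmpdir/.file2.lock" && echo data2 > "$tmpdir/bigfile2" && rm "$tmpdir/.file2.lock" ) &

wait   # blocks until both background groups have finished
ls "$tmpdir"
```

    After `wait` returns, both "bigfile" copies exist and both lock files have been removed.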

  • 2020-12-07 23:32

    Just in case someone is still interested, you can do it without calling a subshell, like this:

    print `touch .file1.lock && cp bigfile1 /destination && rm .file1.lock &`;
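    One caveat with this approach: backticks (and `$(...)`) capture the child's stdout, so the caller still blocks until that pipe closes, even with a trailing `&`. Redirecting the background job's output lets it detach immediately. A small sketch of the difference, using `sleep` as a stand-in for the copy:

```shell
#!/bin/sh
start=$(date +%s)
# The background job inherits the capture pipe, so the command
# substitution cannot finish until "sleep" exits:
out1=$( (sleep 2 &) )
mid=$(date +%s)
# With stdout and stderr redirected, the job detaches and the
# substitution returns immediately:
out2=$( (sleep 2 >/dev/null 2>&1 &) )
end=$(date +%s)
echo "blocking: $((mid - start))s, detached: $((end - mid))s"
```

    The same applies to Perl backticks: append `>/dev/null 2>&1` inside the backticks if the chain should truly run detached.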
    
  • 2020-12-07 23:42

    GavinCattell got the closest (for bash, IMO), but as Mad_Ady pointed out, it would not handle the "lock" files. This should:

    pids=
    for file in bigfile*
    do
        # Skip if the destination is already newer than the source...
        targ=/destination/$(basename "${file}")
        [ "$targ" -nt "$file" ] && continue
    
        # Use a lock file:  ".fileN.lock" for each "bigfileN"
        lock=".${file##*/big}.lock"
        ( touch "$lock"; cp "$file" "$targ"; rm "$lock" ) &
        pids="$pids $!"
    done
    wait $pids
    

    If there are other jobs pending, the wait will wait for those, too. If you need to wait for only the copies, you can accumulate those PIDs and wait for only those. If not, you could delete the three lines with "pids", but the version above is more general.

    In addition, I added a check that skips the copy altogether when the destination is already up to date.

    Incidentally, it looks like you're copying new files to an FTP repository (or similar). If so, you could consider a copy/rename strategy instead of the lock files (but that's another topic).
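    If the exit status of each copy matters, `wait` can also be called once per saved PID, since it then returns that job's status. A sketch of that pattern, with `true`/`false` standing in for copies that succeed or fail:

```shell
#!/bin/sh
# Collect PIDs as in the answer above, then wait on each one
# individually so failures can be counted.
pids=
for cmd in true false true
do
    $cmd &
    pids="$pids $!"
done

fail=0
for pid in $pids
do
    wait "$pid" || fail=$((fail + 1))
done
echo "failed jobs: $fail"   # prints "failed jobs: 1"
```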
