bash: start multiple chained commands in background

Asked by 生来不讨喜 on 2020-12-07 22:47

I'm trying to run some commands in parallel, in the background, using bash. Here's what I'm trying to do:

forloop {
  //this part is actually written in perl
  //call command sequence
  print `touch .file1.lock; cp bigfile1 /destination; rm .file1.lock;`;
}
15 Answers
  • 2020-12-07 23:27

    Thanks Hugh, that did it:

    adrianp@frost:~$ (echo "started"; sleep 15; echo "stopped")
    started
    stopped
    adrianp@frost:~$ (echo "started"; sleep 15; echo "stopped") &
    started
    [1] 7101
    adrianp@frost:~$ stopped
    
    [1]+  Done                    ( echo "started"; sleep 15; echo "stopped" )
    adrianp@frost:~$ 
    

    The other ideas don't work because they start each command in the background, and not the command sequence (which is important in my case!).

    Thank you again!

  • 2020-12-07 23:28

    Run the command by using an at job:

    # date
    jue sep 13 12:43:21 CEST 2012
    # at 12:45
    warning: commands will be executed using /bin/sh
    at> command1
    at> command2
    at> ...
    at> <EOT>        (press Ctrl-D to finish)
    job 20 at Thu Sep 13 12:45:00 2012
    

    The result will be sent to your account by mail.

  • 2020-12-07 23:29

    I haven't tested this, but how about

    print `(touch .file1.lock; cp bigfile1 /destination; rm .file1.lock;) &`;
    

    The parentheses mean execute in a subshell, but that shouldn't hurt.
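
    One caveat, equally untested: backticks read the child's stdout until the pipe closes, so if the backgrounded subshell keeps stdout open, the capturing caller still blocks until the whole sequence finishes. Redirecting the subshell's output detaches it properly (a sketch with placeholder commands):

```shell
# Background the whole sequence with stdout/stderr detached, so a caller
# that captures output (e.g. Perl backticks) is not held open by the child.
(sleep 2; echo "sequence done") >/dev/null 2>&1 &
echo "caller continues at once"
```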

  • 2020-12-07 23:29

    run the commands in a subshell:

    (command1 ; command2 ; command3) &
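
    A minimal variant of the same idea, with placeholder commands, showing that `$!` gives the PID of the backgrounded subshell so you can wait for the whole sequence rather than a single command:

```shell
# Background the sequence as one subshell job and wait for all of it.
out=$(mktemp)
(echo one >"$out"; sleep 1; echo two >>"$out") &
wait $!          # waits for the entire sequence, not just the last command
cat "$out"       # prints: one, then two
rm -f "$out"
```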
    
  • 2020-12-07 23:30

    I don't know why nobody replied with the proper solution:

    my @children;
    for (...) {
        ...
        my $child = fork;
        exec "touch .file1.lock; cp bigfile1 /destination; rm .file1.lock;" if $child == 0;
        push @children, $child;
    }
    # and if you want to wait for them to finish,
    waitpid($_, 0) for @children;
    

    This causes Perl to spawn children to run each command, and allows you to wait for all the children to complete before proceeding.
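
    For comparison, the same fork-and-wait pattern can be sketched directly in the shell, collecting each background job's PID (placeholder jobs, not the poster's actual commands):

```shell
# Spawn each command sequence in the background, remember its PID,
# then wait for every child before proceeding (shell analogue of waitpid).
pids=""
for i in 1 2 3; do
    { sleep 1; echo "job $i done"; } &
    pids="$pids $!"
done
for p in $pids; do
    wait "$p"
done
echo "all children finished"
```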

    By the way,

    print `some command`
    

    and

    system "some command"
    

    output the same contents to stdout, but the first has a higher overhead, as Perl has to capture all of "some command"'s output before printing it.
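
    The same trade-off exists in the shell itself, which may make the distinction concrete: command substitution captures output in memory first (like Perl's backticks), while running the command directly streams it (like system):

```shell
# Direct execution: output streams straight to stdout (like system).
echo "streamed directly"

# Command substitution: output is captured into a variable first,
# which costs an extra pipe and buffering (like backticks).
captured=$(echo "captured first")
echo "$captured"
```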

  • 2020-12-07 23:31

    I stumbled upon this thread and decided to put together a snippet that spawns chained statements as background jobs. I tested it with Bash on Linux, ksh on IBM AIX, and BusyBox's ash on Android, so I think it's safe to say it works on any Bourne-like shell.

    processes=0
    for X in `seq 0 10`; do
       processes=$((processes+1))
       { echo "Job $processes"; sleep 3; echo "End of job $processes"; } &
       if [ "$processes" -eq 5 ]; then
          wait
          processes=0
       fi
    done
    

    This code runs a number of background jobs up to a certain limit of concurrent jobs. You can use this, for example, to recompress a lot of gzipped files with xz without having a huge bunch of xz processes eat your entire memory and make your computer throw up: in that case, use * as the for loop's list and replace the batch job with gzip -cd "$X" | xz -9c > "${X%.gz}.xz".
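
    That recompression case could look like the following sketch, under the stated assumptions (`.gz` files in the current directory, `xz` installed, batch size of 5):

```shell
# Recompress gzip files to xz, running at most 5 jobs at a time.
processes=0
for X in *.gz; do
   processes=$((processes+1))
   { gzip -cd "$X" | xz -9c > "${X%.gz}.xz"; } &
   if [ "$processes" -eq 5 ]; then
      wait
      processes=0
   fi
done
wait   # also wait for the final, possibly partial batch
```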
