Parallel download using Curl command line utility

抹茶落季 2020-12-13 19:08

I want to download some pages from a website, and I did it successfully using curl, but I was wondering whether curl can somehow download multiple pages at a time, the way most download managers do, to speed things up a bit.

8 Answers
  •  被撕碎了的回忆
    2020-12-13 19:25

    Running a limited number of processes is easy if your system has commands like pidof or pgrep which, given a process name, return the PIDs (the count of the PIDs tells you how many are running).
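
    With pgrep, for instance, the count can be read in a single call (a minimal sketch, assuming a procps-style pgrep that supports -c and -x):

    # count curl processes by exact name; prints 0 when none are running
    running_curl() {
        pgrep -c -x curl
    }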

    A complete script, using pidof, could look something like this:

    #!/bin/sh
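    # maximum number of curl downloads to run at the same time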
    max=4
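    # count how many curl processes are currently running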
    running_curl() {
        set -- $(pidof curl)
        echo $#
    }
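    # process each URL passed on the command line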
    while [ $# -gt 0 ]; do
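        # wait while $max or more curl processes are already running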
        while [ $(running_curl) -ge $max ] ; do
            sleep 1
        done
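        # start the download in the background, saving under a path derived from the URL (protocol stripped)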
        curl "$1" --create-dirs -o "${1##*://}" &
        shift
    done
    

    to call like this:

    script.sh $(for i in `seq 1 10`; do printf "http://example/%s.html " "$i"; done)
    

    The curl line of the script is untested.
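
    As a side note, if the installed curl is 7.66.0 or newer, it can run transfers in parallel by itself via -Z/--parallel, so no wrapper script is needed (a sketch reusing the example URLs above; --parallel-max caps the number of concurrent transfers):

    curl --parallel --parallel-max 4 --create-dirs \
         -o "example/1.html" "http://example/1.html" \
         -o "example/2.html" "http://example/2.html"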
