WGET - Simultaneous connections are SLOW


Question


I use the following command to append the server's response for each URL in a list to a single output file:

wget -i /Applications/MAMP/htdocs/data/urls.txt -O - \
     >> /Applications/MAMP/htdocs/data/export.txt

This works fine and when finished it says:

Total wall clock time: 1h 49m 32s
Downloaded: 9999 files, 3.5M in 0.3s (28.5 MB/s)

In order to speed this up I used:

cat /Applications/MAMP/htdocs/data/urls.txt | \
   tr -d '\r' | \
   xargs -P 10 $(which wget) -i - -O - \
   >> /Applications/MAMP/htdocs/data/export.txt

This opens simultaneous connections, which makes it a little faster:

Total wall clock time: 1h 40m 10s
Downloaded: 3943 files, 8.5M in 0.3s (28.5 MB/s)

As you can see, it somehow omits more than half of the files and takes roughly the same time to finish, and I cannot work out why. What I want to do is download 10 files at once (parallel processing) using xargs, moving on to the next URL as soon as one finishes writing to STDOUT. Am I missing something, or can this be done some other way?

On a related note, can someone tell me what a sensible limit is for the number of connections? It would really help to know how many connections my machine can handle without slowing the system down too much, or even risking some kind of system failure.

My API rate-limiting is as follows:

  • Number of requests per minute: 100
  • Number of mapping jobs in a single request: 100
  • Total number of mapping jobs per minute: 10,000


Answer 1:


A few things:

  • I don't think you need the tr, unless there's something weird about your input file. xargs expects one item per line.
  • man xargs advises you to "Use the -n option with -P; otherwise chances are that only one exec will be done."
  • You are using wget -i -, which tells wget to read URLs from stdin, but xargs will be supplying the URLs as command-line arguments to wget.
  • To debug, substitute echo for wget and check how it is batching the parameters (see the sketch just below).
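
For example, a minimal way to inspect the batching (my own sketch, assuming GNU xargs for the long options and a urls.txt in the current directory):

 cat urls.txt | xargs --max-procs=10 --max-args=100 echo

Each output line then shows one batch of up to 100 URLs that would be handed to a single wget invocation.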

So this should work:

 cat urls.txt | \
 xargs --max-procs=10 --max-args=100 wget --output-document=- 

(I've preferred the long options here: --max-procs is -P, and --max-args is -n.)

See wget download with multiple simultaneous connections for alternative ways of doing the same thing, including GNU parallel and some dedicated multi-threading HTTP clients.
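As one illustration of such a dedicated client (my own example, not taken from the linked question, and assuming aria2 is installed), aria2c can read the same URL list and fetch several downloads concurrently:

 aria2c --input-file=/Applications/MAMP/htdocs/data/urls.txt --max-concurrent-downloads=10

Note that, unlike the wget -O - pipelines above, aria2c saves each URL to its own file rather than concatenating everything to stdout.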

However, in most circumstances I would not expect parallelising to significantly increase your download rate.

In a typical use case, the bottleneck is likely to be your network link to the server. During a single-threaded download, you would expect to saturate the slowest link in that route. You may get very slight gains with two threads, because one thread can be downloading while the other is sending requests. But this will be a marginal gain.

So this approach is only likely to be worthwhile if you're fetching from multiple servers, and the slowest link in the route to some servers is not at the client end.




Answer 2:


Have you tried GNU Parallel? It will be something like this:

parallel -a /Applications/MAMP/htdocs/data/urls.txt wget -O - > result.txt

You can use this to see what it will do without actually doing anything:

parallel --dry-run ...

And either of these to see progress:

parallel --progress ...
parallel --bar ...

As your input file seems to be a bit of a mess, you can strip carriage returns like this:

tr -d '\r' < /Applications/MAMP/htdocs/data/urls.txt | parallel wget {} -O - > result.txt
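
Putting those options together, a sketch of the full command (assuming GNU Parallel is installed; -j 10 mirrors the 10-way parallelism from the question) might be:

 tr -d '\r' < /Applications/MAMP/htdocs/data/urls.txt | \
    parallel --bar -j 10 wget -O - {} > result.txt

Here -j 10 caps the number of simultaneous wget jobs at 10 and --bar shows overall progress, while the combined output is still collected into result.txt.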


Source: https://stackoverflow.com/questions/45217605/wget-simultaneous-connections-are-slow
