wget or curl from stdin

Submitted by 陌路散爱 on 2019-11-30 08:26:27

What you need is xargs, which turns each line of stdin into a command argument. E.g.

tail -f 1.log | xargs -n1 wget -O - -q

You can do this with cURL, but your input needs to be properly formatted. Example alfa.txt:

url example.com
output example.htm
url stackoverflow.com
output stackoverflow.htm

Alternate example:

url stackoverflow.com/questions
remote-name
url stackoverflow.com/documentation
remote-name

Example command:

cat alfa.txt | curl -K-

Use xargs, which converts stdin into command-line arguments.

tail 1.log | xargs -L 1 wget

Try piping the tail -f output through python3 -c $'import pycurl\nc = pycurl.Curl()\nwhile True: c.setopt(pycurl.URL, input().strip()); c.perform()'

This gets curl (well, you probably meant the command-line curl and I'm calling it as a library from a Python one-liner, but it's still curl) to fetch each URL immediately, while still keeping the socket to the server open if you're requesting multiple URLs from the same server in sequence. It's not completely robust, though: if one of your URLs is bad, the whole command will fail (you might want to make it a proper Python script and add try / except to handle this), and there's also the small detail that it will throw EOFError on EOF (but I'm assuming that doesn't matter if you're using tail -f).
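To illustrate the "proper Python script with try / except" suggested above, here is a minimal sketch. The function names (fetch_urls, pycurl_fetch) are my own, and it assumes pycurl is installed; the fetch callback is injectable so the error-handling loop can be exercised without a network.

```python
import sys

def fetch_urls(lines, fetch):
    # Fetch each URL from an iterable of lines; a failing URL is
    # logged and skipped instead of aborting the whole run.
    ok, failed = [], []
    for line in lines:
        url = line.strip()
        if not url:
            continue
        try:
            fetch(url)
            ok.append(url)
        except Exception as exc:
            failed.append(url)
            print("skipping %s: %s" % (url, exc), file=sys.stderr)
    return ok, failed

def pycurl_fetch():
    # Assumes pycurl is available. One Curl handle is reused, so
    # connections to the same host stay open between requests --
    # the same advantage the one-liner has.
    import pycurl
    c = pycurl.Curl()
    def fetch(url):
        c.setopt(pycurl.URL, url)
        c.perform()
    return fetch

if __name__ == "__main__":
    # e.g.  tail -f 1.log | python3 fetch.py
    fetch_urls(sys.stdin, pycurl_fetch())
```

Because fetch is a parameter, you could just as easily plug in urllib.request.urlopen instead of pycurl, at the cost of losing the kept-open connection.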
