I'd like to download web pages while supplying URLs from stdin. Essentially, one process continuously produces URLs to stdout (or a file), and I want to pipe them to wget or curl.
What you need is xargs with -n1, which runs the command once per input line. E.g.:

tail -f 1.log | xargs -n1 wget -O - -q

Here -n1 passes one URL per wget invocation, -O - writes each downloaded page to stdout, and -q suppresses wget's progress output. Because tail -f never exits, the pipeline keeps fetching as new URLs are appended to 1.log.
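To see what xargs -n1 actually does, here is a minimal sketch with echo standing in for wget, and finite printf input standing in for the live tail -f stream (the URLs are made up for illustration):

```shell
# Each input line becomes one separate command invocation.
# In practice, replace the printf with: tail -f urls.log
# and replace `echo GET` with your wget/curl command.
printf 'http://a/1\nhttp://b/2\n' | xargs -n1 echo GET
# Prints:
#   GET http://a/1
#   GET http://b/2
```

If downloads should overlap rather than run one at a time, GNU xargs also accepts -P to run several invocations in parallel, e.g. `tail -f urls.log | xargs -n1 -P4 curl -s` (curl -s silences its progress meter; -P4 keeps up to four fetches running at once).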