I'd like to download web pages while supplying URLs from stdin. Essentially, one process continuously produces URLs to stdout (or a file), and I want to pipe them to wget or curl.
The efficient way is to avoid using xargs if you are downloading files from the same web server:
wget -q -N -i - << EOF
http://sitename/dir1/file1
http://sitename/dir2/file2
http://sitename/dir3/file3
EOF
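
For the streaming case described in the question, a minimal sketch along the same lines is shown below. Here produce_urls is a hypothetical stand-in for whatever process generates the URLs; the key point is that -i - makes a single wget process read its URL list from standard input.

# Hypothetical producer: prints one URL per line to stdout.
produce_urls() {
    printf 'http://sitename/dir1/file1\n'
    printf 'http://sitename/dir2/file2\n'
    printf 'http://sitename/dir3/file3\n'
}

# Feed all URLs to one wget process via stdin:
#   -i -  read the list of URLs from standard input
#   -q    quiet output
#   -N    only download if the remote file is newer than the local copy
produce_urls | wget -q -N -i -

Whether downloads begin before the producer has finished writing depends on how wget buffers its input, so for a producer that never terminates you may need to verify this behaviour with your wget version.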