download list of images from urls

自闭症网瘾萝莉.ら submitted on 2019-12-04 05:43:10

Question


I need to find (preferably) or build an app to download a lot of images. Each image has a distinct URL. There are many thousands, so doing it manually is a huge effort. The list is currently in a CSV file. (It is essentially a list of products, each with identifying info (name, brand, barcode, etc.) and a link to a product image.) I'd like to loop through the list and download each image file. Ideally I'd like to rename each one to something like barcode.jpg. I've looked at a number of image scrapers, but haven't found one that works quite this way. I'd be very appreciative of any leads to the right tool, or other ideas.


Answer 1:


Are you on Windows or Mac/Linux? On Windows you can use a PowerShell script for this; on Mac/Linux a shell script of about 1-5 lines of code will do.

Here's one way to do this:

# show what's inside the file
cat urlsofproducts.csv

http://bit.ly/noexist/obj101.jpg, screwdriver, blackndecker
http://bit.ly/noexist/obj102.jpg, screwdriver, acme

# this one-liner will GENERATE one download-command per item, but will not execute them
perl -MFile::Basename -F", " -anlE "say qq(wget -q \$F[0] -O '\$F[1]--\$F[2]--).  basename(\$F[0]) .q(')" urlsofproducts.csv 



# Output:
wget -q http://bit.ly/noexist/obj101.jpg -O 'screwdriver--blackndecker--obj101.jpg'
wget -q http://bit.ly/noexist/obj102.jpg -O 'screwdriver--acme--obj102.jpg'

Now feed the generated wget commands back into the shell to run them.
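One simple way to do that, assuming a POSIX shell such as bash, is to pipe the generated lines straight into the shell, or write them to a script file first so you can review them before running:

# pipe the generated wget commands straight into bash
perl -MFile::Basename -F", " -anlE "say qq(wget -q \$F[0] -O '\$F[1]--\$F[2]--). basename(\$F[0]) .q(')" urlsofproducts.csv | bash

# or save them to a script, inspect it, then run it
perl -MFile::Basename -F", " -anlE "say qq(wget -q \$F[0] -O '\$F[1]--\$F[2]--). basename(\$F[0]) .q(')" urlsofproducts.csv > download.sh
bash download.sh

If you want each file saved as barcode.jpg, as the question describes, a similar one-liner sketch would work; this assumes (hypothetically) that the barcode is the fourth comma-separated field of the CSV, i.e. \$F[3]:

# assumption: the 4th CSV field ($F[3]) holds the barcode
perl -F", " -anlE "say qq(wget -q \$F[0] -O '\$F[3].jpg')" urlsofproducts.csv | bash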



Source: https://stackoverflow.com/questions/29551419/download-list-of-images-from-urls
