Download all images from a single directory of a website

Backend · Unresolved · 5 answers · 1037 views
爱一瞬间的悲伤 asked 2020-12-09 00:29

I need to get all of the images from one website that are contained in a single folder, for instance site.com/images/.*. Is this possible? If so, what's the best way?

5 Answers
  • 2020-12-09 01:01

    If the site allows indexing, all you need to do is:

    wget -r --no-parent http://site.com/images/
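
    If wget is not available, the same idea can be sketched with Python's standard library. The helper name `download_all` is mine, and the list of file names would come from parsing the index page first:

    ```python
    import os
    import urllib.parse
    import urllib.request

    def download_all(base_url, names, dest_dir):
        """Fetch every file in `names` relative to `base_url` into
        `dest_dir` -- a rough equivalent of wget -r --no-parent run
        against a directory whose contents are already known."""
        os.makedirs(dest_dir, exist_ok=True)
        for name in names:
            url = urllib.parse.urljoin(base_url, name)
            urllib.request.urlretrieve(url, os.path.join(dest_dir, name))
    ```

    Unlike wget, this does no recursion or retrying; it is only the final "fetch each link" step.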

  • 2020-12-09 01:10

    Do you have FTP access?

    Do you have shell access?

    With Linux it's pretty easy. Not sure about Windows.

    wget -H -r --level=1 -k -p http://example.com/path/to/images
    

    Edit: Just found wget for Windows.

    Edit 2: I just saw the PHP tag. To create a PHP script that downloads all the images in one go, you will have to create a zip (or equivalent) archive and send it with the correct headers. Here is how to zip a folder in PHP; it wouldn't be hard to restrict it to only the images in that folder. Just edit the given code to say something like:

    foreach ($iterator as $key => $value) {
        if (!is_dir($key)) {
            // Derive the extension from the file name itself, not by
            // splitting the full path on '.', which breaks on dotted dirs.
            $ext = strtolower(pathinfo($key, PATHINFO_EXTENSION));
            switch ($ext) {
                case "png":
                case "gif":
                case "jpg":
                    $zip->addFile(realpath($key), $key) or die("ERROR: Could not add file: $key");
                    break;
            }
        }
    }
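
    For comparison, the same filter-then-archive idea in Python's standard library. The helper name `zip_images` is mine; the returned bytes would still need to be sent with a `Content-Type: application/zip` header:

    ```python
    import io
    import os
    import zipfile

    def zip_images(folder):
        """Add only .png/.gif/.jpg files from `folder` to an in-memory
        zip archive and return its raw bytes."""
        buf = io.BytesIO()
        with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
            for name in sorted(os.listdir(folder)):
                path = os.path.join(folder, name)
                ext = os.path.splitext(name)[1].lower()
                if os.path.isfile(path) and ext in (".png", ".gif", ".jpg"):
                    # Store under the bare file name, flattening the folder.
                    zf.write(path, name)
        return buf.getvalue()
    ```

    Building the archive in memory avoids leaving temporary zip files on the server.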
    
  • 2020-12-09 01:14

    Depends on whether the images directory allows listing its contents. If it does, great; otherwise you would need to spider the website in order to find all the image references to that directory.

    In either case, take a look at wget.
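
    When the directory does allow listing, the index page itself can be parsed for image links. A minimal sketch with Python's html.parser; the names `ImageLinkParser` and `find_image_links` are illustrative, and this assumes a plain autoindex-style page of `<a href>` entries:

    ```python
    import re
    from html.parser import HTMLParser

    class ImageLinkParser(HTMLParser):
        """Collect href targets that look like image files from a
        directory index page (Apache autoindex or similar)."""
        IMAGE_RE = re.compile(r"\.(png|jpe?g|gif)$", re.IGNORECASE)

        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                href = dict(attrs).get("href", "")
                if self.IMAGE_RE.search(href):
                    self.links.append(href)

    def find_image_links(html):
        parser = ImageLinkParser()
        parser.feed(html)
        return parser.links
    ```

    Each collected link could then be fetched with urllib.request.urlretrieve, or handed to wget.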

  • 2020-12-09 01:18

    If you want to see the images a web page is using: in Chrome, press F12 (or find Developer Tools in the menu) and open the Resources tab. In the tree on the left, under Frames, you will find an Images folder listing all the images the page uses.

  • 2020-12-09 01:19

    Have a look at the HTTrack software. It can download whole sites. Give it the address site.com/images/ and it will download everything in that directory (provided directory access is not restricted by the owner).
