I need to get all of the images from one website that are all contained in one folder, for instance site.com/images/. Is this possible? If so, what's the best way?
If the site allows directory indexing, all you need to do is:
wget -r --no-parent http://site.com/images/
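If the listing contains files other than images, wget's accept list can restrict what gets saved; a variant of the command above, assuming the usual image extensions:
wget -r --no-parent -A png,gif,jpg,jpeg http://site.com/images/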
Do you have FTP access?
Do you have shell access?
With Linux it's pretty easy. Not sure about Windows.
wget -H -r --level=1 -k -p http://example.com/path/to/images
Edit: Just found wget for Windows.
Edit 2: I just saw the PHP tag. In order to create a PHP script which downloads all images in one go, you will have to create a zip (or equivalent) archive and send it with the correct headers. Here is how to zip a folder in PHP; it wouldn't be hard to extract only the images in that folder. Just edit the code given to say something like:
foreach ($iterator as $key => $value) { // $iterator and $zip come from the linked folder-zipping example
    if (!is_dir($key)) {
        // Take the extension from the file name rather than exploding
        // the full path on '.', so paths containing dots don't break it
        $ext = strtolower(pathinfo($key, PATHINFO_EXTENSION));
        switch ($ext) {
            case "png":
            case "gif":
            case "jpg":
                $zip->addFile(realpath($key), $key) or die("ERROR: Could not add file: $key");
                break;
        }
    }
}
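To actually send the archive to the browser, a minimal sketch like this would follow, assuming the loop above wrote the archive to a file at $zipPath (the path and filename here are placeholders):

// Minimal sketch: stream a finished zip archive to the browser.
// Assumes $zip->close() has already been called and $zipPath
// points at the archive built by the loop above.
$zipPath = 'images.zip'; // hypothetical output path

header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="images.zip"');
header('Content-Length: ' . filesize($zipPath));
readfile($zipPath);
exit;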
It depends on whether the images directory allows listing its contents. If it does, great; otherwise you would need to spider the website in order to find all the image references to that directory.
In either case, take a look at wget.
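If you do end up spidering, a minimal single-page sketch in PHP might look like the following (assuming the page at $url is publicly reachable; a real spider would also follow links and de-duplicate across pages):

// Minimal sketch: fetch one page and list the <img> sources that
// point into the /images/ directory. $url is a placeholder.
$url = 'http://site.com/';

$html = file_get_contents($url);
$doc = new DOMDocument();
@$doc->loadHTML($html); // suppress warnings from real-world HTML

foreach ($doc->getElementsByTagName('img') as $img) {
    $src = $img->getAttribute('src');
    if (strpos($src, '/images/') !== false) {
        echo $src . PHP_EOL; // or fetch each one with file_get_contents()
    }
}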
If you want to see the images a web page is using: in Chrome, press F12 (or find Developer Tools in the menu) and open the Resources tab. There is a tree on the left; under Frames you will find an Images folder that lists all the images the page uses.
Have a look at the HTTrack software. It can download whole sites. Give it the address site.com/images/ and it will download everything in that directory (provided directory access is not restricted by the owner).
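From the command line, an invocation along these lines should work (a sketch, assuming a command-line HTTrack install; the +site.com/images/* filter limits the mirror to that directory):
httrack "http://site.com/images/" -O ./site-images "+site.com/images/*"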