I need to get all of the images from one website that are all contained in one folder, like for instance (site.com/images/.*). Is this possible? If so, what's the best way?
5 Answers
Do you have FTP access?
Do you have shell access?
With Linux it's pretty easy. Not sure about Windows.
wget -H -r --level=1 -k -p http://example.com/path/to/images
Edit: Just found wget for Windows.
Edit 2: I just saw the PHP tag. To create a PHP script that downloads all the images in one go, you will have to create a zip (or equivalent) archive and send it with the correct headers. Here is how to zip a folder in PHP; it wouldn't be hard to restrict it to only the images in that folder. Just edit the code given to something like:
foreach ($iterator as $key => $value) {
    if (!is_dir($key)) {
        // pathinfo() handles filenames that contain extra dots
        $ext = strtolower(pathinfo($key, PATHINFO_EXTENSION));
        switch ($ext) {
            case "png":
            case "gif":
            case "jpg":
                $zip->addFile(realpath($key), $key) or die("ERROR: Could not add file: $key");
                break;
        }
    }
}
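Sending the archive with the correct headers is not shown above; here is a minimal sketch of what that last step could look like once $zip->close() has been called. The $zipPath variable and the images.zip filename are placeholders of my own, not part of the original code:
// Minimal sketch (assumed names): after $zip->close(), stream the finished
// archive to the browser. $zipPath is a hypothetical variable holding the
// path the archive was written to.
header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="images.zip"');
header('Content-Length: ' . filesize($zipPath));
readfile($zipPath);
exit;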

Have a look at the HTTrack software. It can download whole sites. Give it the address site.com/images/ and it will download everything in that directory (if directory access is not restricted by the owner).

- Thank you so much. You saved me, friend (y) – Smile Oct 08 '16 at 04:20
- Works very well with wget -r --no-parent --no-check-certificate https://xxxxx.png. I had to add "--no-check-certificate". – Quidam Sep 29 '20 at 18:03
If the site allows indexing, all you need to do is run wget -r --no-parent http://site.com/images/

If you want to see the images a web page is using: in Chrome, press F12 (or open Developer Tools from the menu), go to the Resources tab, and in the tree on the left, under Frames, open the Images folder; every image the page uses is listed there.
