
I have a directory with almost 60 HD-quality images, each around 5–6 MB, and loading all of them in one web page takes too much time for both the server and the browser, so both hang. I read this post and this other one too, and since I'm using PHP 5.4.20 on my server I'd like to use DirectoryIterator and LimitIterator, but the examples left in those posts are not explicit enough for me, since I don't know how to move forward/backward in this case. Can anyone give me some sample code for paginating files in a directory?
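From what I understand, the pattern would be something like this (my untested sketch; I'm using FilesystemIterator, available since PHP 5.3, because it skips the `.` and `..` entries that DirectoryIterator would include; the function name and parameters are just placeholders):

```php
<?php
// Untested sketch: paginate files in a directory with
// FilesystemIterator + LimitIterator (PHP >= 5.3).
function getImagePage($directory, $page, $perPage) {
    // SKIP_DOTS leaves out "." and ".." automatically
    $files = new FilesystemIterator($directory, FilesystemIterator::SKIP_DOTS);
    $offset = ($page - 1) * $perPage;
    $paths = array();
    // LimitIterator yields at most $perPage entries starting at $offset
    foreach (new LimitIterator($files, $offset, $perPage) as $fileInfo) {
        $paths[] = $fileInfo->getPathname();
    }
    return $paths; // note: filesystem order, not sorted
}
```

Moving forward/backward would then just be calling it with `$page + 1` or `$page - 1`.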

UPDATE: added some code

Right now this is how I read files:

// List files (and subdirectories) under $directory; $recursive defaults
// to false so the call below without a second argument works.
function directoryToArray($directory, $recursive = false) {
    $array_items = array();
    if ($handle = opendir($directory)) {
        while (false !== ($file = readdir($handle))) {
            if ($file != "." && $file != "..") {
                if (is_dir($directory . "/" . $file) && $recursive) {
                    $array_items = array_merge($array_items, directoryToArray($directory . "/" . $file, $recursive));
                }
                // collapse accidental double slashes in the stored path
                $array_items[] = preg_replace("/\/\//si", "/", $directory . "/" . $file);
            }
        }
        closedir($handle);
    }
    return $array_items;
}

$images = directoryToArray("images/portfolio/");
for ($i = 0; $i < count($images); $i++) {
    $old_img_name = explode('/', $images[$i]);
    $new_img_name = $old_img_name[0] . "/" . $old_img_name[1] . '/large/' . $old_img_name[2];

    echo '<div class="span4 element">';
    echo '<div class="hover_img">';
    echo '<img src="' . $images[$i] . '" alt="" />';
    echo '<span class="portfolio_zoom"><a href="' . $new_img_name . '" data-rel="prettyPhoto[portfolio1]"></a></span>';
    echo '</div>';
    echo '</div>';
}
Reynier
  • Generally, you would want to generate a thumbnail for each image and load those, and only load the HD images when they are actually requested. – Aristona Oct 01 '13 at 00:45
  • @Aristona I did, but for some reason loading still takes too much time; see my edit in a few minutes – Reynier Oct 01 '13 at 00:49

1 Answer


Aristona's absolutely right. You should probably resize the images to an appropriate file format, quality, and size. At the very least, if you're trying to make some sort of gallery, you could use something like ImageMagick to make thumbnails for the gallery, where clicking on one takes you to the full-quality image.

ImageMagick is scriptable in a variety of languages, so you can batch-process your images and build thumbnails as an automated job. Alternatively, you can do it as a one-off from the command line, something like what's mentioned here: Batch resize images into new folder using ImageMagick
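Since the question uses PHP, one way to script that is via the PECL imagick extension (this is a rough sketch, not tested; it assumes the extension is installed and the directory names are illustrative):

```php
<?php
// Sketch: batch-generate thumbnails with the Imagick extension.
// Assumes ext-imagick is installed and a writable thumbs/ directory exists.
foreach (glob('images/portfolio/*.jpg') as $src) {
    $img = new Imagick($src);
    $img->thumbnailImage(300, 0); // 300px wide, height scaled to keep aspect
    $img->writeImage('images/portfolio/thumbs/' . basename($src));
    $img->clear(); // free memory before the next image
}
```

The gallery page would then point `<img src>` at the `thumbs/` copies and link through to the originals.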

Community
pacifist
    I did that before; notice in the script I posted how I load small images first and link to the large ones, but for some reason it's still too heavy, and the thumbnail sizes are between 40–80 KB or less – Reynier Oct 01 '13 at 00:54
  • hmm ok, I can't help with sample code since I don't write any PHP, but on the load times and thumbnail file sizes: if PNG is appropriate for what you're doing, you could try running optipng over all your files. It's a simple tool that tries different compression settings and recompresses with the best one. For me it reduced file sizes across a vast directory by 20% with no loss of quality. – pacifist Oct 09 '13 at 01:35