
I would like to improve the loading speed of the Shadowbox popup images on this page.

Basically, all images opened by Shadowbox are linked to from this page:

<a href="images/illustration/garden1.jpg" class="garden"></a>
<a href="images/illustration/garden2.jpg" class="garden"></a>

etc etc.

I know how to preload images by listing them like this:

var images = ['image1.jpg', 'image2.jpg'];

$.fn.preload = function() {
    this.each(function(){
        $('<img/>')[0].src = this;
    });
}

$(document).ready(function(){
    $(images).preload();
});

Is there a way to pull all the href values into the preload array? If so, how might I exclude links to other documents? Or am I going about this all wrong?!

Thanks.

Caroline Elisa

2 Answers


The file below lists every image found in the folder passed to the listImages() call. Save it as all_images.php one level above the folder you want to crawl (e.g. in your site root, next to the images/ directory):

<?php
// List every image file in the given directory and echo the result as a
// JSON-style array of paths relative to the site root.
function listImages($dir){
    $exts = array('png', 'jpg', 'jpeg', 'gif');
    $paths = array();
    foreach(scandir($dir) as $ff){
        $ext = strtolower(pathinfo($ff, PATHINFO_EXTENSION));
        if(is_file($dir.'/'.$ff) && in_array($ext, $exts)){
            $paths[] = '"/'.$dir.'/'.$ff.'"';
        }
    }
    echo '[ '.implode(', ', $paths).' ]';
}
listImages('images');
// output: [ "/images/image1.png", "/images/image2.png" ] etc.
?>

Then, to preload these images, fetch the array that all_images.php outputs and pass each path it contains to the preload plugin:

<script src="jquery.js"></script>
<script type="text/javascript">
// Preload every image path in the wrapped array by creating detached
// <img> elements, which makes the browser download and cache them.
$.fn.preload = function() {
    this.each(function(){
        $('<img/>')[0].src = this;
    });
};

$(document).ready(function(){
    // Fetch the JSON array produced by all_images.php and preload each
    // path once the response arrives, so the list is never used before
    // the request has finished.
    $.getJSON('/all_images.php', function (images) {
        $(images).preload();
    });
});
</script>

This solution requires no extra 'include' files (unlike my previous answer).

Ryan Brodie
  • Thanks Ryan! I am just not sure if this can work for me, as I need to keep the classes on the image links so that they open as separate Shadowbox galleries. Hmmm... – Caroline Elisa Apr 16 '12 at 23:48
  • You still can? You're not displaying that page to the users, you're merely generating it so that it can be downloaded in the background into a hidden div, so when the user then clicks on an image and it opens in full as a Shadowbox gallery, that image has already been downloaded and therefore cached. If you're new to PHP I apologise for the lack of explanation in the answer; I'll alter it so it's compatible with your original preloading method, hopefully it'll make more sense that way. – Ryan Brodie Apr 17 '12 at 12:55
  • Oh, thanks again Ryan! I thought that your method would replace my list of hidden links at the bottom of the page. – Caroline Elisa Apr 17 '12 at 13:00
  • Thanks for elaborating Ryan! Does it matter that the php file outputs `'/images/'` (i.e. the parent directory) in the string? Also, is there a way to select the content of a subfolder, or at least multiple folders, or should I move all images into the same folder? – Caroline Elisa Apr 18 '12 at 19:54
  • Updated the answer to a much better solution; it now only echoes actual images and also includes those found in subfolders, like /illustrations/ etc. – Ryan Brodie Apr 20 '12 at 16:05
  • (i.e. you don't have to move any images). – Ryan Brodie Apr 20 '12 at 16:05
  • Sounds great Ryan! But I am getting an empty set of brackets at http://carolineelisa.com/images/all_images.php (Note I put the file in the images subdirectory) – Caroline Elisa Apr 20 '12 at 19:27
  • The file has to be outside of the intended folder, try specifying the /illustrations folder and you'll see the result. – Ryan Brodie Apr 21 '12 at 00:35
  • Is there an issue with it being placed in your home directory? Use a robots 'nofollow' meta tag or a robots.txt Disallow rule to stop Google crawling it. – Ryan Brodie Apr 21 '12 at 00:36
  • I tried it in the home directory, and still got `[]`. Also, the home directory has various subdirectories containing other websites, so I need to use it in the `/images/` subdirectory. – Caroline Elisa Apr 21 '12 at 14:24
  • It's because the script, if you looked closely how it worked, was looking for .png's, .jpg's and .jpeg's. I've modified it to now find .gif's as well. Have another go, should work now :) – Ryan Brodie May 15 '12 at 09:56
  • Great! It doesn't search subfolders, but I can dump all my images in one folder if need be. One last question... Does the output not need the full image path? as in 'images/image.jpg' rather than just 'image.jpg' in order to load? – Caroline Elisa May 15 '12 at 13:35
  • Yes sorry it does need to be the full path, I'll update the answer. – Ryan Brodie May 16 '12 at 00:43
  • Thanks Ryan! I am about to accept you answer. But would you mind removing the bit about subfolders, as `all_images.php` only outputs images in the `images/` directory and not its subfolders. I will drop all images in there for simplicity... – Caroline Elisa May 16 '12 at 01:40
  • It was scanning subfolders in my testing. Not to worry, have removed it from the answer. – Ryan Brodie May 17 '12 at 16:49
  • On a random note, I just realised you're not getting your static elements, like images, from a CDN? Using something like Amazon's S3 would likely increase the load times dramatically. – Ryan Brodie May 29 '12 at 01:02

Here's a pure jQuery solution using a wildcard attribute selector:

$(function(){
  $('a[href$=".jpg"]').each(function(index, elem){
    var href = $(elem).attr('href');
    $('<img/>')[0].src = href;
  });
});

Basically, the $= selector means 'ends with', so this finds any link whose href ends with .jpg and preloads it. If you are using the Shadowbox convention, you might want to change this selector to $('a[rel^="shadowbox"]') to be a bit more specific; ^= means 'starts with'.
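For example, here is a minimal sketch of that approach, assuming the gallery links keep the class="garden" attribute from the question (swap the class for whatever your Shadowbox links actually use). Because the selector only matches anchors that both carry that class and point at a .jpg file, links to other documents are excluded automatically:

$(function(){
  // Collect only the Shadowbox gallery links: anchors with the
  // "garden" class whose href ends in .jpg, then preload each one.
  $('a.garden[href$=".jpg"]').each(function(index, elem){
    $('<img/>')[0].src = $(elem).attr('href');
  });
});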

andyvanee