This topic has been discussed before; however, I was unable to find anything applicable.
I have a large number of datasets (numbering in the thousands) which I'd like to display and then paginate with JavaScript.
That obviously requires me to fetch all the results, which just takes a few milliseconds and is barely noticeable.
However, upon constructing the actual page, the images, which are largely recurring, push the page load time past 10 seconds, which is decidedly too long.
My question is fairly simple: is there some way to tell the browser to download each picture just once and reuse it, instead of apparently downloading the same pictures over and over again?
Or do you know any other tricks to speed up page loading for many datasets with recurring pictures?
I'd like to avoid having to do the pagination via AJAX if possible.
My table structure looks like this:
<table>
<tr>
<td>Number</td>
<td>Image</td>
<td>Image</td>
<td>Text</td>
<td>Text</td>
<td>Text</td>
<td>Text</td>
<td>Image</td>
<td>Text</td>
<td>Image</td>
<td>Image</td>
</tr>
</table>
The last image is the same for all rows; for images one and two it's always one of two different images, and image four is one of 13 different images.
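Since each image column draws from such a small pool, one thing I'm trying is making sure every repeated image is referenced by exactly the same URL, so the browser's cache can kick in instead of re-fetching. A minimal sketch of how I build the rows (the filenames and field names here are made up, not my real ones):

```javascript
// Map each dataset field to one canonical image URL, so the browser
// only ever sees a handful of distinct URLs and can cache each file once.
// Varying query strings or paths for the same image would defeat this.
const ICONS = {
  statusA: "/img/status-a.png",
  statusB: "/img/status-b.png",
  shared:  "/img/shared.png",
};

// Build one table row as an HTML string; identical image types always
// reuse the exact same URL from the ICONS lookup table.
function renderRow(row) {
  return "<tr>" +
    "<td>" + row.number + "</td>" +
    '<td><img src="' + ICONS[row.type1] + '"></td>' +
    '<td><img src="' + ICONS[row.type2] + '"></td>' +
    "<td>" + row.text + "</td>" +
    '<td><img src="' + ICONS.shared + '"></td>' +
    "</tr>";
}
```

That way two rows with the same status produce byte-identical `src` attributes, and the second occurrence should come from cache.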
I hope you can help me out a little.
Edit: Thanks to the previous replies, I managed to get rid of the image loading times, which is by definition great; however, it didn't really reduce my overall loading time all that much.
Firebug tells me that most of the time is wasted on Waiting, which gives me an entirely new angle on the problem to explore.
Thanks for helping out so far, if I get stuck again, I'll open a new question. :)
On a final note, it appears the problem wasn't the server/client transfer at all, but rather that Firefox has trouble rendering the large block. So yeah... guess I'm out of luck.
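For what it's worth, one way I might still sidestep that rendering cost without AJAX is to keep the full dataset in a JavaScript array and only put the current page's rows into the DOM, so Firefox never has to lay out thousands of rows at once. A minimal sketch (the page size and data shape are assumptions):

```javascript
// Hypothetical page size; tune to whatever renders smoothly.
const PAGE_SIZE = 50;

// Slice out only the rows for the requested page; the rest of the
// dataset stays in memory and never touches the DOM.
function pageRows(allRows, page) {
  const start = page * PAGE_SIZE;
  return allRows.slice(start, start + PAGE_SIZE);
}

// In the browser, swapping pages would then look roughly like:
//   tbody.innerHTML = pageRows(data, currentPage).map(rowToHtml).join("");
// where rowToHtml is whatever builds one <tr> string.
```

Since all the data is already fetched up front, page switches stay purely client-side.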