3

My question is really about the page reload rather than retina detection. I'm using the code below in the head tag for retina displays:

<script type="text/javascript">
    if (document.cookie.indexOf('device_pixel_ratio') == -1
        && 'devicePixelRatio' in window
        && window.devicePixelRatio == 2) {

        // Cookie lifetime: 3600000000 ms is roughly 42 days.
        var date = new Date();
        date.setTime(date.getTime() + 3600000000);

        document.cookie = 'device_pixel_ratio=' + window.devicePixelRatio
            + '; expires=' + date.toUTCString() + '; path=/';

        // If cookies are not blocked, reload the page.
        if (document.cookie.indexOf('device_pixel_ratio') != -1) {
            window.location.reload(true);
        }
    }
</script>

It detects the visitor's screen type, stores a cookie, and reloads the page if it's a retina display. It works without issue, except for one problem.

The reload doesn't stop the page from rendering. Because of this, you see an unstyled, half-loaded page on the first visit before the refresh kicks in. Afterwards, everything is fine for the rest of the site, since the cookie is stored.

Obviously, PHP doesn't provide a way to detect the screen type; it must be done through JS. But JS doesn't seem to have a proper tool for reloading / redirecting before the page starts rendering, even when the code is placed in the head.

In short, I'm stuck in between. I hope someone has a suggestion for reloading without showing any content on the first load (it even displays the inline styling I put in the head). Or is displaying standard images on the first load and serving retina images for the rest of the browsing session the best option?

By the way, I tried both reload(true) and reload(false); neither helped.


UPDATE: Please see Marat Tanalin's answer below for possible workarounds and better approaches to retina / HD display images than storing a cookie and reloading.

After digging deeper into this, I realized that generating both retina and standard images may not always be the answer, due to caching methods*. In other words, displaying low-quality images to a visitor on the first visit and high-quality images after the refresh may not work, since the low-quality images (and scripts) are already cached.

I decided to go with a single 1.5x image size, with SVG upload support. Although it's not the best answer in every respect, it's a much better option than losing reliability.

*I'm a WordPress developer and I'm referring to WP Super Cache and similar caching plugins, but be aware that this may be an issue with other caching methods as well.

xrisk
Mertafor

2 Answers

4

Essentially, you want to stop the page from rendering until this script has run.

This can be done using an external JS file. When the browser encounters a synchronous external script in the head (one without async or defer), it pauses parsing and rendering until that file has been downloaded and executed, so nothing below it is displayed before your check runs.

Blocking page rendering is normally discouraged, since it hurts page load times, but for this purpose it is necessary. For more information, see https://developers.google.com/speed/docs/insights/BlockingJS
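A minimal sketch of that approach, assuming the cookie logic from the question is moved into a hypothetical external file (the filename and helper name are mine) that the head references with a plain script tag, e.g. `<script src="/js/retina-check.js"></script>` with no async/defer:

```javascript
// retina-check.js -- loaded synchronously from <head>, so the browser
// pauses rendering until this file has run.

// Pure helper: should we set the cookie and reload?
function needsRetinaReload(cookieString, pixelRatio) {
  return cookieString.indexOf('device_pixel_ratio') === -1
      && pixelRatio >= 2;
}

if (typeof window !== 'undefined'
    && needsRetinaReload(document.cookie, window.devicePixelRatio || 1)) {
  // Persist the cookie for ~42 days (3600000000 ms), site-wide.
  var expires = new Date();
  expires.setTime(expires.getTime() + 3600000000);
  document.cookie = 'device_pixel_ratio=' + window.devicePixelRatio
      + '; expires=' + expires.toUTCString() + '; path=/';

  // Reload only if the cookie actually stuck (cookies not blocked).
  if (document.cookie.indexOf('device_pixel_ratio') !== -1) {
    window.location.reload(true);
  }
}
```

Because the script tag sits above the stylesheets and content, the reload fires before anything is painted.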

xrisk
  • One question though. The page-loading-time issue only affects the client side, not Google or other bots, I suppose? Otherwise this solution (the reloading, I mean) may cause a mess on search engines, am I correct? – Mertafor Jun 20 '15 at 02:06
  • 1
    I do not think Google pays any attention to script blocks, but I am not sure. I think Google only looks at the raw HTML and does not parse any JavaScript, but again I am not sure. You may ask another question on SO if you want. – xrisk Jun 20 '15 at 02:08
  • 1
    @Mertafor It turns out that Google does execute Javascript but I don’t know how it handles redirects by Javascript. One _potential_ solution to avoid Googlebot making a mess is to add this script file in your robots.txt so that Google does not access it. – xrisk Jun 20 '15 at 02:15
  • 1
    http://googlewebmastercentral.blogspot.in/2014/05/understanding-web-pages-better.html – xrisk Jun 20 '15 at 02:15
  • 1
    Apparently I should look into that. Thanks again for following up on this. – Mertafor Jun 20 '15 at 02:17
2

Instead of a pure-JS solution, it's generally better practice to use CSS for styling, native HTML elements for content images, and unobtrusive JavaScript where needed. Unobtrusive JavaScript means the webpage still works even if JS is disabled.

In CSS, just use Media Queries.
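For reference, high-density displays can be targeted with a resolution media query; the same query string can also be evaluated from JavaScript via matchMedia. A sketch (the helper name is mine; the query string is the common pattern for ~2x displays):

```javascript
// Media query matching ~2x ("retina") displays. In a stylesheet the
// same condition would go in an @media block, e.g. to swap a
// background-image for its hi-res version.
var RETINA_QUERY =
  '(-webkit-min-device-pixel-ratio: 2), (min-resolution: 192dpi)';

// win: any window-like object exposing matchMedia.
function isRetina(win) {
  return !!(win.matchMedia && win.matchMedia(RETINA_QUERY).matches);
}
```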

For responsive images in HTML, there is the standard PICTURE element, but it's not yet widely supported, so a polyfill like picturefill may be used.

Other possible approaches for responsive images:

  • always using high-resolution images (regardless of actual pixel density; this somewhat wastes traffic and gives lower image quality to users of low-pixel-density monitors);

  • or using JavaScript to replace low-resolution images with their high-resolution versions on page load. To prevent loading the low-res versions before the high-res ones, the low-res images may be put into a NOSCRIPT element whose contents are then read dynamically; the image source is extracted and modified, and the NOSCRIPT element is replaced with the hi-res image via JS.
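A sketch of that NOSCRIPT technique, under my own assumptions (not the answer's): the low-res img sits inside `<noscript class="lazy-hires">`, and hi-res files follow a "-2x" filename postfix:

```javascript
// Derive the hi-res filename by inserting "-2x" before the extension:
// "photo.png" -> "photo-2x.png" (the naming convention is an assumption).
function hiResUrl(src) {
  return src.replace(/(\.[a-z]+)$/i, '-2x$1');
}

if (typeof document !== 'undefined') {
  // When JS is enabled, a NOSCRIPT element's contents are exposed as
  // raw text, so we can parse the <img> markup out of it ourselves.
  var blocks = document.querySelectorAll('noscript.lazy-hires');
  Array.prototype.forEach.call(blocks, function (ns) {
    var holder = document.createElement('div');
    holder.innerHTML = ns.textContent;
    var img = holder.querySelector('img');
    if (!img) { return; }
    if (window.devicePixelRatio >= 2) {
      img.src = hiResUrl(img.getAttribute('src'));
    }
    // Swap the NOSCRIPT element for the (possibly hi-res) image;
    // only one version of the image is ever downloaded.
    ns.parentNode.replaceChild(img, ns);
  });
}
```

With JS disabled, the browser renders the NOSCRIPT contents and the user still gets the low-res image.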

For photographic images, an acceptable tradeoff may be using 1.5x images. Be careful with line-art images (like logos or diagrams) though: they typically look best at 1:1 scale (when one image pixel exactly corresponds to one physical display pixel), and scaling them with blur may cause evident quality loss.

Also, consider using vector images in SVG format when possible: they scale without quality loss regardless of actual pixel density. Note, though, that on monitors with regular pixel density (e.g. popular Full-HD monitors, compared with 4K monitors), they may have noticeably lower visual quality than pixel-perfect 1:1 raster images.

Marat Tanalin
  • I haven't mentioned it in the question since it may be irrelevant, but I'm working on a WordPress theme, and that brings several limitations on brand-new techniques. The options you suggest look pretty smooth, and although I'm a bit skeptical about the NOSCRIPT approach, it's still a better option than a page reload. I'm working on a way using data-retina and src on the img tag (as I mentioned above) to overcome this issue for now. But it's great to know about the PICTURE and Picturefill techniques, thanks for that. – Mertafor Jun 20 '15 at 03:08
  • One thing bothers me the most. Instead of creating both low- and high-res images, how about creating only hi-res? It affects page loading time as you mentioned, but I suppose it also affects the Google bot, page speed, etc. Chrome's PageSpeed immediately takes 20-40 percent off the page score just because of larger images. On the other hand, I see some web services directly use full-size images without even caring. My question is simply: am I doing the right thing by using two versions (hi-res / low-res), or is using one hi-res version an acceptable method as well? – Mertafor Jun 20 '15 at 03:13
  • 1
    @Mertafor `NOSCRIPT` approach (probably invented by me :-) does work quite fine: if JS is disabled, the user will receive at least low-res image anyway, while if JS is enabled, user will receive high-res image without also downloading low-res one before the high-res one. – Marat Tanalin Jun 20 '15 at 16:16
  • 1
    @Mertafor As for what exact approach is best, it actually depends on each specific case. Sometimes it's reasonable just to use high-res images only (e.g. if a project budget is not enough to account for both low-res and high-res variants), sometimes it's reasonable to use both (so that users of low-res monitors would see sharper low-res images displayed at 1:1 scale, than slightly blurred high-res images scaled by browser with some quality loss). It's up to you to decide whether a specific approach is more reasonable than another in your case. – Marat Tanalin Jun 20 '15 at 16:21
  • I appreciate the detailed explanation @Marat. After spending my whole night on this, I realized there is no best way "yet". While Picturefill JS looks like too much effort for embedding an image, `Picture` is not widely supported yet. I was almost going to open a new question (maybe I should), but what do you think about 1.5x instead of 2x? I performed a couple of tests and I can't see a quality difference, and "one image for all" looks acceptable, although it's not the best solution. – Mertafor Jun 20 '15 at 16:34
  • By the way, kudos for `NOSCRIPT` and handling the situation like a boss. On the other hand, in my humble opinion, if someone has JS disabled in their browser, seeing low-quality images should be the least of their problems :) – Mertafor Jun 20 '15 at 16:37
  • Superb. I have to keep @rishav's answer accepted because of question / answer relation but I updated my question for your helpful answer. – Mertafor Jun 20 '15 at 18:04
  • It's OK. As for caching, different versions of the same image should, of course, have different URLs, so client-side caching is not an issue. For example, 2x images usually have `-2x` or `@2x` postfix in filename before the extension: `example.png` (1x version), `example-2x.png` (2x version). – Marat Tanalin Jun 20 '15 at 18:21
  • Absolutely. However, WordPress caching plugins somehow take a snapshot of the current page, save it as an HTML file, and leave no chance to update its parts afterwards (except for certain scripts / cookies). Although this maximizes page loading speed, it's terrible for certain elements. I just wanted to mention that in case it helps someone dealing with caching scripts / plugins. – Mertafor Jun 20 '15 at 18:38