18

I am developing a scalable mobile/desktop friendly website using CSS rem units based on the average viewport size of the site.

I have been setting rem sizes as low as 0.001rem for certain padding and margins, which is working fine in my Firefox browser... but will it work in all modern browsers?

I am only questioning the ability to use 0.001rem units because the highest granularity I have seen prior to thousandths is hundredths of opacity... like opacity:0.25 for example.

How low can rem units go? Is 0.00000001rem an acceptable value in modern browsers?

BoltClock
darkAsPitch
  • https://www.w3.org/TR/css3-values/#numeric-types "CSS theoretically supports infinite precision and infinite ranges for all value types; however in reality implementations have finite capacity. UAs should support reasonably useful ranges and precisions." – Hatchet Feb 18 '16 at 21:42
  • [This website](http://cruft.io/posts/percentage-calculations-in-ie/) is about percentage rounding and pixel rendering; my guess is that browsers round em/rem units the same way they round percentages. The table at the bottom of the page and the test site might interest you. – ezattilabatyi Feb 18 '16 at 22:01
  • On a browser with default settings at default zoom, 0.00000001rem is equivalent to 0.00000016 CSS pixels, provided the computed root font-size is 16px. How would you begin to represent 0.00000016 CSS pixels on any display in existence today? – BoltClock Feb 19 '16 at 16:10

3 Answers

15

Ultimately, your granularity is 1 physical pixel (well, technically the sub-pixel in modern browsers, but I will ignore that for the purposes of this discussion). You can have different calculated pixel values based on em or rem, down to several digits of precision. But you then run into the real-world problem that, when rendering, this decimal precision is lost as the browser rounds off to fit the pixels available at whatever the device pixel density is relative to the reference pixel density (96ppi).

In essence, this reference pixel is 1/96th of an inch, so 1px in CSS terms basically means 1/96" at 96ppi. On screens with higher pixel densities (say, the 326ppi of many Apple "retina" screens), scaling takes place to convert CSS reference pixels to physical pixels. For the retina display mentioned, this scaling factor would be ~3.4 (326/96). So if you specified a CSS rule setting something to, say, 10px, the retina browser would display it across 34 physical pixels (assuming no other HTML, such as meta elements, changes display behavior). Because of this scaling, the physical size of the element is still 10/96", exactly the same physical size as if the element were rendered on a 96ppi screen.
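You can see this scaling factor from script with a rough sketch (not from the original answer; window.devicePixelRatio reports the ratio of physical pixels to CSS reference pixels at the current zoom, typically 2 or 3 on retina-class displays rather than exactly 326/96):

// Rough sketch: convert a CSS pixel length to approximate physical
// device pixels. window.devicePixelRatio is the physical-to-CSS-pixel
// ratio at the current zoom level.
function toPhysicalPixels(cssPx) {
  return cssPx * window.devicePixelRatio;
}

console.log(toPhysicalPixels(10)); // e.g. 20 on a 2x display, 30 on a 3x display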

Now let's add em and rem to the mix. Take an example of a 10px root element font size with a declaration on some other element of .001rem. That means you are trying to render this element at 0.01 reference pixels (10px × .001), which translates to 0.034 physical pixels on the retina display. You can clearly see that the rem value of 0.001 is at least one order of magnitude away from making a significant difference in physical display: even .01rem would translate to only 0.34 physical pixels, which rounds to the same displayed result as the "more precise" .001rem specification.
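Expressed as a quick calculation (a sketch of the arithmetic above; the 10px root size and ~3.4 scaling factor are the assumptions from this example):

// Assumptions taken from the example above.
var rootFontSizePx = 10;  // root element font-size in CSS px
var scalingFactor = 3.4;  // ~326ppi retina / 96ppi reference

function remToPhysicalPx(rem) {
  return rem * rootFontSizePx * scalingFactor;
}

console.log(remToPhysicalPx(0.001)); // ~0.034 physical px: rounds away entirely
console.log(remToPhysicalPx(0.01));  // ~0.34 physical px: still rounds away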

So I think you are defining rem-based CSS rules with far more precision than can actually be accommodated in real-world terms when physical pixels are being painted, unless you either have a very large root element size defined and/or a physical screen with a pixel density an order of magnitude greater than a retina display's. I am guessing the latter is not the case.

Just because the CSS can be calculated to 3 decimals worth of precision or whatever, that doesn't mean that physical rendering can occur at that same level of precision.

Mike Brant
4

A simple example on JSBin shows that a height of 1.001rem renders as 18.0156px, while a height of 1.0001rem renders as 18px (which would be the same as using just 1rem).

This means that you can get 3-decimal accuracy (at least in the desktop version of Chrome and with a regular font size).
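The JSBin itself isn't linked, but a minimal sketch of that kind of test could look like this (the 18px root font-size is my assumption, inferred from the 18.0156px result quoted above):

// Sketch: measure how a fractional rem height actually renders.
// Assumes an 18px root font-size (inferred from the quoted result).
document.documentElement.style.fontSize = "18px";

var el = document.createElement("div");
el.style.height = "1.001rem";
document.body.appendChild(el);

// getComputedStyle exposes the used pixel value, fractions included.
console.log(getComputedStyle(el).height); // e.g. "18.0156px" in Chrome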

I was also trying to write a JavaScript test to measure the accuracy, but element.offsetHeight is an integer, so it's useless for this purpose. Unless there is a different way to measure element dimensions in pixels (with decimal places).
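For what it's worth, Element.getBoundingClientRect() does return fractional dimensions (it is also what the answer below uses), so a comparison sketch might be:

var el = document.querySelector("#some-element"); // hypothetical test element

// offsetHeight is rounded to an integer...
console.log(el.offsetHeight);                   // e.g. 18

// ...while getBoundingClientRect() keeps the sub-pixel fraction.
console.log(el.getBoundingClientRect().height); // e.g. 18.015625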

EDIT 1: According to the CSS specification (see this and this), there seems to be no limit on the number of decimal places.

But in the end, I think you are limited by the device's pixel density. The physical pixels on the screen are indivisible and all computed dimensions are therefore rounded to the nearest integer.

alesc
2

"CSS theoretically supports infinite precision and infinite ranges for all value types; however in reality implementations have finite capacity. UAs should support reasonably useful ranges and precisions." w3.org

In practice, however, my conclusion is: in Chrome, you get effectively unlimited precision, until the fractional portion of the resulting pixel value drops below roughly 0.015.

EDIT: My original conclusion (that precision stops at the thousandths place) was found to be flawed once the font size was increased to a larger number.

Test it yourself (I used Element.getBoundingClientRect()):

var l = [
  "normal",
  "tens",
  "hundreds",
  "thousands",
  "ten-thousands",
  "hundred-thousands"
];
var o = document.getElementById("output");

// Print each test element's rendered height; getBoundingClientRect()
// reports sub-pixel fractions, unlike offsetHeight.
for (var i = 0; i < l.length; i++) {
  o.innerHTML += l[i] + ": " + document.getElementById(l[i]).getBoundingClientRect().height + "px<br>";
}

body,
html {
  font-size: 1000px;
}
#normal {
  height: 1rem;
}
#tens {
  height: 1.1rem;
}
#hundreds {
  height: 1.01rem;
}
#thousands {
  height: 1.001rem;
}
#ten-thousands {
  height: 1.0001rem;
}
#hundred-thousands {
  height: 1.00001rem;
}
#output {
  font-size: 16px;
}
<div id="output"></div>
<br>
<div id="normal">Hello</div>
<div id="tens">Hello</div>
<div id="hundreds">Hello</div>
<div id="thousands">Hello</div>
<div id="ten-thousands">Hello</div>
<div id="hundred-thousands">Hello</div>

My output was:

normal: 1000px
tens: 1100px
hundreds: 1010px
thousands: 1001px
ten-thousands: 1000.09375px
hundred-thousands: 1000px

Indicating that the hundred-thousandths place (0.00001) is beyond the precision that my browser supports.

After increasing the font size further, and playing around with it, I can't find a value that will yield a pixel value having a fractional portion < 0.015. (Try setting font-size: 1556px and 1557px.) Notably, 0.015 is just under 1/64 = 0.015625, and the 1000.09375px result above is exactly 1000 + 6/64 px, which suggests layout is being snapped to 1/64-pixel increments.
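If that 1/64-pixel hypothesis is right (an assumption on my part, not something documented here), a snap-to-1/64 model reproduces the measured output; flooring rather than rounding is what matches these particular numbers:

// Hypothetical model: snap a CSS pixel value down to an assumed
// 1/64px layout grid and compare with the measured results above.
function snapTo64ths(px) {
  return Math.floor(px * 64) / 64;
}

console.log(snapTo64ths(1.0001 * 1000));  // 1000.09375 (matches "ten-thousands")
console.log(snapTo64ths(1.00001 * 1000)); // 1000 (matches "hundred-thousands")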

Hatchet