
I use a font-family stack like this:

body{
  font-family: icon, 'Merriweather Sans', ui-sans-serif, sans-serif;
}

with icon being an icon font that contains a few symbols, for example arrows, which can occur in copy text or headlines. The icon font is loaded like this (simplified example):

@font-face {
  font-family: 'icon';
  src: url(icon.woff2);
  unicode-range: U+1F872, ..., ...;
}

That way, whenever the 'bold arrow to the right' is used in text, it is rendered in the icon font. It works well and is foolproof.

I also use (simplified example again)

p.text{
  max-width: 70ch;
}

which ensures that text paragraphs don't have too many characters per line, for readability.

Firefox makes those text paragraphs a lot narrower than Chrome. After some experimenting, I found that Firefox uses the icon font to determine the width of the zero (and therefore the ch unit), while Chrome uses Merriweather Sans.
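
Put together, the reduced test case looks like this (the file name and the single codepoint are just the placeholders from the simplified examples above):

@font-face {
  font-family: 'icon';
  src: url(icon.woff2);
  unicode-range: U+1F872; /* only the arrow; the font has no zero glyph */
}

body{
  font-family: icon, 'Merriweather Sans', ui-sans-serif, sans-serif;
}

p.text{
  max-width: 70ch; /* Chrome: based on the '0' of Merriweather Sans;
                      Firefox: noticeably narrower, apparently based on the icon font */
}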

The icon font has no zero glyph in it, and its unicode-range would prevent it from displaying one anyway. So at first glance it seems correct that Chrome calculates ch based on the first font family in the stack that contains a zero and is allowed to render one.

On the other hand, one could argue that the actual presence of a zero glyph isn't necessary for calculating the abstract ch unit. Also, it should be perfectly fine to use one font only for displaying digits (via unicode-range) while the next font displays letters. Which font should be used for calculating the ch value then?
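
For example, a setup like this (the font names and files are hypothetical) should be perfectly valid, and then it isn't obvious which font the ch unit should come from:

@font-face {
  font-family: 'digits-font';  /* hypothetical font used only for digits */
  src: url(digits.woff2);
  unicode-range: U+0030-0039;  /* 0-9 */
}

@font-face {
  font-family: 'letters-font'; /* hypothetical font for everything else */
  src: url(letters.woff2);
}

body{
  font-family: 'digits-font', 'letters-font', sans-serif;
}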

Can anyone tell me whether one of the browsers gets this wrong and the other gets it right?

And does anyone have an idea for a good workaround?
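
One idea I can think of (not sure how robust it is) would be to list the icon font after the text fonts. Assuming Merriweather Sans and the system fonts have no glyph of their own at U+1F872, per-character fallback should still reach the icon font for the arrows, while ch would then be derived from Merriweather Sans in both browsers:

body{
  /* icon font moved behind the text fonts; characters the earlier
     fonts can't render (e.g. U+1F872) still fall back to it */
  font-family: 'Merriweather Sans', ui-sans-serif, icon, sans-serif;
}

But this only works as long as none of the earlier fonts ship their own arrow glyphs, which I can't guarantee across platforms.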
