I am a bit worried by what I read here, which seems to imply that there is no reliable formula to compute px based on dp and screen density (and vice versa):
dp: Density-independent Pixels - An abstract unit that is based on the physical density of the screen. These units are relative to a 160 dpi (dots per inch) screen, on which 1dp is roughly equal to 1px. When running on a higher density screen, the number of pixels used to draw 1dp is scaled up by a factor appropriate for the screen's dpi. Likewise, when on a lower density screen, the number of pixels used for 1dp is scaled down. **The ratio of dp-to-pixel will change with the screen density, but not necessarily in direct proportion.** Using dp units (instead of px units) is a simple solution to making the view dimensions in your layout resize properly for different screen densities. In other words, it provides consistency for the real-world sizes of your UI elements across different devices.
I thought we could always use the following formula (explained here):
px = dp * (dpi / 160)
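
To make the question concrete, this is how I would implement that formula (a minimal Kotlin sketch; I am assuming the dpi in the formula means DisplayMetrics.densityDpi, i.e. the logical density bucket, rather than the physical xdpi/ydpi):

```kotlin
import android.content.Context

// px = dp * (dpi / 160), where 160 dpi is the baseline (mdpi) density.
fun dpToPx(context: Context, dp: Float): Float {
    val densityDpi = context.resources.displayMetrics.densityDpi
    return dp * (densityDpi / 160f)
}

// The inverse: dp = px / (dpi / 160).
fun pxToDp(context: Context, px: Float): Float {
    val densityDpi = context.resources.displayMetrics.densityDpi
    return px / (densityDpi / 160f)
}
```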
However, the testimony in this question seems to highlight a case where specifying sizes in dp does not guarantee a fixed perceived size across devices.
What is the real meaning of the sentence in bold?
Do we need to use mm units to be sure of keeping the same perceived size?
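
For reference, my understanding is that mm units are resolved against the physical xdpi reported by the device rather than the density bucket. Here is a sketch using TypedValue.applyDimension, assuming the device reports an accurate xdpi (which, apparently, is not always the case):

```kotlin
import android.content.Context
import android.util.TypedValue

// COMPLEX_UNIT_MM converts via the physical xdpi (25.4 mm per inch),
// so the result should correspond to the same physical length on any
// device whose reported xdpi matches its actual screen.
fun mmToPx(context: Context, mm: Float): Float =
    TypedValue.applyDimension(
        TypedValue.COMPLEX_UNIT_MM, mm, context.resources.displayMetrics
    )
```

If some devices report an inaccurate xdpi, though, even mm would not guarantee the same perceived size, which is why I would like to understand what the documentation is really claiming.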