My application runs on a phone and uses the Presentation API to render a scene on a secondary display.
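For context, the setup looks roughly like this (a simplified sketch, not my exact code; `SceneActivity` and `R.layout.scene` are placeholder names):

```java
import android.app.Activity;
import android.app.Presentation;
import android.content.Context;
import android.hardware.display.DisplayManager;
import android.os.Bundle;
import android.view.Display;

public class SceneActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        // Find the presentation-capable external displays and attach a
        // Presentation to the first one. The Presentation gets its Resources,
        // and therefore its density, from that display.
        DisplayManager dm = (DisplayManager) getSystemService(Context.DISPLAY_SERVICE);
        Display[] displays = dm.getDisplays(DisplayManager.DISPLAY_CATEGORY_PRESENTATION);
        if (displays.length > 0) {
            Presentation presentation = new Presentation(this, displays[0]);
            presentation.setContentView(R.layout.scene); // R.layout.scene is a placeholder
            presentation.show();
        }
    }
}
```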
The phone is a Samsung Galaxy S4, whose density is around 480 dpi.
I connect a 40" TV with an MHL/HDMI adapter. According to LogCat, these are the TV's metrics:
{"HDMI Screen": 1920 x 1080, 60.000004 fps, density 320, 320.0 x 320.0 dpi, touch EXTERNAL, rotation 0, type HDMI, FLAG_SECURE, FLAG_SUPPORTS_PROTECTED_BUFFERS, FLAG_PRESENTATION}
My problem is, the dpi's of that secondary display are obviously wrong! The density is much, much lower than 320dpi. I wonder where this value comes from, and if it is adjustable. I'd like to use a value in the mdpi land so the font is rendered smaller.