
I want to place a hardware keyboard over part of the screen of a specific Android smartphone (just like the Galaxy S7 edge Keyboard Cover). So I need the Android system to always use only the part of the screen that is not occupied by the keyboard, even during full-screen video playback etc. A service needs to be able to handle the touch events in the area covered by the keyboard. This does not need to work while booting.

Solutions may use stock Android (with root access) or a modified LineageOS.

I tend to believe that there is no solution for stock Android.

But the Android Open Source Project is too complex for me to find a place to start modifying. The window manager service, SurfaceFlinger, or something else?

My intuition is that modifying SurfaceFlinger would be the most general solution. Is modifying SurfaceFlinger even possible, or is it statically linked with the HAL and part of the binary blob? I expect SurfaceFlinger not to handle touch events in any way, right?

A vague idea that does not touch SurfaceFlinger is to modify the window manager to ask SurfaceFlinger to create a virtual display smaller than the native one, use it for everything instead of the native display, and blit it onto the native one (see the sketch below).
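
For illustration only: the app-level analogue of that idea is DisplayManager#createVirtualDisplay, which backs a smaller logical display with a Surface; a system-level variant would have to do something similar inside the window manager. A minimal sketch, assuming a 1440x2560 panel with the keyboard covering the bottom 600 px (the ImageReader sink is just a placeholder for the blit step):

    import android.content.Context;
    import android.graphics.PixelFormat;
    import android.hardware.display.DisplayManager;
    import android.hardware.display.VirtualDisplay;
    import android.media.ImageReader;

    public class ShrunkenDisplay {
        // Create a logical display smaller than the panel; frames land in a Surface.
        public static VirtualDisplay create(Context ctx) {
            DisplayManager dm =
                    (DisplayManager) ctx.getSystemService(Context.DISPLAY_SERVICE);
            ImageReader sink =
                    ImageReader.newInstance(1440, 1960, PixelFormat.RGBA_8888, 2);
            return dm.createVirtualDisplay(
                    "shrunken-display", // name
                    1440, 1960,         // reduced width/height (panel minus keyboard)
                    560,                // density in dpi
                    sink.getSurface(),  // where composited frames end up
                    0);                 // flags
        }
    }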

Another idea is to modify the rectangles of the windows in the window manager. But I don't know whether this is possible (especially for full-screen video playback).

The touch events would need to be remapped as well. By the way, does the window manager route the touch events?

Is there any component of Android that uses SurfaceFlinger directly, bypassing the window manager and my possible modifications? For example, do apps ask SurfaceFlinger for the screen size/resolution, or is this information dispatched by the window manager? (A typical query is sketched below.)
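
As far as I know, apps normally obtain the display size from the window manager's Display object rather than from SurfaceFlinger, e.g.:

    import android.app.Activity;
    import android.graphics.Point;
    import android.os.Bundle;
    import android.view.Display;

    public class SizeProbeActivity extends Activity {
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            // Both values come via the window manager, not SurfaceFlinger directly.
            Display d = getWindowManager().getDefaultDisplay();
            Point logical = new Point();
            d.getSize(logical);       // usable size, minus system decorations
            Point physical = new Point();
            d.getRealSize(physical);  // full panel resolution
            android.util.Log.d("SizeProbe", logical + " vs " + physical);
        }
    }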

Is there a simpler way?

Any hints?

Pentagolo

1 Answer


I found an even simpler solution on my own. Android's window manager maintains overscan settings for each display, which restrict the area of the display the system uses; they can be changed, for example, with the wm command in a root shell. This works like a charm.
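
A minimal sketch of how a service could apply this, assuming root access and an Android version that still supports wm overscan (it was removed in Android 11); reserving the bottom 600 px for the keyboard is just an example value:

    import java.io.IOException;

    public class OverscanSetter {
        // Shrink the usable screen by setting overscan insets
        // (left,top,right,bottom) through a root shell.
        public static void reserveBottom(int px)
                throws IOException, InterruptedException {
            String cmd = "wm overscan 0,0,0," + px;
            Process p = Runtime.getRuntime().exec(new String[] {"su", "-c", cmd});
            p.waitFor();
        }

        public static void main(String[] args) throws Exception {
            reserveBottom(600);  // "wm overscan reset" undoes it
        }
    }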

And yes, the window manager routes the input to the top window. But I do not know how and where exactly, because the WindowManagerService class is a huge mess that I do not want to fiddle with anymore.

The documentation even states that a touchscreen driver can expose a file in /proc (or /sys, I can't remember) describing where fixed soft buttons are located and how they map to Linux key codes, and the system will then use them automatically. So a custom kernel module that creates such an entry in the filesystem should eventually do the trick. But this is untested.
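
For reference, AOSP's touch-device documentation calls this a virtual key map file, located under /sys/board_properties/virtualkeys.<devicename>; each colon-separated entry is 0x01 followed by the Linux key code and the center X/Y, width, and height of the button area. A hypothetical map for two keys at the bottom of a 1440x2560 panel might look like:

    0x01:102:360:2260:720:600:0x01:158:1080:2260:720:600

(Key code 102 is KEY_HOME and 158 is KEY_BACK in the Linux input headers.)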

The button presses of the hardware keyboard are handled only by a dedicated service, so I will simply read the /dev/input/event device directly in that service.
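
A rough sketch of that read loop, assuming root access, a 64-bit struct input_event layout (16-byte timestamp, then type, code, value = 24 bytes), and a hypothetical device node /dev/input/event2 for the keyboard:

    import java.io.FileInputStream;
    import java.io.IOException;
    import java.nio.ByteBuffer;
    import java.nio.ByteOrder;

    public class RawKeyReader {
        private static final int EVENT_SIZE = 24;  // 16 on 32-bit ABIs
        private static final int EV_KEY = 0x01;

        public static void main(String[] args) throws IOException {
            // /dev/input/event2 is a placeholder; the real node is device-specific.
            try (FileInputStream in = new FileInputStream("/dev/input/event2")) {
                byte[] raw = new byte[EVENT_SIZE];
                while (in.read(raw) == EVENT_SIZE) {
                    ByteBuffer b = ByteBuffer.wrap(raw).order(ByteOrder.LITTLE_ENDIAN);
                    b.position(16);          // skip the timestamp
                    int type = b.getShort() & 0xFFFF;
                    int code = b.getShort() & 0xFFFF;
                    int value = b.getInt();  // 1 = press, 0 = release, 2 = repeat
                    if (type == EV_KEY) {
                        System.out.println("key " + code + " value " + value);
                    }
                }
            }
        }
    }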

Pentagolo