I was able to ask Stephen Webb, who currently leads the Unity Technical Team, about touch input. I contacted him because he was previously the lead developer on the uTouch-geis package, which was split into three packages: Frame, Geis, and Grail. These libraries form the core of Ubuntu's touch screen input stack, and he is also heavily involved in other multi-touch projects. I asked him about learning more about touch device input and about my idea. Here is what he had to say:
I want to have GRUB2 register a single tap event, no more (I don't think multi-touch is needed). The purpose would be to select an entry from the boot menu.
You're going to have a challenge getting touch to work from boot
loader code.
Many (but not all) touch input devices have device drivers that work
to the Microsoft HID protocol. You'd need to replicate that driver
technology into GRUB2, and then figure out how to map that into
something GRUB2 would understand as input. Sounds like work.
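To get a feel for what that driver work involves, here is a rough sketch of decoding a HID digitizer report into tap coordinates. The report layout below is entirely hypothetical: real touch screens describe their format in a HID report descriptor (which the kernel's hid-multitouch driver parses at runtime), so treat this as an illustration of the kind of raw data a boot loader would have to interpret, not as code for any particular device.

```c
#include <stdint.h>
#include <stdio.h>

/*
 * Hypothetical HID digitizer report layout, for illustration only.
 * A real boot-loader driver would first have to parse the device's
 * HID report descriptor (or hard-code known devices) to learn where
 * these fields actually live.
 */
struct hypothetical_touch_report {
    uint8_t  report_id;
    uint8_t  tip_switch;   /* 1 while a finger is on the screen */
    uint16_t x;            /* logical X, 0..logical_max */
    uint16_t y;            /* logical Y, 0..logical_max */
} __attribute__((packed));

/* Scale a raw report to screen coordinates; returns 1 on a touch. */
static int decode_tap(const struct hypothetical_touch_report *r,
                      unsigned screen_w, unsigned screen_h,
                      unsigned logical_max,
                      unsigned *out_x, unsigned *out_y)
{
    if (!r->tip_switch)
        return 0;                       /* finger lifted, no tap */
    *out_x = (unsigned)r->x * screen_w / (logical_max + 1);
    *out_y = (unsigned)r->y * screen_h / (logical_max + 1);
    return 1;
}

int main(void)
{
    /* Pretend this report just arrived over a USB interrupt transfer. */
    struct hypothetical_touch_report r = { 1, 1, 16384, 8192 };
    unsigned x, y;

    if (decode_tap(&r, 1024, 768, 32767, &x, &y))
        printf("tap at %u,%u\n", x, y);
    return 0;
}
```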
Are there any resources you can point me to that would help me understand how touch devices work?
That's complicated. There's a lot of different technologies,
connected to the host in different ways, talking different protocols,
delivering different data.
There's some good documentation here.
Are there any single-tap or multi-touch libraries you can suggest? Do you have any other suggestions on how to move forward?
Well, in the Linux stack there's the firmware in the touch processor,
which feeds data to the device driver in the kernel, which translates
into the evdev protocol, which is read by the x.org driver and
converted into the XI2 protocol, or if there's nothing looking for XI2
touch events, then converts that into an XI mouse event, and an X11
event gets sent to the client. All you have at the boot loader level
is direct input from the touch processor.
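To see what the evdev stage of that stack actually delivers (this sits above the firmware, so it is not something GRUB2 could use, but it helps in understanding the data), here is a minimal sketch that reads single-touch events from an event device on a running Linux system. The device path is an assumption; the real node varies per machine and can be found in /proc/bus/input/devices, and opening it typically requires root.

```c
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>
#include <linux/input.h>

/*
 * Minimal sketch: watch an evdev node and report the raw device
 * coordinates each time a finger first touches the screen.
 * /dev/input/event2 is only an example path; check
 * /proc/bus/input/devices for the touch screen on your machine.
 */
int main(void)
{
    int fd = open("/dev/input/event2", O_RDONLY);
    if (fd < 0) {
        perror("open");
        return 1;
    }

    struct input_event ev;
    int x = -1, y = -1;
    int touching = 0, was_touching = 0;

    while (read(fd, &ev, sizeof ev) == sizeof ev) {
        if (ev.type == EV_ABS && ev.code == ABS_X)
            x = ev.value;
        else if (ev.type == EV_ABS && ev.code == ABS_Y)
            y = ev.value;
        else if (ev.type == EV_KEY && ev.code == BTN_TOUCH)
            touching = ev.value;
        else if (ev.type == EV_SYN && ev.code == SYN_REPORT) {
            /* Report only the down edge: that is the "single tap". */
            if (touching && !was_touching)
                printf("tap down at raw %d,%d\n", x, y);
            was_touching = touching;
        }
    }

    close(fd);
    return 0;
}
```

The coordinates printed here are raw device units; it is the X.org driver further up the stack that scales them to the screen.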
I asked a variant of this question over at Ask Ubuntu and posted this answer there as well.