
I'm trying to embed multitouch events into my wxPython GUI, but I'm a bit lost as to the best approach.

Currently, I have a TUIO server which transmits the multitouch events to be captured. I then use the pytuio library to receive the multitouch events in a separate thread for my GUI. My GUI is composed of a wxFrame with multiple matplotlib panels and a single OpenGL panel.

The problem is that I have had to write code by hand to determine how many fingers are being used, their locations, and the touch type. I then send a custom event which my GUI can receive.

This works fine for the matplotlib panels (although I have to apply a very small constant offset to the reported finger locations), but for the OpenGL panel the finger locations are incorrect. This is a problem because the offset of the touch locations in the OpenGL panel is not even constant: it seems to vary depending on where on the panel the touch event occurs, so I cannot compensate for it.
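One possible explanation for a position-dependent offset is a coordinate-space mismatch: TUIO reports normalized coordinates with the origin at the top-left, while an OpenGL viewport puts its origin at the bottom-left, so a missing y-flip produces an error that grows with distance from one edge. A sketch of the conversion, under that assumption (the helper name and `flip_y` flag are illustrative, not part of pytuio):

```python
def tuio_to_pixels(nx, ny, width, height, flip_y=False):
    # TUIO coordinates are normalized to [0, 1] with the origin
    # at the top-left; OpenGL viewports use a bottom-left origin,
    # so the GL panel may need the y axis flipped.
    px = int(round(nx * width))
    if flip_y:
        py = int(round((1.0 - ny) * height))
    else:
        py = int(round(ny * height))
    return px, py
```

If the touch surface maps to the whole screen, the resulting screen pixel can then be converted to panel-local coordinates with `panel.ScreenToClient(wx.Point(px, py))` before being compared against what the panel actually reports.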

I feel like there must be a more comprehensive multitouch library that does all the hard work of determining the number of fingers and the touch type (tap, double tap, drag, release, etc.), and which might also overcome my issue with the OpenGL panel. I have looked, but I have not seen a library which can distinguish the touch type; they all just seem to provide the number of fingers and their locations.

James Elder

1 Answer


The only comprehensive GUI library supporting:

  • Python
  • More than one OS
  • Multitouch

is Kivy. I was able to cobble together something that works for wxPython on Windows 7 and higher (by extracting the relevant part of Kivy's handling of WM_TOUCH events), so in principle it can be done. But none of this would solve your specific problem.
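For reference, the Windows side of that extraction starts by registering the window for raw touch messages; a minimal ctypes sketch, assuming a wx window (the function name is illustrative, and a full port, like Kivy's provider, would also decode the incoming touch packets with `GetTouchInputInfo`):

```python
import ctypes

WM_TOUCH = 0x0240  # message id delivered once the window is registered

def enable_wm_touch(wx_window):
    # Windows 7+ only: ask the OS to send WM_TOUCH messages instead
    # of synthesized mouse events for this window's HWND.
    user32 = ctypes.windll.user32
    if not user32.RegisterTouchWindow(wx_window.GetHandle(), 0):
        raise ctypes.WinError()
```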

nepix32