We have a Qt GUI application that is started on boot through a registry Run/RunOnce key.
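(For context, the autostart entry is an ordinary Run-key value. The sketch below only illustrates how such an entry can be created from Qt code; the value name "ExampleApp" is a placeholder and this is not our actual registration code.)

#include <QCoreApplication>
#include <QDir>
#include <QSettings>

// Sketch only: register the executable under the current user's Run key.
// QSettings::NativeFormat maps to the Windows registry.
QSettings run("HKEY_CURRENT_USER\\Software\\Microsoft\\Windows\\CurrentVersion\\Run",
              QSettings::NativeFormat);
run.setValue("ExampleApp",
             QDir::toNativeSeparators(QCoreApplication::applicationFilePath()));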
When started on boot, the Qt application receives the position from the touch screen, but not the click event itself (the emulated mouse click on a button).
When started manually, the Qt application receives both position and click from the touch screen.
With a regular mouse, both cases work fine, and other applications started on boot do accept clicks via touch.
We do not handle any TouchEvents ourselves; the touch screen input is simply interpreted as mouse events. My guess is that the application starts too quickly, before the touch screen driver is completely loaded, and that this somehow breaks the "click" event. But I don't know how to validate this, or how to track down the problem at all.
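One idea I have for testing the timing theory (just a sketch, not something that is in the application yet): log the Win32 digitizer state right at startup and compare the output between a boot-time start and a manual start.

#include <windows.h>
#include <QDebug>

// Sketch: check whether Windows reports the touch digitizer as ready.
// If NID_READY is missing when the app is started on boot but present when
// it is started manually, that would support the "driver not loaded yet" theory.
static void logDigitizerState()
{
    const int flags = GetSystemMetrics(SM_DIGITIZER);
    qDebug() << "SM_DIGITIZER flags:" << flags
             << "NID_READY:" << bool(flags & NID_READY)
             << "max touch contacts:" << GetSystemMetrics(SM_MAXIMUMTOUCHES);
}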
We run a number of other systems where this problem does not occur; it is just this one Windows PC that gives us trouble, currently one out of roughly 100 systems. Replacing the PC would be an option to get rid of the problem, but I still want to find the root cause to make sure this does not happen again.
I wrote a small additional application that installs an eventFilter on qApp to see which events actually arrive. When run on boot, I only receive mouse move events; when started manually later, I receive move, press and release events.
I wonder whether anyone else has encountered similar click/touch problems with Qt applications and touch screens.
bool MouseFilter::eventFilter(QObject *o, QEvent *e)
{
    // Emit a dedicated signal for each mouse event type so the test
    // application can record which events actually arrive.
    if (e->type() == QEvent::MouseMove) {
        emit signalMouseMove();
        return true;   // returning true consumes the event here
    } else if (e->type() == QEvent::MouseButtonRelease) {
        emit signalMouseRelease();
        return true;
    } else if (e->type() == QEvent::MouseButtonPress) {
        emit signalMousePress();
        return true;
    } else if (e->type() == QEvent::MouseButtonDblClick) {
        emit signalMouseDoubleClick();
        return true;
    }
    return QObject::eventFilter(o, e);
}

// Installed application-wide:
MouseFilter *mf = new MouseFilter();
qApp->installEventFilter(mf);
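The signals can then be hooked up to any kind of visible output so that a boot-time run can be compared with a manual run; a minimal sketch (the plain qDebug logging here is only an illustration, not the exact code of the test tool):

#include <QDebug>

// Sketch: log each arriving event type so the output of a boot-time run
// can be compared with the output of a manual run.
QObject::connect(mf, &MouseFilter::signalMouseMove,        [] { qDebug() << "move"; });
QObject::connect(mf, &MouseFilter::signalMousePress,       [] { qDebug() << "press"; });
QObject::connect(mf, &MouseFilter::signalMouseRelease,     [] { qDebug() << "release"; });
QObject::connect(mf, &MouseFilter::signalMouseDoubleClick, [] { qDebug() << "double click"; });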