
I have implemented code to detect the X and Y coordinates of touch points. In the implementation, I use eventFilter() to detect touch events like this:

bool IpcSocket::eventFilter(QObject *, QEvent *event)
{
    qDebug() << "EVENT TYPE:" << event->type();

    if (event->type() == QEvent::TouchBegin)
    {
        // Extract all touch points carried by the touch event.
        QList<QTouchEvent::TouchPoint> touchBeginPoints = static_cast<QTouchEvent *>(event)->touchPoints();
        foreach (const QTouchEvent::TouchPoint &touchBeginPoint, touchBeginPoints)
        {
            // pos() returns qreal coordinates relative to the receiving widget;
            // they are truncated to unsigned int here for logging.
            unsigned int touchX = touchBeginPoint.pos().x();
            unsigned int touchY = touchBeginPoint.pos().y();
            qDebug() << "X :=" << touchX << ", Y :=" << touchY;
        }
    }
    return false; // do not consume the event; let it propagate normally
}
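For context, the filter is installed roughly as sketched below. This is a minimal sketch assuming an application-wide installEventFilter() on the QApplication instance; the main() and header name shown here are illustrative, and the exact installation in my code may differ:

// Minimal sketch: installing the event filter application-wide.
// "ipcsocket.h" and the main() below are assumed for illustration only.
#include <QApplication>
#include "ipcsocket.h"

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);

    IpcSocket filter;
    // Installing the filter on the QApplication object means it sees events
    // delivered to every object in the application, including popup widgets.
    app.installEventFilter(&filter);

    // ... create and show the main window here ...

    return app.exec();
}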

In most cases, whenever the screen is touched, the eventFilter() function above is called with the touch event reported as a TouchBegin QEvent. However, if a popup is open and the screen is touched anywhere, the function instead receives only a series of UpdateRequest QEvents. I want to get the X and Y touch coordinates even while the popup is open. Please help me with the following queries:

Why is the touch event not reported as a TouchBegin QEvent when a popup is open? And how can I receive touch events as TouchBegin QEvents while the popup is open?

