
How to make widgets react to touches instead of mouse clicks



  • Dear Qt community,

    I am new to Qt and would like to develop a control device based on a Raspberry Pi and an LCD with a touchscreen.

    What I have done so far and what works:

    • Set up a cross-compilation environment with Qt 5.15.0 and Qt Creator 4.12.1;
    • Set up a Raspberry Pi 4B with Raspberry OS Lite (no Desktop);
    • Cross-compilation works; I can run and remote-debug a program on the Pi in fullscreen mode (EGLFS), no problem so far;
    • For the touchscreen, tslib (libts0) is used. ts_calibrate and ts_test work;
    • When I connect a keyboard and mouse to the Pi, I can use the program.

    However, a physical keyboard is not needed and I would like to use the touchscreen instead of a mouse (most of the widgets are QPushButtons).

    Touches are translated into mouse events. When I override the mousePressEvent method of the MainWindow, this method is called whenever I touch the screen. So far, so good. However, when I try to "click" a QPushButton by "touching" it, nothing happens.

    What I would like to achieve is that the widgets recognise touches as mouse clicks, i.e. that they behave as if the left mouse button had been clicked. How can I do that?

    I have also read that I would need to handle the QMouseEvents that are generated when I touch the screen in the widgets. Does this mean that I have to subclass every widget that I would like to use (QPushButton, ...) and override the mousePressEvent of each new widget class, or is there a more elegant way? I think that the solution must be either very easy or complicated :-)

    Thank you very much for your help,

    Ralf


  • Moderators

    @DL5EU said in How to make widgets react to touches instead of mouse clicks:

    Touches are translated into mouse events. When I override the mousePressEvent method of the MainWindow, this method is called whenever I touch the screen. So far, so good. However, when I try to "click" a QPushButton by "touching" it, nothing happens.

    If you still have that event handler in MainWindow, it is "eating" events that would normally go to your push buttons. Either remove that code or at least set accepted to false:

    event->setAccepted(false);
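For illustration, a minimal sketch of what such a pass-through handler could look like (assuming MainWindow derives from QMainWindow; this is not code from the original post):

```cpp
// Sketch: inspect mouse/touch presses in MainWindow without consuming them,
// so they can still be delivered to the widgets underneath.
#include <QMainWindow>
#include <QMouseEvent>
#include <QDebug>

class MainWindow : public QMainWindow
{
protected:
    void mousePressEvent(QMouseEvent *event) override
    {
        qDebug() << "press at" << event->pos();   // log for debugging only
        event->setAccepted(false);                // let the event propagate further
    }
};
```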
    


  • Hello @sierdzio,

    I had added this code to MainWindow only for testing, after discovering that nothing happened when I "touched" a QPushButton, and then deleted it again.

    I have now created a descendant of QPushButton and added code to show the reaction to mouse press events. Then I added such a button to my MainWindow. When I click the button with the mouse, it shows the events, but when I "touch" it, nothing happens (as before).
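A subclass along those lines might look like this (a sketch; the class name TouchButton is a placeholder, not from the original post):

```cpp
// Sketch: a QPushButton subclass that logs incoming mouse press events
// while keeping the button's normal click behaviour.
#include <QPushButton>
#include <QMouseEvent>
#include <QDebug>

class TouchButton : public QPushButton
{
public:
    using QPushButton::QPushButton;   // inherit the QPushButton constructors

protected:
    void mousePressEvent(QMouseEvent *event) override
    {
        qDebug() << "mousePressEvent at" << event->pos();
        QPushButton::mousePressEvent(event);   // forward so the click still works
    }
};
```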

    Is this normal behaviour or am I doing something wrong? I cannot imagine that I am the first one who is trying to do something like this.

    Could it be that Qt does not correctly recognise the touch screen coordinates even if ts_test and ts_calibrate work as expected?

    Kind regards,

    Ralf


  • Moderators

    I don't know, sorry. Usually, touch events are treated like mouse clicks and there is no further tinkering necessary. So yes, maybe Qt is not getting the touch events properly... I don't know.



  • Hi,

    here is some news. Perhaps somebody can help.

    As far as I can tell so far, the coordinates of the touch screen are not correctly detected or mapped.

    The problem is that the X and Y coordinates I get in the mouse events generated by touching the screen do not correspond to the coordinates in the events generated by the mouse. When I touch the screen at a position whose X and Y coordinates correspond to the position of the QPushButton, it reacts as expected. So the reason for the problem is clear.

    Any idea how to correct it? Please don't forget: the display is ok and ts_calibrate and ts_test work as expected (with rotation = 0). For information, on my Raspberry Pi 4 I use the OpenGL driver with fake KMS (EGLFS).

    Thank you very much for your help,

    Ralf



  • Hello,

    I managed to solve the problem. The touch screen was not correctly recognised.

    This combination solves the problem:

    QT_QPA_EGLFS_NO_LIBINPUT=1
    QT_QPA_EGLFS_TSLIB=1

    Probably the only combination that I had not tried :-) Apparently tslib has to be explicitly activated when libinput is deactivated.
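For example, the two variables can be set in the environment before starting the application. The device path and application name below are placeholders, not from the original post:

```shell
# Disable libinput in the eglfs platform plugin and use tslib instead.
export QT_QPA_EGLFS_NO_LIBINPUT=1
export QT_QPA_EGLFS_TSLIB=1
# Hypothetical: point tslib at the touchscreen device (adjust to your hardware).
export TSLIB_TSDEVICE=/dev/input/event0
# Then start the application (placeholder name):
# ./myapp -platform eglfs
```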

    Kind regards,

    Ralf


  • Moderators

    Great, thanks for sharing the solution!

