Touch support in Windows 7
galwacco last edited by
I have a multitouch application that runs smoothly when compiled on Linux; it uses the TUIO library to interpret touches.
I'm currently trying to get it running under Windows 7, and I'm seeing the following weird behaviour:
- The application is capturing all the touch points and outputting the expected touch-point states: TouchPointPressed, TouchPointMoved and TouchPointReleased.
- I added qDebug() output that logs every touch point generated and reports the X and Y position where each touch happened, as well as its state, as stated above.
- While everything seems to be working, the weird thing is that the application does not use the touch points' locations to generate the press and release events at the touched position; instead, it ALWAYS uses the position of the mouse cursor.
Example: I have a virtual keyboard with which I'm trying to type the simple word 'word'. I press the corresponding 'W' key, but unless the mouse cursor is pointing at the letter 'W', the application ignores where my finger is pointing, even though the touch points are generating the correct mouse move events.
So unless the mouse cursor follows my fingers on the touch surface, it doesn't work.
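For reference, this is roughly the kind of handler involved: a minimal sketch (assuming Qt 5 widgets; the class name and logging are illustrative, not the poster's actual code) of a widget that opts in to raw touch events and accepts them, so that Qt does not fall back to delivering only synthesized mouse events at the cursor position:

```cpp
#include <QApplication>
#include <QWidget>
#include <QTouchEvent>
#include <QDebug>

// Hypothetical minimal widget for logging touch points.
class TouchLogger : public QWidget
{
public:
    TouchLogger()
    {
        // Without this attribute, Qt may deliver only synthesized
        // mouse events instead of QTouchEvents.
        setAttribute(Qt::WA_AcceptTouchEvents);
    }

protected:
    bool event(QEvent *e) override
    {
        switch (e->type()) {
        case QEvent::TouchBegin:
        case QEvent::TouchUpdate:
        case QEvent::TouchEnd: {
            auto *touch = static_cast<QTouchEvent *>(e);
            for (const QTouchEvent::TouchPoint &tp : touch->touchPoints()) {
                qDebug() << "id" << tp.id()
                         << "pos" << tp.pos()       // widget-local position
                         << "state" << tp.state();  // Pressed / Moved / Released
            }
            // Accepting TouchBegin claims the whole touch sequence;
            // otherwise Qt synthesizes mouse events from the first touch point.
            e->accept();
            return true;
        }
        default:
            return QWidget::event(e);
        }
    }
};
```

If TouchBegin is not accepted, Qt's mouse synthesis kicks in, which could produce symptoms like the ones described, where interaction appears tied to the cursor.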
Again, it works correctly on Linux; it's only under Windows 7 that I get these weird results.
PS: All the multitouch settings are correctly enabled in my Windows configuration.
Thanks in advance.