Multitouch in Linux
Hi, does Qt 4 (e.g. 4.8.4) support multitouch on Linux (X.Org)? I tried to use it in KDE and also ran some examples like fingerpaint, dials, etc., but was not able to get any multitouch behavior. I confirmed that multitouch actually works on my machine by running some test programs written in Java.
If Qt 4 actually supports multitouch, then what could be the problem? If not, what about Qt 5?
By the way, I'm using a laptop with a touch screen (Windows 8 style).
I guess that you tested it with Linux and KDE.
Was the Java app also running under Linux?
If yes, then the kernel event layer is working correctly.
You can also test on a terminal with:
@cat /dev/yourtouchdevice | hexdump@
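If the raw hexdump is hard to read, the kernel's @input_event@ records can also be decoded directly. Here is a minimal Python sketch (assuming the 64-bit Linux @struct input_event@ layout; the device path in the comment is only an example):

```python
# Sketch: decode Linux evdev input_event records.
# 64-bit layout: struct input_event { struct timeval time;
#                __u16 type; __u16 code; __s32 value; }
import struct

EVENT_FORMAT = "llHHi"                 # tv_sec, tv_usec, type, code, value
EVENT_SIZE = struct.calcsize(EVENT_FORMAT)

EV_ABS = 0x03
ABS_MT_SLOT = 0x2F                     # slot events imply multitouch protocol B

def decode_events(buf):
    """Yield (sec, usec, type, code, value) tuples from raw evdev bytes."""
    for off in range(0, len(buf) - EVENT_SIZE + 1, EVENT_SIZE):
        yield struct.unpack_from(EVENT_FORMAT, buf, off)

# Reading from the device itself (usually needs root; path is an example):
# with open("/dev/input/event5", "rb") as dev:
#     while True:
#         for ev in decode_events(dev.read(EVENT_SIZE)):
#             print(ev)
```

If nothing at all comes out of the device node while you touch the screen, the problem is below Qt entirely.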
Anyway, under Qt 4 it was a bit of a pain to get multitouch working.
I would suggest using Qt 5; it is much easier and, as far as I know, should work out of the box.
I have made some tests with a USB touchscreen.
One must be very precise about what "supports multitouch" means. Specify the base hardware platform, the input device (e.g. built-in touchpad), the OS, the window manager, the framework (Qt), the app (e.g. the Qt fingerpaint demo), the configuration of the drivers, and the gestures (e.g. a 5-finger swipe).
Loosely speaking, Qt 4 did support multitouch, but most Linux distributions contemporary with Qt 4 did not, since the fundamental machinery was still being developed (e.g. XInput2 and uTouch, the stack that Canonical developed and contributed back upstream into the Debian chain). And so on; there is a different conclusion for other platforms and versions. The multitouch situation was in flux.
For example, today, on a generic PC with Ubuntu 13.04, the Unity window manager, a Bamboo Pen and Touch, with Settings > Wacom Tablet > Tracking Mode=Touchpad, and the Firefox web browser as the app, pinch and two-finger scroll (which are multitouch) do seem to work. Yet with Tracking Mode=Trackpad(Absolute), multitouch has no effect on the browser.
Another consideration: does the window manager let touch events through to an app? A year ago, Unity was consuming all the touch events (except for five-finger gestures). In other words, multitouch is very different on a single-window phone than on a many-windowed desktop. There is a fundamental conflict: is a gesture for the window manager, or for the app in whose window the gesture was made? In other words, who is grabbing the touch events?
My simple test program in Python/PyQt/Qt 5.0.2 (on the platform described above) doesn't seem to get any multitouch events (only mouse events, even from the touch tablet). I haven't figured out yet how to fix it, or whether it can be fixed. It seems to me that machinery is still missing to let a user configure Unity so it doesn't grab or filter all the multitouch gestures. I could be wrong.
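A minimal version of such a test widget might look like this (a sketch assuming PyQt5 on Qt >= 5.0; the @TouchProbe@ name is made up):

```python
# Minimal PyQt5 sketch of a widget that opts in to raw touch events.
from PyQt5.QtCore import Qt, QEvent
from PyQt5.QtWidgets import QWidget

class TouchProbe(QWidget):
    def __init__(self):
        super().__init__()
        # Without this attribute, Qt only delivers mouse events
        # synthesized from the touch input.
        self.setAttribute(Qt.WA_AcceptTouchEvents)

    def event(self, ev):
        # Touch events arrive in event(), not in mousePressEvent().
        if ev.type() in (QEvent.TouchBegin, QEvent.TouchUpdate, QEvent.TouchEnd):
            for p in ev.touchPoints():
                print("touch point", p.id(), "at", p.pos())
            return True   # accept, so Qt keeps delivering updates
        return super().event(ev)
```

Run it under a QApplication and touch the window; if only mouse events arrive, the problem is below Qt (the window manager or the driver stack), not in the widget code.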
When I connected my embedded touch panel via USB to my Lubuntu notebook, everything worked fine. All events came through.
Maybe test a "simple" window manager like LXDE or Xfce.
Have you checked that the touch events arrive in "evdevtouch.cpp":https://qt.gitorious.org/qt/qt5base/source/43ee10518ab9870cc6f39cedc1febf9b7c57ebff:src/platformsupport/input/evdevtouch ?
What kind of multitouch protocol is your device using?
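(For context: the kernel exposes two multitouch protocols. A rough heuristic for telling them apart, as a pure-Python sketch with constants copied from linux/input-event-codes.h, is to check which event codes show up in the stream:)

```python
# Heuristic: classify the kernel multitouch protocol from observed
# (type, code) pairs. Constants from linux/input-event-codes.h.
EV_SYN, EV_ABS = 0x00, 0x03
SYN_MT_REPORT = 0x02    # protocol A: anonymous contacts, re-sent each frame
ABS_MT_SLOT = 0x2F      # protocol B: contacts tracked in stateful slots

def mt_protocol(events):
    """events: iterable of (type, code) pairs; returns 'A', 'B', or None."""
    codes = set(events)
    if (EV_ABS, ABS_MT_SLOT) in codes:
        return "B"
    if (EV_SYN, SYN_MT_REPORT) in codes:
        return "A"
    return None
```

Qt's evdev touch handler supports both, but knowing which one the device speaks helps when reading a hexdump of the event stream.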
Juergen: yes, a different window manager may behave differently. I haven't tried yet, I am reading documentation for the window managers. But you understand the difficulty: a gesture is many touch events, and many gestures can be in the state of "in progress" simultaneously, and many programs (the window manager and an app) may be consuming the same events and gestures, and a gesture may be canceled. It seems to me that a window manager and an app in a window would need to cooperate.
Yes, I have seen that I might need to pass a command line argument "-plugin evDevTouch" to my Python app. (I don't think that would fix any cooperation problem.)
If by "device" you mean the Bamboo Pen and Touch, I don't think it matters what the protocol is. I have already tested that Firefox receives and acts upon multitouch gestures (e.g. pinch) from the device (although in such a glacial, jerky way as to be annoying.)
Juergen: I suppose you mean that I should compile qevdevtouch.cpp (which is the plugin) in debug mode and make sure it is able to connect to the device and receives events?
Should I also be able to load the plugin using:
@loader = QPluginLoader("libqevdevtouchplugin.so")
result = loader.load()@
That returns False.
That has me wondering whether I need to pass "-plugin libqevdevtouchplugin" as the command-line argument instead.
If I build the plugin with debugging enabled, the console would tell me whether the plugin was attempting to load, for either method of forcing it to load, or even if it might be automatically loaded for some other reason. (It's still not clear to me that setAttribute(Qt.WA_AcceptTouchEvents) should not start the necessary machinery.)