Touch gesture recognition with a custom QQuickPaintedItem?
-
Hi Guys,
I have a custom QQuickPaintedItem and I want it to be able to handle touch events;
the regular zoom/pan/rotate/context gestures (like you would use on a map). So far I have found that I can use the following calls in my constructor:
setAcceptedMouseButtons(Qt::AllButtons);
setAcceptHoverEvents(true);
to get the following function calls:
virtual void mousePressEvent(QMouseEvent *event) { qDebug() << "mousePressEvent(QMouseEvent *event)" << event; }
virtual void mouseMoveEvent(QMouseEvent *event) { qDebug() << "mouseMoveEvent(QMouseEvent *event)" << event; }
virtual void mouseReleaseEvent(QMouseEvent *event) { qDebug() << "mouseReleaseEvent(QMouseEvent *event)" << event; }
virtual void mouseDoubleClickEvent(QMouseEvent *event) { qDebug() << "mouseDoubleClickEvent(QMouseEvent *event)" << event; }
virtual void wheelEvent(QWheelEvent *event) { qDebug() << "wheelEvent(QWheelEvent *event)" << event; }
virtual void touchEvent(QTouchEvent *event) { qDebug() << "touchEvent(QTouchEvent *event)" << event; }
virtual void hoverEnterEvent(QHoverEvent *event) { qDebug() << "hoverEnterEvent(QHoverEvent *event)" << event; }
virtual void hoverMoveEvent(QHoverEvent *event) { qDebug() << "hoverMoveEvent(QHoverEvent *event)" << event; }
virtual void hoverLeaveEvent(QHoverEvent *event) { qDebug() << "hoverLeaveEvent(QHoverEvent *event)" << event; }
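For context, a minimal compilable skeleton along those lines could look like the sketch below; MyPaintedItem and the module name in the commented qmlRegisterType call are just placeholders for illustration:

#include <QQuickPaintedItem>
#include <QPainter>
#include <QTouchEvent>
#include <QDebug>

class MyPaintedItem : public QQuickPaintedItem
{
    Q_OBJECT
public:
    explicit MyPaintedItem(QQuickItem *parent = nullptr) : QQuickPaintedItem(parent)
    {
        // same calls as above to opt in to mouse and hover events
        setAcceptedMouseButtons(Qt::AllButtons);
        setAcceptHoverEvents(true);
    }

    void paint(QPainter *painter) override
    {
        Q_UNUSED(painter); // custom drawing goes here
    }

protected:
    void touchEvent(QTouchEvent *event) override
    {
        qDebug() << "touchEvent(QTouchEvent *event)" << event;
    }
};

// exposed to QML with e.g. (needs #include <QtQml>):
// qmlRegisterType<MyPaintedItem>("MyModule", 1, 0, "MyPaintedItem");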
but please tell me I've missed something?
I would expect that recognizing the default (1 & 2 finger) touch patterns would be available somewhere, but I haven't found it yet;
is there a general recognizer somewhere?
-
@RaDq1 said in Touch gesture recognition with a custom QQuickPaintedItem?:
touchEvent(QTouchEvent *event)
Do you get this event? This is the event you get for all kinds of supported touch events.
-
Yes, I do get the event, but that event is pretty basic: it contains all the information, but finding out whether it is a swipe or a pinch gesture is not something I expected to do myself. At the bottom of this page: http://doc.qt.io/qt-5/gestures-overview.html it says that there is no gesture recognition in Qt Quick, so does everyone really write their own recognition code? Isn't there some default code for that available somewhere?
If I look at the code of QQuickPinchArea, QQuickWindowPrivate etc. it looks like every Qt Quick component is rolling its own recognition code, whereas I just want to use something like the QGestureManager (which seems to be available only for widgets).
Correct me if I'm wrong, but I understood Qt Quick to be primarily for mobile development (where touch is extremely important) whereas Qt Widgets is more for desktop environments (where touch is less important because keyboard/mouse are more common). How does this fit with the current state, where touch support in Qt Quick is minimal (everyone rolls their own?) while there is extensive touch support for Qt Widgets? Honestly I don't understand it, and I hope that I'm missing some obvious way to recognize the standard gestures in Qt Quick as well (for widgets they are in src\widgets\kernel\qstandardgestures).
Basically, what is the best way in Qt Quick to translate touch events like:
QTouchEvent(TouchBegin device: "" states: TouchPointPressed, 1 points: (TouchPoint(0 (585.5,370 3x2) TouchPointPressed press 1 vel QVector2D(0, 0) start (587,371) last (587,371) delta (0,0)))
QTouchEvent(TouchUpdate device: "" states: TouchPointPressed|TouchPointStationary, 2 points: (TouchPoint(0 (585.5,370 3x2) TouchPointStationary press 1 vel QVector2D(0, 0) start (587,371) last (587,371) delta (0,0), TouchPoint(1 (313.5,707.5 1x1) TouchPointPressed press 1 vel QVector2D(0, 0) start (314,708) last (314,708) delta (0,0)))
QTouchEvent(TouchUpdate device: "" states: TouchPointMoved, 2 points: (TouchPoint(0 (581,382 2x2) TouchPointMoved press 1 vel QVector2D(0, 0) start (587,371) last (587,371) delta (-5,12), TouchPoint(1 (318.5,697.5 3x3) TouchPointMoved press 1 vel QVector2D(0, 0) start (314,708) last (314,708) delta (6,-9)))
QTouchEvent(TouchUpdate device: "" states: TouchPointMoved, 2 points: (TouchPoint(0 (578,386 2x2) TouchPointMoved press 1 vel QVector2D(0, 0) start (587,371) last (582,383) delta (-3,4), TouchPoint(1 (321.5,695 3x4) TouchPointMoved press 1 vel QVector2D(0, 0) start (314,708) last (320,699) delta (3,-2)))
...
QTouchEvent(TouchUpdate device: "" states: TouchPointMoved, 2 points: (TouchPoint(0 (496,478 2x2) TouchPointMoved press 1 vel QVector2D(0, 0) start (587,371) last (502,473) delta (-5,6), TouchPoint(1 (357.5,658.5 3x5) TouchPointMoved press 1 vel QVector2D(0, 0) start (314,708) last (357,662) delta (2,-1)))
QTouchEvent(TouchUpdate device: "" states: TouchPointMoved, 2 points: (TouchPoint(0 (494,480.5 2x3) TouchPointMoved press 1 vel QVector2D(0, 0) start (587,371) last (497,479) delta (-2,3), TouchPoint(1 (358.5,658.5 3x3) TouchPointMoved press 1 vel QVector2D(0, 0) start (314,708) last (359,661) delta (1,-1)))
QTouchEvent(TouchUpdate device: "" states: TouchPointMoved|TouchPointStationary, 2 points: (TouchPoint(0 (493,483 2x2) TouchPointMoved press 1 vel QVector2D(0, 0) start (587,371) last (495,482) delta (-1,2), TouchPoint(1 (358.5,658.5 3x3) TouchPointStationary press 1 vel QVector2D(0, 0) start (314,708) last (360,660) delta (0,0)))
QTouchEvent(TouchUpdate device: "" states: TouchPointStationary|TouchPointReleased, 2 points: (TouchPoint(0 (493,483.5 2x1) TouchPointStationary press 1 vel QVector2D(0, 0) start (587,371) last (494,484) delta (0,0), TouchPoint(1 (360,660 0x0) TouchPointReleased press 1 vel QVector2D(0, 0) start (314,708) last (360,660) delta (0,0)))
QTouchEvent(TouchEnd device: "" states: TouchPointReleased, 1 points: (TouchPoint(0 (494,484 0x0) TouchPointReleased press 1 vel QVector2D(0, 0) start (587,371) last (494,484) delta (0,0)))
to zoom out (factor x)?
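Just to make the question concrete: hand-rolling the recognition over those events would look roughly like the sketch below, which is exactly the kind of code I was hoping to avoid writing myself (zoomBy() and the m_* members are made-up placeholders):

#include <QLineF>

void MyPaintedItem::touchEvent(QTouchEvent *event)
{
    const QList<QTouchEvent::TouchPoint> points = event->touchPoints();

    if (points.count() == 2) {
        // distance between the two fingers right now
        const qreal distance = QLineF(points.at(0).pos(), points.at(1).pos()).length();

        if (!m_pinchActive) {
            // second finger just went down: remember the reference distance
            m_pinchActive = true;
            m_startDistance = distance;
        } else if (m_startDistance > 0) {
            // factor > 1 means the fingers moved apart (zoom in),
            // factor < 1 means they moved together (zoom out)
            zoomBy(distance / m_startDistance);
        }
    } else {
        m_pinchActive = false; // anything other than two fingers ends the pinch
    }

    if (event->type() == QEvent::TouchEnd)
        m_pinchActive = false;

    event->accept();
}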
-
@RaDq1 I thought you wanted to use widgets?
QtQuick:
"Qt Quick does not have a generic global gesture recognizer; rather, individual components can respond to touch events in their own ways. For example the PinchArea handles two-finger gestures, Flickable is for flicking content with a single finger, and MultiPointTouchArea can handle an arbitrary number of touch points and allow the application developer to write custom gesture recognition code."
So it actually supports gestures, but without a generic implementation. For widgets you can easily recognize the gesture (http://doc.qt.io/qt-5/gestures-overview.html):
bool ImageWidget::gestureEvent(QGestureEvent *event)
{
    qCDebug(lcExample) << "gestureEvent():" << event;
    if (QGesture *swipe = event->gesture(Qt::SwipeGesture))
        swipeTriggered(static_cast<QSwipeGesture *>(swipe));
    else if (QGesture *pan = event->gesture(Qt::PanGesture))
        panTriggered(static_cast<QPanGesture *>(pan));
    if (QGesture *pinch = event->gesture(Qt::PinchGesture))
        pinchTriggered(static_cast<QPinchGesture *>(pinch));
    return true;
}
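Note that gestureEvent() is only reached if the widget has grabbed the gestures and forwards QEvent::Gesture from its generic event() handler, roughly like this (following the same overview page; ImageWidget is the class from that example):

#include <QGestureEvent>

// in the widget's constructor: subscribe to the gestures you care about
//   grabGesture(Qt::PanGesture);
//   grabGesture(Qt::PinchGesture);
//   grabGesture(Qt::SwipeGesture);

bool ImageWidget::event(QEvent *event)
{
    // gesture events arrive through the generic event() handler
    if (event->type() == QEvent::Gesture)
        return gestureEvent(static_cast<QGestureEvent *>(event));
    return QWidget::event(event);
}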
-
Is there any open-source implementation of such gesture recognition code available somewhere (apparently outside the Qt framework)?
Any examples of how others do this?
PS: QQuickPaintedItem is part of Qt Quick 2.0; that's what I would like to use for my other GUI.
Or is there a way to combine QtQuick and QtWidgets to make a widget inside a Qt Quick app that handles my custom drawing?
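QQuickWidget seems to go only the other way around: it embeds a Qt Quick scene inside a widget-based application (module QT += quickwidgets), something like the sketch below, where the QML path is just a placeholder:

#include <QApplication>
#include <QQuickWidget>
#include <QUrl>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);

    // hosts a Qt Quick scene inside an ordinary QWidget so it can sit
    // next to regular widgets; "qrc:/main.qml" is only a placeholder
    QQuickWidget view;
    view.setResizeMode(QQuickWidget::SizeRootObjectToView);
    view.setSource(QUrl(QStringLiteral("qrc:/main.qml")));
    view.show();

    return app.exec();
}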
-
As I now understand it, there is no easy way to recognize gestures inside the C++ part of a Qt Quick 'QQuickPaintedItem'-derived class, is that correct?
I need both pan and pinch in my C++ class, so if I don't want to write my own recognition code, should I then inherit my class from 'QQuickPaintedItem', PinchArea and Flickable? That doesn't seem right (besides, they are declared in private header files like qquickpincharea_p.h).
I found the same request (also without a satisfying answer) here:
https://forum.qt.io/topic/61108/recognize-arbitrary-gestures-in-qt5-like-gesturearea-in-qt4