Propagate Gestures between different Widgets in QML



  • Using Qt 5.5 on Android and iOS.

    I have implemented a user interface with several pages the user can swipe through, left and right. Depending on some data, each page is filled with different objects: some text, a picture, a video, and so on. Swiping left or right takes the user to the next or the previous page, much like the Android or iOS home screens, only with different content, obviously.

    To do so, I've implemented a .qml file that has a full-screen MultiPointTouchArea to detect those swipes:

    Rectangle {
        id: coreRect
        // the touch recognizer goes all over the screen
        MultiPointTouchArea {
            anchors.fill: parent
            maximumTouchPoints: 2
            touchPoints: [
                TouchPoint { id: point1 },
                TouchPoint { id: point2 }
            ]
    
            onPressed: startGesture(touchPoints)
            onTouchUpdated: moveGesture(touchPoints)
            onReleased: endGesture(touchPoints)
        }
    
        function endGesture(touchPoints) {
            // was the horizontal travel far enough to count as a swipe?
            var point1deltaX = point1.x - point1.startX
            if (point1deltaX > 100) {
                swipeRight(); // page-switching functions defined elsewhere
            } else if (point1deltaX < -100) {
                swipeLeft();
            }
        }
    }
    

    Subsequently, child objects are created depending on the data the QML receives. As long as those children don't need any interaction we're fine, of course, but one of them is a text object that needs to be flickable in case it is taller than the screen. It looks like this:

    Rectangle {
        id: pageRect

        property string normalString

        width: parent.width - parent.width/4
        height: parent.height - parent.height/5
        x: parent.width/10
        y: parent.height/10

        Flickable {
            id: regularTextFlickable
            anchors.fill: parent
            contentWidth: parent.width
            contentHeight: regularText.height
            flickableDirection: Flickable.VerticalFlick

            Text {
                id: regularText
                text: pageRect.normalString
                width: parent.width
                verticalAlignment: Text.AlignVCenter
                color: "#000000"
                font.family: "Trebuchet"
                font.pointSize: 30
                wrapMode: Text.Wrap
                z: 2
            }
        }
    }
    

    As mentioned, this element is a child of the coreRect element above, which also has the MultiPointTouchArea as a child.

    Now, depending on which z value is higher, I obviously get my touches either in the MultiPointTouchArea or in the Flickable. I would like to receive them in either one, ignore the gesture if it isn't vertical or horizontal respectively, and propagate it to the other one.
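    For reference, what I am trying to express is roughly this (a sketch; gestureStarted and gesture.grab() are from the MultiPointTouchArea documentation, the direction test is my own, and it is precisely the non-grabbed case that I can't get to fall through to the Flickable):

    MultiPointTouchArea {
        anchors.fill: parent
        maximumTouchPoints: 2
        touchPoints: [
            TouchPoint { id: point1 },
            TouchPoint { id: point2 }
        ]

        // emitted once the drag threshold has been exceeded
        onGestureStarted: {
            var dx = point1.x - point1.startX
            var dy = point1.y - point1.startY
            if (Math.abs(dx) > Math.abs(dy)) {
                // mostly horizontal: claim the touch points for paging
                gesture.grab()
            }
            // mostly vertical: don't grab -- ideally the Flickable
            // underneath would then receive the touch, but this is the
            // part that doesn't seem to work / isn't documented
        }
    }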

    However, while I am able to use the gestureStarted signal of the MultiPointTouchArea, I don't seem to be able to mark the gesture as ignored - and I'm not sure that this would even propagate it to the Flickable.

    On the Flickable's end, I don't seem to be able to interact with the gesture directly at all (or at least the documentation doesn't tell me how).

    As such, I don't see an easy way to do this. Obviously, I could write my own "Flickable", i.e. use the MultiPointTouchArea to scroll the contents of pageRect, but that seems unnecessarily complex, especially since Flickable works extremely well.
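    To illustrate what I mean by my own "Flickable": the vertical scrolling could be emulated in moveGesture() along these lines (a sketch only, using TouchPoint's previousY; it clamps to the content bounds but has no flick inertia or overshoot easing, which is exactly why I'd rather keep the real Flickable):

    function moveGesture(touchPoints) {
        // scroll the text by the finger's vertical movement since the last update
        var dy = point1.y - point1.previousY
        var maxY = Math.max(0, regularTextFlickable.contentHeight
                               - regularTextFlickable.height)
        regularTextFlickable.contentY =
            Math.min(maxY, Math.max(0, regularTextFlickable.contentY - dy))
    }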

    Is there an easier way to handle this, a "QML Best Practice"?

    On a side note: I do have a C++ "backend" behind the QML files, although I doubt that matters much in this situation, unless I end up writing my own C++ gesture recognizer.



  • Have you looked at propagateComposedEvents ?
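    For reference, the documented pattern looks like this (MouseArea only, which may not carry over to your setup):

    MouseArea {
        anchors.fill: parent
        propagateComposedEvents: true
        onClicked: {
            // mouse.accepted = false passes the composed event on to
            // overlapped MouseAreas lower in the visual stacking order
            mouse.accepted = false
        }
    }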



  • I am using a MultiPointTouchArea and a Flickable, not a MouseArea.

