Dynamic Screen rotation from Accelerometer; Qt5 QApplication, QPA linuxfb

  • Qt5 QApplication working with a somewhat custom QPA plugin inheriting from the qlinuxfb plugin, which currently handles rotating 90° to display in portrait mode. I'm attempting to add the capability of dynamic screen orientation (top-up and top-down, always portrait, never landscape) based on readings from an accelerometer. A separate process monitors the sensor and emits DBus signals telling the Qt application whether to display the screen normally or upside down. In the platform plugin I'm able to rotate the image correctly so it appears upside down (hard-coded rotation). Here's where I'm having trouble:

    Once the application is aware of the need to flip the screen, how do I convey this information into the platform plugin? Similarly, how do I also translate raw touch coordinates to match the rotated screen?

    A few thoughts:

    • I've come across QSensor in my searches; I'd like to avoid using this class for now unless it greatly simplifies things.
    • The mechanisms for where QPA support for this exists in Qt are not known to me. Is it as simple as creating some variable in the QPlatformIntegrationPlugin that knows flipped or not flipped, plus a public function that sets that variable upon reception of the DBus signal?
    • Event handling depends on built-in evdev support. It seems like the only way to translate touch events would be to inherit from QEvdevTouchManager, add support for knowing flipped/not flipped, and override some function to translate the coordinates?

    Update: Looking through the QPA source, it looks like I should be overriding QPlatformScreen::orientation() to return either Portrait or InvertedPortrait, and have the main application call setOrientationUpdateMask with both of these orientations, instead of monitoring the accelerometer in a separate process and letting the Qt application know over DBus.

    It doesn't seem like QPlatformScreen::orientation() is called continuously. The orientation state is now available from the QPA layer, but the question remains: is this all I need to do? I see QGuiApplication has some support for detecting orientation changes; will this automatically translate touch input based on the screen orientation?

  • Had some time to dig a bit more through the docs and code. It looks like I should reimplement orientation() in the QPlatformScreen subclass at least, and have the app set the orientationUpdateMask on startup. I could use a private variable mOrientation in the subclass and do the rotation based on it. Still not sure how and where touch event coordinates should be translated. Any ideas?

    Also: I may have answered this myself when looking through the code, but is orientation() called from the main event loop of the QApplication? Do I need to ensure mOrientation is continuously updated in the QPlatformScreen subclass?

  • So, an update for anyone who may come across this later.
    Screen rotation:
    I couldn't figure out a way for the QPlatformScreen subclass I was dealing with to continuously monitor the accelerometer and update the orientation returned by QPlatformScreen::orientation(). I only saw a call into this function during initialization, not continuously as I had hoped, and QGuiApplication::primaryScreen()->orientation() never called into the QPlatformScreen implementation, so the app never queried the accelerometer.
    Since I wasn't making progress this way, I went back to the original method of an outside process monitoring the accelerometer and notifying the Qt app over DBus. Upon receipt of this message, I call QWindowSystemInterface::handleScreenOrientation(screen, orientation) (not part of the public API, so it could change between versions), which lets the entire application learn the orientation by querying QScreen::orientation(). The QPlatformScreen can then do the rotation itself in hardware based on the current orientation.

    Touch Event Rotation Handling:
    Somewhat working, not complete. I installed an event filter on the QApplication, caught touch events as they came in, created new QTouchEvents from the received events with transformed coordinates, sent those new events along with QCoreApplication::sendEvent, and returned true so the original events were considered handled. I was baffled when, with the orientation inverted, taps weren't activating any widgets. I then learned the default behavior in QWidget is to synthesize mouse events when a QTouchEvent goes unhandled, and those synthesized mouse events are what QWidgets actually handle. So I modified the filter to work only on mouse events: this required the x and y transformations of course, but also setting the event source to MouseEventSynthesizedByApplication so that, as the event was passed to each widget, it wasn't transformed again (if synthesized by the app, skip the x/y transformation).

    Gestures are still broken, as I believe I still need to handle the touch event transformation without breaking Qt's synthesis of QMouseEvents. (A PanGesture is still recognized when inverted, but it pans in the wrong direction.)

    If anyone has suggestions on this last part, let me know!
