Qt 5 embedded screen rotation
-
@tsaG There does not seem to be any easy way – unless you are using QML.
However, it is possible that your platform supports rotation. For example, the Raspberry Pi (kind of) supports rotation as a boot-up option. The rotation then occurs at a lower level (i.e. below the OpenGL layer). Other embedded platforms may provide similar functionality.
I actually tested both methods (with an RPi/EGLFS). Using the QML rotation is relatively fast; at least I did not notice any significant speed difference between landscape and portrait orientations. Using the platform-specific rotation was slow, but that result cannot be generalized to other platforms. (The RPi seems to fall back to some sort of software OpenGL when rotated by 90° or 270°.)
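For reference, here is a minimal sketch of what the QML-side rotation can look like (assumptions: an 800×480 panel driven in its native landscape orientation, the UI laid out for 480×800 portrait, and the Rectangle is only a placeholder for the real content):

import QtQuick 2.0
import QtQuick.Window 2.0

Window {
    visible: true
    width: 800   // physical panel size (landscape)
    height: 480

    Item {
        // Rotate the whole UI around the window centre and swap
        // width/height so it fills the screen in portrait.
        anchors.centerIn: parent
        rotation: 90
        width: parent.height
        height: parent.width

        // Application content goes here, laid out for 480x800.
        Rectangle { anchors.fill: parent; color: "steelblue" }
    }
}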
Of course, X11 or Wayland may provide a solution here, but at the same time the performance will suffer both at startup and during run time. YMMV.
This is a strange omission indeed, as screen rotation is not uncommon with embedded devices.
-
I found a solution that works for single touch. First I rotate the display:
sudo nano /boot/config.txt
Add this at the end for a 90-degree rotation:
display_rotate=1
Restart the unit.
Then I just catch the touch event, transform the coordinates, and send a mouse event instead.
Creating a new class:
myguiapplication.h
#ifndef MYGUIAPPLICATION_H
#define MYGUIAPPLICATION_H

#include <QGuiApplication>
#include <QEvent>

class MyGuiApplication : public QGuiApplication
{
    Q_OBJECT
public:
    MyGuiApplication(int &argc, char **argv);
    virtual bool notify(QObject*, QEvent*);
};

#endif // MYGUIAPPLICATION_H
myguiapplication.cpp
#include "myguiapplication.h"
#include <QTouchEvent>
#include <QMouseEvent>
#include <QDebug>

MyGuiApplication::MyGuiApplication(int &argc, char **argv)
    : QGuiApplication(argc, argv)
{
}

bool MyGuiApplication::notify(QObject* target, QEvent* event)
{
    try {
        switch (event->type()) {
        case QEvent::TouchBegin:
        case QEvent::TouchUpdate:
        case QEvent::TouchEnd:
        {
            QTouchEvent* te = static_cast<QTouchEvent*>(event);
            if (te->device()->type() == QTouchDevice::TouchScreen) {
                QList<QTouchEvent::TouchPoint> tps = te->touchPoints();
                if (tps.count() != 1) {
                    qDebug() << "Touch points != 1";
                    return true;
                }
                // Touch coordinates arrive in the panel's native 800x480 frame;
                // map them into the rotated 480x800 frame.
                qreal tx = tps.first().pos().x();
                qreal ty = tps.first().pos().y();
                qreal mx = 480.0/800 * ty;
                qreal my = 800 - 800.0/480 * tx;
                qDebug() << tx << "," << ty << " => " << mx << "," << my;
                QPointF mp(mx, my);
                // Synthesize the corresponding left-button mouse event at the
                // transformed position and deliver it to the original target.
                switch (event->type()) {
                case QEvent::TouchBegin:
                {
                    qDebug() << "BEGIN";
                    QMouseEvent ee(QEvent::MouseButtonPress, mp, Qt::LeftButton, Qt::LeftButton, Qt::NoModifier);
                    QCoreApplication::sendEvent(target, &ee);
                    break;
                }
                case QEvent::TouchUpdate:
                {
                    qDebug() << "UPDATE";
                    QMouseEvent ee(QEvent::MouseMove, mp, Qt::LeftButton, Qt::LeftButton, Qt::NoModifier);
                    QCoreApplication::sendEvent(target, &ee);
                    break;
                }
                case QEvent::TouchEnd:
                {
                    qDebug() << "END";
                    QMouseEvent ee(QEvent::MouseButtonRelease, mp, Qt::LeftButton, Qt::LeftButton, Qt::NoModifier);
                    QCoreApplication::sendEvent(target, &ee);
                    break;
                }
                default:
                    qDebug() << "Unhandled touch event";
                    return true;
                }
                return true;
            }
            break;
        }
        case QEvent::MouseButtonPress:
        {
            QMouseEvent *k = static_cast<QMouseEvent *>(event);
            qDebug() << "MouseButtonPress:" << k->pos();
            break;
        }
        case QEvent::MouseMove:
        {
            QMouseEvent *k = static_cast<QMouseEvent *>(event);
            qDebug() << "MouseMove:" << k->pos();
            break;
        }
        case QEvent::MouseButtonRelease:
        {
            QMouseEvent *k = static_cast<QMouseEvent *>(event);
            qDebug() << "MouseButtonRelease:" << k->pos();
            break;
        }
        default:
            ;
        }
    } catch (...) {
    }
    return QGuiApplication::notify(target, event);
}
And then using my new class instead of QGuiApplication:
#include "myguiapplication.h"
#include <QQmlApplicationEngine>
#include <QQmlContext>

int main(int argc, char *argv[])
{
    MyGuiApplication app(argc, argv);
    //...
}
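As a side note, the 480 and 800 values in notify() are hard-coded for this particular 800×480 panel. A rough sketch of how the same mapping could be derived from the screen geometry instead (touchToRotated is just a hypothetical helper name, and it assumes the primary screen already reports the rotated 480×800 size):

#include <QGuiApplication>
#include <QScreen>
#include <QPointF>
#include <QSizeF>

// Map a raw touch position (reported in the panel's native landscape frame)
// to the rotated portrait frame, using the screen size instead of the
// hard-coded 480/800 constants.
static QPointF touchToRotated(const QPointF &touchPos)
{
    const QSizeF s = QGuiApplication::primaryScreen()->size(); // e.g. 480x800 after display_rotate=1
    const qreal mx = s.width() / s.height() * touchPos.y();
    const qreal my = s.height() - s.height() / s.width() * touchPos.x();
    return QPointF(mx, my);
}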