Android: Touch support
-
Hello,
I ported some of my desktop widget-based applications to Android. Everything went -somehow- smoothly, but none of the ported apps support touch, so the usability of all lists and tables is horrible.
Is there any easy way to enable touch support for the whole application without manual control?
P.S. I read about QTouchEvent, but I noticed that it is used for adding touch support manually to each widget, which is an exhausting procedure.
-
Qt provides a widget attribute Qt::WA_AcceptTouchEvents. You need to set it on each widget if you want to receive QTouchEvents.
If not set (the default), you receive mouse events instead, meaning your widgets should generally work anyway. Even if you could turn on touch support in a central place, you would still have to recode your widget logic to handle touch events?!
-
[quote]Qt provides a widget attribute Qt::WA_AcceptTouchEvents. You need to set it on each widget if you want to receive QTouchEvents.[/quote]
I read about it in some official Qt examples, but unfortunately I found it somewhat complicated, so could you please share a snippet or tiny project here if you have one? -
complicated?
@QWidget::setAttribute(Qt::WA_AcceptTouchEvents);@
And now you receive QTouchEvents for that widget. -
I meant using QTouchEvents, not Qt::WA_AcceptTouchEvents itself.
-
What exactly is complicated to you? IMHO the "docs of QTouchEvent":http://qt-project.org/doc/qt-5.0/qtgui/qtouchevent.html#details are very detailed and easy to understand in terms of usage.
Basically they should be handled similarly to mouse events:
touch begin -> mouse press
touch update -> mouse move
touch end -> mouse release
touch points -> mouse buttons -
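To make that mapping concrete, here is a minimal sketch (untested; Qt 5 widget API, the class name is illustrative) of a widget that opts into touch events and handles the begin/update/end sequence in a reimplemented event():

```cpp
#include <QWidget>
#include <QTouchEvent>
#include <QDebug>

class TouchWidget : public QWidget
{
public:
    explicit TouchWidget(QWidget *parent = nullptr) : QWidget(parent)
    {
        // Without this attribute the widget only receives synthesized mouse events.
        setAttribute(Qt::WA_AcceptTouchEvents);
    }

protected:
    bool event(QEvent *e) override
    {
        switch (e->type()) {
        case QEvent::TouchBegin:    // analogous to QEvent::MouseButtonPress
        case QEvent::TouchUpdate:   // analogous to QEvent::MouseMove
        case QEvent::TouchEnd: {    // analogous to QEvent::MouseButtonRelease
            // Braces give the case its own scope so the declaration is legal.
            const QTouchEvent *touch = static_cast<QTouchEvent *>(e);
            qDebug() << e->type() << "with" << touch->touchPoints().count() << "touch point(s)";
            return true;            // accept the event
        }
        default:
            return QWidget::event(e);
        }
    }
};
```

Returning true for QEvent::TouchBegin matters: accepting it claims the whole touch sequence for the widget; otherwise Qt falls back to synthesizing mouse events.
-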
You mean in case I want to add touch support to a QListWidget I have to deal with:
touch begin -> mouse press
touch update -> mouse move
touch end -> mouse release
touch points -> mouse buttons
manually? If that's what you mean: I basically posted this thread to find an easy way to add touch support.
-
Do you really expect touch support to come out of nowhere?
Mouse events also need to be implemented. Who else but the developer should know how your widget reacts?
Also, as far as I can see, the widget implementations (e.g. QListView) don't contain any code that handles touch events. I guess what you want is kinetic scrolling on item views, right?
You can use QScroller for that, but you would also need to add it manually. Thus it's actually recommended to use QML, since it supports mouse and touch events out of the box. QWidgets are also supported on mobile platforms, but they are not the standard choice there.
What you could do, for example, is create a subclass of QListView and add a QScroller to it in its constructor. Then do a simple text replace so your subclass gets created instead.
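For example, a sketch along those lines (assuming kinetic scrolling is the goal; `TouchListView` is an illustrative name):

```cpp
#include <QListView>
#include <QScroller>

// A QListView subclass that enables kinetic (flick) scrolling for touch input.
class TouchListView : public QListView
{
public:
    explicit TouchListView(QWidget *parent = nullptr) : QListView(parent)
    {
        // Item views scroll their viewport, so the gesture is grabbed there.
        QScroller::grabGesture(viewport(), QScroller::TouchGesture);
        // Optionally also react to mouse drags on non-touch machines:
        // QScroller::grabGesture(viewport(), QScroller::LeftMouseButtonGesture);
    }
};
```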
-
[quote]Do you really expect touch support to come out of nowhere?
Mouse events also need to be implemented. Who else but the developer should know how your widget reacts?[/quote]
It seems adding touch support to widget-based apps isn't as easy as I expected. I thought they might simply support it, just like QML-based apps do.[quote]Thus it's actually recommended to use QML, since it supports mouse and touch events out of the box. QWidgets are also supported on mobile platforms, but they are not the standard choice there.[/quote]
Unfortunately, migrating to QML isn't easy at all, because I'd have to learn it from scratch, and Qt Quick doesn't have an open source GUI plug-in for Qt Creator for creating complex interfaces, while Qt Designer works fine with widgets. -
Guys, I'm still stuck on this issue. raven-worx finds it very simple while I find it an evil monster, so I don't know where to begin :(
Does anyone have a tiny snippet explaining how this works (ideally a QListView touch event example)?
I read the Qt docs carefully and found them very fuzzy! The simplest example that uses QTouchEvent is Path Stroking, which is itself too complicated to learn QTouchEvent basics from!
I tried the following as a basic trial, but it doesn't work!
[code]MainWindow::MainWindow(QWidget *parent) :
    QMainWindow(parent),
    ui(new Ui::MainWindow)
{
    ui->setupUi(this);
    ui->listWidget->setAttribute(Qt::WA_AcceptTouchEvents);
}

void MainWindow::changeEvent(QEvent *e)
{
    QMainWindow::changeEvent(e);
    switch (e->type()) {
    case QEvent::LanguageChange:
        ui->retranslateUi(this);
        break;
    case QEvent::TouchBegin:
        ui->statusBar->showMessage("Begin");
        break;
    case QEvent::TouchEnd:
        ui->statusBar->showMessage("End");
        break;
    case QEvent::TouchUpdate:
        const QTouchEvent *event = static_cast<const QTouchEvent *>(e);
        const QList<QTouchEvent::TouchPoint> points_ = event->touchPoints();
        break;
        // Damn error: /home/mbnoimi/Snippets/C++/Qt/TouchEventTest/mainwindow.cpp:34: error: jump to case label [-fpermissive]
    default:
        break;
    }
}[/code] -
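A note on the snippet above (my reading, stated as an assumption): changeEvent() is only called for widget state changes such as QEvent::LanguageChange, so touch events never arrive there; and since the attribute is set on the list widget, its touch events are delivered to the list widget (in practice, to its viewport), not to the main window. A sketch that routes them through an event filter instead (untested, Qt 5 API):

```cpp
// mainwindow.cpp -- requires #include <QTouchEvent>.
// In the constructor, after ui->setupUi(this):
//   ui->listWidget->viewport()->setAttribute(Qt::WA_AcceptTouchEvents);
//   ui->listWidget->viewport()->installEventFilter(this);
// and in mainwindow.h:
//   bool eventFilter(QObject *watched, QEvent *e) override;

bool MainWindow::eventFilter(QObject *watched, QEvent *e)
{
    switch (e->type()) {
    case QEvent::TouchBegin:
        ui->statusBar->showMessage("Begin");
        return true;   // accept, or the rest of the sequence won't arrive
    case QEvent::TouchUpdate: {
        // Braces create a scope for the declaration,
        // avoiding the "jump to case label" error.
        const QTouchEvent *touch = static_cast<QTouchEvent *>(e);
        ui->statusBar->showMessage(QString("Update: %1 point(s)")
                                       .arg(touch->touchPoints().count()));
        return true;
    }
    case QEvent::TouchEnd:
        ui->statusBar->showMessage("End");
        return true;
    default:
        return QMainWindow::eventFilter(watched, e);
    }
}
```
-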
I tried the following but unfortunately it didn't work!
[code]QList<QWidget *> widgets = this->findChildren<QWidget *>();
foreach (QWidget *var, widgets) {
var->setAttribute(Qt::WA_AcceptTouchEvents, true);
}[/code] -
After reading all that, I still don't know what you expect from the "touch" feature.
My widget based applications work fine out of the box with both mouse and touch screen. All GUI elements can be used with both.
So what else do you expect? And if you write that it does not work, what is "it" and what does not work?
-
The same problem: porting my QWidget-based project from Windows to Android.
I have a QTreeView with a right-mouse-button menu, and I want this menu to open when I touch the QTreeView on my phone,
so I did this:
[code]feedList = new QTreeView(this);
.............
feedList->setSelectionMode(QAbstractItemView::ExtendedSelection); // so multiple items can be selected
feedList->setContextMenuPolicy(Qt::CustomContextMenu);
connect(feedList, SIGNAL(customContextMenuRequested(QPoint)), this, SLOT(slot_feeds_menu()));
feedList->setAttribute(Qt::WA_AcceptTouchEvents, true); // to handle touches on the screen
[/code]
and in eventFilter:
[code]if (event->type() == QEvent::TouchEnd /* && obj == feedList*/) {
    slot_feeds_menu();
}[/code]
but there is no reaction when I touch the screen.
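One likely pitfall (an assumption, not verified against this project): for item views such as QTreeView, input events are delivered to the viewport widget, so both the attribute and the event filter may need to target `feedList->viewport()` rather than the view itself:

```cpp
// Sketch: route touch events from the tree view's viewport to this
// object's eventFilter(). With the attribute and filter only on feedList
// itself, the filter may never see the touch events.
feedList->viewport()->setAttribute(Qt::WA_AcceptTouchEvents, true);
feedList->viewport()->installEventFilter(this);
```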