qt6 qt creator acceptTouchEvents on x11 ubuntu
-
I'm coming back to this because I discovered the touch events may not be set up properly.
A note first: my Ubuntu 24 works great with the touch screen installed ... very reactive ..... perfect. But when I work inside a Qt Creator 14 project (every project ... the example projects too), whether in the project window or in the Qt Creator window itself, the touch screen is much less responsive. When I use touch outside Qt Creator or a Qt app, the touch screen works perfectly again, but as soon as I return to a Qt app window, touch is unresponsive ......
Anyhow, in my main window I added these attributes:
    QCoreApplication::setAttribute(Qt::WA_AcceptTouchEvents);
    QApplication::setAttribute(Qt::WA_AcceptTouchEvents);
    QApplication app(argc, argv);
    // ...
    MainWindow mainWin;
    mainWin.show();
    return app.exec();
Did I do something wrong? My app is sometimes unresponsive to touch. More precisely: if I press at some point ... over a button, for example ... the button is not pressed ... not on the first try, not the second, maybe on the third attempt ....... With the mouse there is no problem at all ....
I'd appreciate any help .... previously (maybe I explained my case badly) I had no information on these points.
regards
-
@afalsa ..... after your suggestion I noticed the QIcon of my app was not shown correctly (..... yes, I lost some time chasing trivial things, just to relax ....). My previous code:
    QString hpB1 = QDir::currentPath();
    QString hP1 = QDir::homePath();
    // Set the homePath in the Globals singleton
    Globals::instance().setHP(hP1);
    Globals::instance().setHPB(hpB1);
    MainWindow w;
    w.setWindowTitle(QApplication::translate("w", "bla bla bla"));
    w.setAttribute(Qt::WA_AcceptTouchEvents, Qt::Window);
    w.setAttribute(Qt::WA_DeleteOnClose);
    w.setWindowIcon(QIcon(hpB1 + "/resources/ls_black.png")); /* note this row */
    w.setWindowIconText("GreenMonitor");
    w.resize(1260, 800);
    //w.move(( scr.center() - rect().center() ));
    w.show();
and after my relax time, just for fun, I tried this different version ....
    QString hpB1 = QDir::currentPath();
    QString hP1 = QDir::homePath();
    QString joker = hpB1 + "/resources/ls_black.png";
    // Set the homePath in the Globals singleton
    Globals::instance().setHP(hP1);
    Globals::instance().setHPB(hpB1);
    MainWindow w;
    w.setWindowTitle(QApplication::translate("w", "bla bla bla"));
    w.setAttribute(Qt::WA_AcceptTouchEvents, Qt::Window);
    w.setAttribute(Qt::WA_DeleteOnClose);
    w.setWindowIcon(QIcon(joker)); /* note this row */
    w.setWindowIconText("GreenMonitor");
    w.resize(1260, 800);
    //w.move(( scr.center() - rect().center() ));
    w.show();
Now, magically, my touch works as expected ..... really silly, but real ..... can someone explain why?
regards
-
So, to actually make touch work, in main() I added these rows:
    #include "mainwindow.h"
    #include <QApplication>
    #include <QDialog>
    #include <QScreen>
    #include <QIcon>
    #include <QGuiApplication>
    #include <QDir>
    #include "login.h"
    #include "globals.h"
    #include <QTouchEvent>

    int main(int argc, char *argv[])
    {
        QCoreApplication::setAttribute(Qt::AA_SynthesizeMouseForUnhandledTouchEvents, true);
        QApplication a(argc, argv);
        QApplication::setOrganizationName("created by bkt");
        QApplication::setOrganizationDomain("github_bkt");
        QApplication::setApplicationName("bkt_data");
        //QScreen *screen = QGuiApplication::primaryScreen();
        //QRect scr = screen->geometry();
        QString hpB1 = QDir::currentPath();
        QString hP1 = QDir::homePath();
        QString joker = hpB1 + "/resources/ls_black.png";
        // Set the homePath in the Globals singleton
        Globals::instance().setHP(hP1);
        Globals::instance().setHPB(hpB1);
        MainWindow w;
        w.setWindowTitle(QApplication::translate("w", "bla bla bla"));
        w.setAttribute(Qt::WA_AcceptTouchEvents, Qt::Window);
        w.setAttribute(Qt::WA_DeleteOnClose);
        w.setWindowIcon(QIcon(joker)); /* note this row */
        w.setWindowIconText("GreenMonitor");
        w.resize(1260, 800);
        //w.move(( scr.center() - rect().center() ));
        w.show();
        return a.exec();
    }
and after initializing the UI in MainWindow I write:
    MainWindow::MainWindow(QWidget *parent)
        : QMainWindow(parent), ui(new Ui::MainWindow)
    {
        ui->setupUi(this);
        this->setAttribute(Qt::WA_AcceptTouchEvents, Qt::Window);
        this->setAttribute(Qt::WA_WState_AcceptedTouchBeginEvent);
        // ...
    }
I wrote all of this two weeks ago ... but I never checked the QIcon path ... I didn't think it was a problem ..... for sure the app icon is currently not shown .... but the touch screen works perfectly ....
-
I think my knowledge of this topic is limited.
The only thing I can tell you is to enable the Qt logs and try to find the problem:
https://doc.qt.io/qt-6/embedded-linux.html#logging
https://rsadowski.de/posts/2023-01-05-qt-kde-controlling-debug-messages/
Good luck
-
Let me add this: if I play with the Cinnamon desktop settings for mouse events (more or less delay on click, etc. .... leaving out the touchpad controls), the touch screen in Qt6 gives slightly better results .... now it seems a click is delivered when the finger presses exactly over the text of a button .... but not when it presses elsewhere on the button ..... If I make a 250 px x 250 px button with 24 px button text, pressing on the text works normally .... but pressing on the empty area of the button produces no touch event ...... So this seems related to some sort of "z" level in the button's definition file (I suppose this, because I don't know whether Qt uses XML to draw and render widgets), not to a secondary-screen or similar issue ....
-
Can this code solve the problem?
    cmake_minimum_required(VERSION 3.16)
    project(MyTouchApp)

    # Find required Qt modules
    find_package(Qt6 REQUIRED COMPONENTS Core Gui Widgets)

    # Add the executable
    add_executable(MyTouchApp main.cpp MyWidget.cpp)

    # Link Qt libraries
    target_link_libraries(MyTouchApp PRIVATE Qt6::Core Qt6::Gui Qt6::Widgets)

    # Optional: Add libinput
    find_package(PkgConfig REQUIRED)
    pkg_check_modules(LIBINPUT libinput)
    if(LIBINPUT_FOUND)
        target_include_directories(MyTouchApp PRIVATE ${LIBINPUT_INCLUDE_DIRS})
        target_link_libraries(MyTouchApp PRIVATE ${LIBINPUT_LIBRARIES})
    endif()

    # Optional: Add evdev
    pkg_check_modules(LIBEVDEV libevdev)
    if(LIBEVDEV_FOUND)
        target_include_directories(MyTouchApp PRIVATE ${LIBEVDEV_INCLUDE_DIRS})
        target_link_libraries(MyTouchApp PRIVATE ${LIBEVDEV_LIBRARIES})
    endif()
-
But that is not a solution .... I tried building the Qt6 calculator example and touch works without any issue ..... Then I built a GUI test project (without selecting any touch attribute ..... set up entirely as a desktop app) with some buttons, and on its main window touch works well ..... So there is no issue between touch and Qt6 as such; the problem is evidently generated in some strange way by my app .... strange, because with mouse clicks everything works perfectly in my app .... only touch is unresponsive (sometimes when I try to press a button, nothing happens ... not even the button animation .... but with the mouse everything works every time). My app has multiple pages based on a tab widget .... and sometimes underneath one button there is another, invisible and disabled ..... maybe that is what generates this issue .... but I need to investigate further ....
-
I have a function that scans a long list of inputs and outputs (100 + 100, more or less) and then changes the "state" of some widgets in my GUI .... that function runs off a timer started from the GUI with an interval of 100 ms .... could that be what generates my problem? I think not, because with the mouse everything works perfectly ...... The same function works perfectly on Qt5 with mouse or touch .... only the Qt6 version has problems with touch ..... Maybe this further explanation will help someone come up with an idea that helps me solve this rather boring problem... I've been wasting my time on it for two months now.
regards
-
@gfxx said in qt6 qt creator acceptTouchEvents on x11 ubuntu:
I have a function that scans a long list of inputs and outputs (100 + 100, more or less) and then changes the "state" of some widgets in my GUI ....
How long does this function take to execute?
-
I ran a whole series of tests (using a large test GUI in Qt6 with a lot of controls, launched from the Qt Creator GUI) and discovered that, in reality, Qt6 works well with mouse, keyboard, and touch in every condition.
The problem: my Qt6 application is launched by another program and acts as its GUI... ONLY in this case does the touch not work properly. Now, I tried creating a launch bash script where I write:
export QT_LOGGING_RULES="qt.qpa.input=true"
and then launch the application, which in turn opens the Qt6 GUI... In this case, the touch works fine. Why? And how can I include a command in my application to define this rule?