Screen hardware changed, now the touch points are wrong, can I fix this in Qt?
st2000 last edited by
I am sure there is a correct way to go about this. Unfortunately, the existing code is not well written, so instead I would simply like to change the coordinates Qt uses for button presses.
What I would like to do is divide the X and Y coordinates by, in this case, 1.33, and then feed the new numbers back into Qt so it uses them when identifying which buttons are pressed. I assume the event filter runs before any other code that deals with touch screen events, so I thought this would be a simple fix. But evidently I don't know how to "insert" the new coordinates back into Qt such that they are used later when Qt is figuring out which button I have pressed.
The longer explanation is that we switched screen resolutions because both the display and the touch screen hardware have changed. I have already made a quick fix in the display driver code to "stretch" all images, buttons and videos across the new screen, but the touch screen driver has no equivalent quick fix, so I want to do this in the Qt code. I want a touch reported in the bottom left corner to be scaled to a point about 1/1.33 of the way down the screen and about 1/1.33 of the way toward the right side of the screen.
Currently I am getting and changing the coordinates like this:
@
// read the current cursor position
mouse_x = cursor->pos().x();
mouse_y = cursor->pos().y();

// scale down to compensate for the new screen geometry
mouse_x /= 1.33;
mouse_y /= 1.33;
@
But how do I get the values back into Qt such that they are used later when processing button presses?
Any help would be appreciated.
I tried to “push back” the altered mouse position into the Qt software (I am sure there is a better way to say this). I used the following line of code:
But the position of the buttons did not change. So I assume that, even though the eventFilter runs first and I did the math, setPos does not alter how Qt buttons and other Qt features that base their behaviour on where the screen is touched actually respond.
I suspect the problem with the above attempt is that I only changed (set) the position in the instance of QCursor that I instantiated locally in this method. What I really need to do is change the position stored in the Qt object that instantiated my code. But how do I go about that? I suspect I start with the pointer passed to me during the instantiation of my code, but at this point I could use a shove in the right direction.
OK, the line of code that sets the point was wrong. It should be:
...but I tried it and it did not work: the touch point on the screen remained unchanged. I had wanted to move the touch points as if they were on a sheet of rubber smaller than the screen, anchored at the top and left edges, and then stretched to cover the whole screen; any Qt buttons would appear to move as the sheet of rubber was stretched.
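One approach that avoids moving the cursor at all is to consume the original event inside the filter and re-send a synthesized QMouseEvent carrying the scaled position, so everything downstream (hit-testing, button presses) sees the corrected coordinates. A sketch of that idea, assuming a Qt 5 widget application; the `TouchScaler` class name and the re-entrancy flag are illustrative, not from the post:

```cpp
#include <QApplication>
#include <QMouseEvent>

// Illustrative application-wide event filter: intercepts mouse events,
// rescales their position by 1/1.33, and re-sends a synthesized event
// in place of the original one.
class TouchScaler : public QObject
{
protected:
    bool eventFilter(QObject *watched, QEvent *event) override
    {
        if (m_resending)              // don't rescale our own synthesized event
            return false;

        switch (event->type()) {
        case QEvent::MouseButtonPress:
        case QEvent::MouseButtonRelease:
        case QEvent::MouseMove: {
            auto *me = static_cast<QMouseEvent *>(event);
            const QPointF scaled = me->localPos() / 1.33;
            QMouseEvent mapped(me->type(), scaled, me->button(),
                               me->buttons(), me->modifiers());
            m_resending = true;
            QCoreApplication::sendEvent(watched, &mapped);
            m_resending = false;
            return true;              // swallow the original, unscaled event
        }
        default:
            return false;
        }
    }

private:
    bool m_resending = false;
};

// Installation, e.g. in main() after constructing the QApplication:
//   app.installEventFilter(new TouchScaler);
```

Note this is a sketch, not a tested fix: the short QMouseEvent constructor derives the global position from the current cursor, and if the touches arrive as QTouchEvent rather than synthesized mouse events, the same idea would have to be applied to the touch event type instead.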