Interactive QWidgets in OpenGL
I am trying to achieve something similar to what the Unreal Engine does with virtual reality editing.
I need to render arbitrary Qt widgets to a texture, display that texture on a plane in 3D, and send touch and scroll events back to the widgets whenever the user interacts with them in virtual reality.
Has something like this been done before in Qt (even with plain OpenGL, outside of VR)?
Is it feasible to render a QWidget to a texture and display it at 60+ FPS?
Would it be possible to do this without showing the widget in the main application's UI?
I'll update this thread with my findings and results from experiments.
Here are the requirements in checklist form:
- Render Qt widgets to an image, then upload it to an OpenGL texture
- Capture touch events in the OpenGL view and send them back to the Qt widgets
The project is on GitHub if you're interested in helping out:
Thanks in advance.