[Qt3D] Rendering QML as a texture
-
Hello,
I have been trying for the last few weeks to port the rendercontrol example to work within Qt3D. In that example, QML content is rendered into an offscreen FBO, and that FBO is in turn used as a texture on a cube drawn in an OpenGL window.
None of what I tried works, so I am asking not necessarily for a fully coded solution, but at least for leads on how to do it. This is what I tried:
1/ Using an id reference to an Item directly

Obviously the first thing I did was to try a DiffuseMapMaterial with, as its source, an id referencing an Item (and its QML substructure). It didn't work: digging into the Qt3D sources, DiffuseMapMaterial expects a URL as the source of the image. There is an undocumented image loader which is quite complicated, so I decided to try something else.

2/ Subclassing QQuickImageProvider
This method works with normal QML, in the sense that the FBO is dumped into a QImage and I can manipulate the image in 2D. However, the image provider URL is not recognised by DiffuseMapMaterial within a Qt3D entity. I was using an RGBA32 format, so the format may be the issue, but I am not too sure how to verify that. In any case: black object, and a "could not read the texture" error.
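For what it's worth, this attempt boiled down to roughly the sketch below. It is simplified and from memory; the provider id "qmlfbo", the class name and setLastGrab() are placeholders of mine, not real Qt API:

#include <QQuickImageProvider>
#include <QImage>

// Image provider that hands out the latest grab of the offscreen QML scene.
class QmlFboImageProvider : public QQuickImageProvider
{
public:
    QmlFboImageProvider() : QQuickImageProvider(QQuickImageProvider::Image) {}

    // Called by the QML engine whenever an "image://qmlfbo/..." URL is requested.
    QImage requestImage(const QString &id, QSize *size, const QSize &requestedSize) override
    {
        Q_UNUSED(id);
        Q_UNUSED(requestedSize);
        if (size)
            *size = m_lastGrab.size();
        return m_lastGrab;   // QImage previously dumped from the FBO
    }

    void setLastGrab(const QImage &img) { m_lastGrab = img; }

private:
    QImage m_lastGrab;   // filled from QOpenGLFramebufferObject::toImage()
};

// registration: engine->addImageProvider("qmlfbo", new QmlFboImageProvider);

On the Qt3D side I then pointed the material at it with diffuse: "image://qmlfbo/scene", and that is the URL the texture loader refuses to read.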
3/ Using ShaderEffectSource

I tried to use a generic Qt3D Material and reference a ShaderEffectSource as an effect. "Effect" has a different meaning in QML and in Qt3D, and the two are visibly not cross-compatible. There also seemed to be some issues with the OpenGL context not being reset. Too bad, because code-wise it was the easiest. Result: white screen.

4/ Subclassing Material and passing the FBO as a Texture2D
That one seemed the most likely to work, but I have searched the documentation and the sources, and there is no convenience method to assign a QImage (generated from the FBO) to a QTexture2D or a QAbstractTexture.

5/ Various render passes
Looking at the shadowmap example, I figured there could be a way of doing something equivalent with render passes, but again, the shadowmap example renders from a different perspective and does not perform any context switch. Unless I got it wrong.

I have also read in the 2016 mailing list archives (a priori about the currently implemented API) something like: "You can use the functor of QTextureImage and either pass your pre-prepared image in as a member of your custom functor, or you can do the actual painting using QPainter in the functor's function call operator." That quote was about drawing something with QPainter and using the resulting QImage as a QTextureImage. But every time I read it, I feel like an imbecile, and would love to have someone clarify the statement. In plain words.
My impression is that I will need two different OpenGL contexts: one created especially for the QML code to be rendered, and a different one for the Qt3D scene. I can manage that, based on several multi-threaded examples, but I'd like a bit of a push in the right direction.
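For reference, the offscreen part I already have (lifted more or less from the rendercontrol example) boils down to something like this. It is a simplified sketch with my own variable names; in real code these would be class members, and the 512x512 size is arbitrary:

#include <QOpenGLContext>
#include <QOffscreenSurface>
#include <QOpenGLFramebufferObject>
#include <QQuickRenderControl>
#include <QQuickWindow>

void setupOffscreenQml()
{
    // Second OpenGL context, used only for rendering the QML scene offscreen.
    QOpenGLContext *context = new QOpenGLContext;
    context->create();

    QOffscreenSurface *surface = new QOffscreenSurface;
    surface->setFormat(context->format());
    surface->create();

    QQuickRenderControl *renderControl = new QQuickRenderControl;
    QQuickWindow *quickWindow = new QQuickWindow(renderControl);   // never shown on screen

    context->makeCurrent(surface);
    QOpenGLFramebufferObject *fbo =
            new QOpenGLFramebufferObject(QSize(512, 512),
                                         QOpenGLFramebufferObject::CombinedDepthStencil);
    quickWindow->setRenderTarget(fbo);
    renderControl->initialize(context);

    // ...load the QML with a QQmlComponent, parent its root Item to
    // quickWindow->contentItem(), then render on demand with
    // polishItems() / sync() / render() and grab the result via fbo->toImage().
}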
Help anyone, pretty please?
-
This is long-awaited functionality, and there are some WIP commits in Gerrit regarding it.
Maybe Sean will add more.
-
Indeed that's the feature we all dream of. The WIP patch in question can be seen here https://codereview.qt-project.org/#/c/154545/.
You are correct in assuming that you'll likely need two OpenGL contexts (the Qt3D one, which you can't access, plus another one for rendering your QML scene), which makes sharing texture ids problematic.

The functor idea relies on the principle that a Qt3DRender::QAbstractTextureImage subclass can provide its own functor (now called a dataGenerator).
http://doc.qt.io/qt-5/qt3drender-qabstracttextureimage.html#public-functions
http://doc.qt.io/qt-5/qt3drender.html#QTextureImageDataGeneratorPtr-typedef

You can basically store a QImage in your functor and implement operator() to return a QTextureImageData that you have constructed from your QImage (via QTextureImageData::setImage()). The thing is that you have to create a new functor with a new QImage every time your QML changes, so that Qt3D can update the texture.
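In code, that could look roughly like the sketch below. It is untested and written from memory against the 5.7/5.8 API, so double-check the exact signatures; the class names are just examples, and the comparison in operator== is a crude simplification:

#include <Qt3DCore/QAbstractFunctor>
#include <Qt3DRender/QAbstractTextureImage>
#include <Qt3DRender/QTextureImageData>
#include <Qt3DRender/QTextureImageDataGenerator>
#include <QImage>

// Functor wrapping a ready-made QImage and turning it into QTextureImageData.
class QmlImageDataGenerator : public Qt3DRender::QTextureImageDataGenerator
{
public:
    QT3D_FUNCTOR(QmlImageDataGenerator)

    explicit QmlImageDataGenerator(const QImage &image) : m_image(image) {}

    // Called by the Qt3D backend (possibly on another thread) to produce the texture data.
    Qt3DRender::QTextureImageDataPtr operator()() override
    {
        auto data = Qt3DRender::QTextureImageDataPtr::create();
        data->setImage(m_image);   // width/height/format are taken from the QImage
        return data;
    }

    bool operator==(const Qt3DRender::QTextureImageDataGenerator &other) const override
    {
        // Crude comparison: two generators are equal only if they wrap the same QImage.
        const auto *o = dynamic_cast<const QmlImageDataGenerator *>(&other);
        return o && o->m_image.cacheKey() == m_image.cacheKey();
    }

private:
    QImage m_image;
};

// Texture image whose data comes from the functor above.
class QmlTextureImage : public Qt3DRender::QAbstractTextureImage
{
public:
    void setImage(const QImage &image)
    {
        m_image = image;
        notifyDataGeneratorChanged();   // tells Qt3D to call dataGenerator() again
    }

protected:
    Qt3DRender::QTextureImageDataGeneratorPtr dataGenerator() const override
    {
        // A fresh functor (with the new QImage) every time the QML content changes.
        return Qt3DRender::QTextureImageDataGeneratorPtr(new QmlImageDataGenerator(m_image));
    }

private:
    QImage m_image;
};

You would then call setImage() on the QmlTextureImage with the freshly grabbed QImage each time the offscreen QML scene has been re-rendered, and attach the QmlTextureImage to your QTexture2D with addTextureImage().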
-
Thanks a lot Vlad and Paul for your answers. I too believe that this functionality is probably what will make Qt3D stand out from any other 3D engine in terms of end-user purposes.
From the comments in Gerrit, I gather that this has a chance of being merged in 5.8 or 5.9, doesn't it? Do you know when it would be released?
I read through the WIP code, and while I do understand most of the multithreading mechanisms, I failed to find the stage at which the FBO is turned into a texture. But then again, I probably haven't spent enough time combing through the code.
Meanwhile, I also tried to do a bit of homework on my side (I have never had to use functors before...). At the risk of writing something stupid, if I were to implement a super basic class, then tentatively a starting point would be:
#include <Qt3DRender/QTextureImageData>
#include <QOpenGLFramebufferObject>
#include <QImage>

class QImageToTextureFunctor
{
public:
    explicit QImageToTextureFunctor(const QImage &qImageFromFBO)
        : _qImageFromFBO(qImageFromFBO) {}

    Qt3DRender::QTextureImageDataPtr operator()(QOpenGLFramebufferObject *fbo)
    {
        _qImageFromFBO = fbo->toImage();
        auto data = Qt3DRender::QTextureImageDataPtr::create();
        data->setImage(_qImageFromFBO);
        return data;
    }

private:
    QImage _qImageFromFBO;
};
Let's assume that the render is triggered, for instance, through a C++-bound function called from a FrameAction on the QML side, which posts the render request to the QQuickRenderControl.
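Concretely, by that I mean something along these lines (a simplified sketch; renderQmlFrame() is a hypothetical helper of mine, called from the slot the FrameAction handler invokes):

#include <QImage>
#include <QOpenGLContext>
#include <QOpenGLFunctions>
#include <QOffscreenSurface>
#include <QOpenGLFramebufferObject>
#include <QQuickRenderControl>

// Renders the offscreen QML scene once and returns the grabbed frame.
// The objects come from the setup shown in my first post.
QImage renderQmlFrame(QOpenGLContext *context,
                      QOffscreenSurface *surface,
                      QQuickRenderControl *renderControl,
                      QOpenGLFramebufferObject *fbo)
{
    if (!context->makeCurrent(surface))
        return QImage();

    renderControl->polishItems();   // item polish step
    renderControl->sync();          // synchronise the scene graph
    renderControl->render();        // draw the QML scene into the FBO

    context->functions()->glFlush();
    return fbo->toImage();          // the QImage that would then go into the functor
}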
First question: the returned value is not clear in my pseudocode above... I may not be using setImage() appropriately.
Secondly, upon completion of the QML render, what would be the relation between a QTextureImageData, a QTextureImageDataGeneratorPtr and a QAbstractTexture?