You should not mix QPainter and raw GL calls like that. Put the GL calls between beginNativePainting() and endNativePainting(), which are QPainter functions. Depending on how it interferes with your GL state, it may also help to save the painter state before and restore it after. The docs describe exactly which states the painter changes.
I worked it out. It was because the QOpenGLWidget uses an FBO under the hood, and so all calls get redirected to it. I guess Qt doesn't then do any special mapping of FBO color attachments to display buffers to allow for GL_BACK_LEFT and friends to continue working seamlessly.
The solution was to use a QOpenGLWindow instead, and I'm hoping QWidget::createWindowContainer() will let me keep working as expected.
@Devopia53 I tested on Qt 5.8/Win7 today, with the same incorrect result. But today I discovered a difference in overlapping behaviour between my two machines. On the machine with a discrete video card, a fully transparent widget shows up black over a white OpenGL context :D, whereas on the other machine with an integrated video card it has no colour (as it should). So GUI transparency over an OpenGL context is very platform dependent. The solution is to render the widget with OpenGL directly, but that way is very inconvenient. I think the Qt developers must solve this problem in the future.
I was just expecting that a QObject would be destroyed when the QApplication finishes.
Why? Qt doesn't track all objects; it only guarantees that a parent will delete its children, nothing more. So if you don't give a QObject a parent, it just floats out there and nothing deletes it automatically, hence the leak.
Is it possible to use the QApplication instance as a parent for QObjects?
It is, because QCoreApplication (and consequently all its derived classes) is already a QObject. That's one of the reasons it's also sometimes referred to as the "root QObject".
when my QWidget has no parent, it stays alive as long as the app is running, and it is managed and deleted by the app.
You can't give a QWidget instance a plain QObject parent, but you can connect the QCoreApplication::aboutToQuit() signal to the object's QObject::deleteLater() slot, and this will solve your leakage problem.
Which means the claimed memory will be released by the OS. But the problem here is that you actually need the heap-allocated objects' destructors to be invoked, and that the OS will not do.
which, as we know, works like a bridge between the physical address space and the paged (virtual) address space, translated by the OS
Yes, we know that, but there are two issues with this. Firstly, there may be no difference between physical and logical addressing at all; this is common when the OS runs without paging, as when I run my Linux without a swap partition. And secondly, paging has next to nothing to do with the problem here.
in practice, memory is not leaked when apps are destroyed
Even if this were true, and it is for most OSes, not taking care to clean up your resources leads to all kinds of funky behaviour. For the simplest and most obvious example: a driver holding a global resource is not deinitialized, and the resource is left locked for the next program that wants to use it. Then what, should we restart the whole system?
Just leaving the OS to clean up, especially in such obvious circumstances, is in my opinion a lazy approach to programming that does more harm than good in the long run. A bit of discipline and a handful of good practices don't hurt; on the contrary, they will spare you the trouble of digging through deep stack traces unnecessarily ...
I think the driver of your graphics card isn't able to understand the GLSL code. As far as I can see, you're using an Intel graphics card, and I've had a lot of problems with my GLSL code on several machines with Intel graphics.
Try changing the attribute and varying qualifiers. Also check the shader's compile status, so the driver tells you exactly what it rejects:
GLuint shader = glCreateShader(GL_VERTEX_SHADER); // or GL_FRAGMENT_SHADER
// Get strings for glShaderSource, then:
glShaderSource(shader, 1, &source, nullptr); // source: const GLchar* to your GLSL text
glCompileShader(shader);

GLint isCompiled = 0;
glGetShaderiv(shader, GL_COMPILE_STATUS, &isCompiled);
if (isCompiled == GL_FALSE)
{
    GLint maxLength = 0;
    glGetShaderiv(shader, GL_INFO_LOG_LENGTH, &maxLength);

    // maxLength includes the terminating NULL character.
    std::vector<GLchar> errorLog(maxLength);
    glGetShaderInfoLog(shader, maxLength, &maxLength, errorLog.data());

    // Provide the info log in whatever manner you deem best.
    glDeleteShader(shader); // Don't leak the shader.
    // Exit with failure.
}
// Shader compilation was successful.
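As for the qualifier change itself, this is roughly what it looks like; the variable names here are purely illustrative:

```glsl
#version 130
// Legacy form (GLSL 1.10/1.20), which some Intel drivers mishandle:
//   attribute vec3 position;
//   varying   vec2 vTexCoord;
// Modern (GLSL 1.30+) equivalent:
in  vec3 position;
in  vec2 texCoord;
out vec2 vTexCoord;

void main()
{
    vTexCoord   = texCoord;
    gl_Position = vec4(position, 1.0);
}
```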
You'll need to make the second GL widget's context current before using it.
Each time you render or touch any OpenGL state or parameters, make sure the right context is current for that widget first.
QOpenGLWidget renders into an FBO. The surface format was probably set to multisampled, so the underlying FBO inherited that, and reading GL_DEPTH_COMPONENT from a multisampled FBO is not possible. To check this, set samples to zero in the QSurfaceFormat and pass it to the QOpenGLWidget.
I know they don't inherit one another, that's why I'm asking if it's possible. :)
I have a form with a QOpenGLWidget laid out and want to use Qt3D, which does require a QWindow subclass for its painting routines. I was thinking of creating an attached QOpenGLWindow and sharing contexts, but I'm not quite sure that's the way. Anyway, createWindowContainer() is a possible solution, but I'd need to handle the layout insertion manually, which is why I'm asking.
@yoavmil Thank you, that resolved the issue. I've now set it up in main() to show and then immediately hide the main window. I connected the splash screen's close slot to a custom signal indicating when the model has finished loading. So the main window becomes visible once the model loads, and the splash screen persists until it receives that very same signal.