Querying OpenGL version / getting OpenGL > 2.x in Qt3D, OS X 10.11

  • All,

    I'm working on my first Qt3D app (and my first significant OpenGL project) under Qt 5.7, OS X 10.11, Xcode 7.3.1. By starting from some code in Qt3DExtras (and making lots of mistakes) I've been able to construct a working scenegraph. My code, based largely on the QPhongMaterial class, uses a QGraphicsApiFilter to select between GL2 and GL3 vertex and fragment shader snippets.

    I want to go strictly OpenGL 3+, but it seems I'm consistently loading the OpenGL 2 shaders: if I introduce a bug into the GL2 code I get an error message, but if I put a bug in the GL3 code there is no error message. This seems strange to me, since OS X 10.11 supports OpenGL up to 4.1 (Core Profile).

    I don't even know where to start debugging. With Qt3D acting as an abstraction layer on top of OpenGL, I don't know how to get at the OpenGL context to read the version strings, etc.
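    For reference, the version strings can be read outside Qt3D entirely, with a throwaway QOpenGLContext made current on a QOffscreenSurface. A minimal sketch (not Qt3D-specific; it just reports what context the platform will hand out):

    ```cpp
    #include <QGuiApplication>
    #include <QOffscreenSurface>
    #include <QOpenGLContext>
    #include <QOpenGLFunctions>
    #include <QDebug>

    int main(int argc, char *argv[])
    {
        QGuiApplication app(argc, argv);

        // A throwaway context, created only to read the version strings.
        QOpenGLContext context;
        if (!context.create())
            qFatal("Failed to create an OpenGL context");

        QOffscreenSurface surface;
        surface.setFormat(context.format());
        surface.create();

        context.makeCurrent(&surface);
        QOpenGLFunctions *f = context.functions();
        qDebug() << "GL_VERSION: "
                 << reinterpret_cast<const char *>(f->glGetString(GL_VERSION));
        qDebug() << "GLSL_VERSION:"
                 << reinterpret_cast<const char *>(f->glGetString(GL_SHADING_LANGUAGE_VERSION));
        qDebug() << "Qt format:  " << context.format();
        context.doneCurrent();
        return 0;
    }
    ```

    Note that without a QSurfaceFormat request this typically reports the legacy 2.1 context on OS X, which is itself a useful data point.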

  • Hi,

    When you create the view, have you selected an OpenGL version >= 3?
    In the Qt3DWindow constructor, the Qt team defines the surface format with the following code:

        resize(1024, 768);
        QSurfaceFormat format;
    #ifdef QT_OPENGL_ES_2
        format.setRenderableType(QSurfaceFormat::OpenGLES);
    #else
        if (QOpenGLContext::openGLModuleType() == QOpenGLContext::LibGL) {
            format.setVersion(4, 3);
            format.setProfile(QSurfaceFormat::CoreProfile);
        }
    #endif

    If you want to check whether you can load an OpenGL 3 shader, comment out the line where you add the GL2 technique to the QEffect:

       // _phongEffect->addTechnique(_phongAlphaGL2Technique);

    You can also check that you have the right #version directive at the top of your shader file (e.g. #version 150 core for an OpenGL 3.2 core profile shader).

  • I was able to get this working by explicitly setting the default QSurfaceFormat at the top of main():

        // Set default OpenGL version
        QSurfaceFormat format;
        format.setProfile( QSurfaceFormat::CoreProfile );
        QSurfaceFormat::setDefaultFormat( format );

    It's interesting that the code in the Qt3DWindow constructor doesn't seem to be effective (?).
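    One way to confirm the fix took effect is to print the requested default format next to the format the window actually obtained. A sketch, assuming a Qt3DExtras::Qt3DWindow named view (the variable name is illustrative):

    ```cpp
    #include <QGuiApplication>
    #include <QSurfaceFormat>
    #include <QDebug>
    #include <Qt3DExtras/Qt3DWindow>

    int main(int argc, char *argv[])
    {
        // Request a core profile before any GL surface is created.
        QSurfaceFormat format;
        format.setProfile(QSurfaceFormat::CoreProfile);
        QSurfaceFormat::setDefaultFormat(format);

        QGuiApplication app(argc, argv);
        Qt3DExtras::Qt3DWindow view;
        view.show();

        // Compare what was requested with what the platform handed back.
        qDebug() << "Requested:" << QSurfaceFormat::defaultFormat();
        qDebug() << "Obtained: " << view.format();
        return app.exec();
    }
    ```

    If "Obtained" still reports version 2.1, the core profile request never reached the platform, which points back at ordering: the default format has to be set before the window is constructed.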
