  • 0 Votes
    4 Posts
    690 Views
    SGaist

    What kind of data are these?

  • 0 Votes
    2 Posts
    775 Views
    D

    Ok solved

    It appears that a global QSurfaceFormat swap interval overrides the setSwapInterval() I set on my glWidget. It seems odd that the global setting overrides the local one even when the local one was explicitly specified; is this a bug?

    format.setSwapInterval(0);
    QSurfaceFormat::setDefaultFormat(format);

    overrides this on the widget:

    QSurfaceFormat format;
    format.setSwapInterval(0);
    setFormat(format);
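
    For what it's worth, ordering matters here. A minimal sketch of a setup that avoids the conflict, assuming you want vsync off everywhere (the QApplication/QOpenGLWidget usage is generic, not from the original post): set the default format before QApplication is constructed, and call setFormat() on the widget before it is first shown.

    #include <QApplication>
    #include <QOpenGLWidget>
    #include <QSurfaceFormat>

    int main(int argc, char **argv)
    {
        // Set the default format before QApplication is constructed so
        // every surface, including QOpenGLWidget's, picks it up.
        QSurfaceFormat format;
        format.setSwapInterval(0); // 0 disables vsync throttling
        QSurfaceFormat::setDefaultFormat(format);

        QApplication app(argc, argv);

        QOpenGLWidget widget;
        widget.setFormat(format); // per-widget override, before show()
        widget.show();

        return app.exec();
    }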
  • 0 Votes
    1 Posts
    619 Views
    No one has replied
  • 0 Votes
    1 Posts
    411 Views
    No one has replied
  • 0 Votes
    6 Posts
    1k Views
    D

    Hey

    Thanks so much for your help!

    I spent a while longer on the topic, learning more about the framebuffer and QOpenGLWidget, and now texture blitting. I got as far as this project: https://sourceforge.net/projects/opengl-blit/files/

    But I can't figure out why the blit does not work. I can see that the framebuffer object has the proper texture/render, but when I try to blit it I get nothing.

    Any help would be great!

    TIA.
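
    A common pitfall worth ruling out: QOpenGLWidget does not draw to the window-system framebuffer, so blitting to FBO 0 produces nothing. A minimal sketch of a blit inside a QOpenGLWidget's paintGL(), assuming a hypothetical, already-rendered QOpenGLFramebufferObject named fbo:

    QOpenGLExtraFunctions *f = QOpenGLContext::currentContext()->extraFunctions();

    f->glBindFramebuffer(GL_READ_FRAMEBUFFER, fbo->handle());
    // The widget renders into its own FBO, so the draw target must be
    // defaultFramebufferObject(), not 0.
    f->glBindFramebuffer(GL_DRAW_FRAMEBUFFER, defaultFramebufferObject());
    f->glBlitFramebuffer(0, 0, fbo->width(), fbo->height(),
                         0, 0, width(), height(),
                         GL_COLOR_BUFFER_BIT, GL_NEAREST);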

  • 0 Votes
    1 Posts
    455 Views
    No one has replied
  • 0 Votes
    11 Posts
    2k Views
    M

    Using the minimal example above, I fixed this by removing this call in our QOpenGLWidget subclass's constructor:

    setAttribute( Qt::WA_PaintOnScreen, true );

    Removing this got rid of the paintEngine() calls (and numerous other problems).

    Thanks!!!!

  • 0 Votes
    4 Posts
    2k Views
    E

    @SGaist
    Thank you so much! Your response was very helpful!

  • 0 Votes
    2 Posts
    862 Views
    K

    @dalishi
    You should not mix QPainter and raw OpenGL calls like that. Put the GL calls between beginNativePainting() and endNativePainting(), which are QPainter functions. It might also help to save the painter state before and restore it afterwards, depending on how the painter interferes with your GL state.
    The documentation for beginNativePainting() lists which GL states the painter changes.
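
    A minimal sketch of the pattern, assuming a QOpenGLWidget subclass (MyGLWidget is an illustrative name):

    void MyGLWidget::paintGL()
    {
        QPainter painter(this);
        painter.drawText(10, 20, "drawn by the painter");

        painter.beginNativePainting();
        // Raw GL calls are safe here; the painter has flushed its state.
        glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
        glEnable(GL_DEPTH_TEST);
        // ... your GL drawing ...
        painter.endNativePainting();

        painter.drawText(10, 40, "painter again, after the native block");
    }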

  • 0 Votes
    3 Posts
    1k Views
    SGaist

    Hi,

    Glad you found out, and thanks for sharing!

    What version of Qt are you using?
    On which platform?
    Did you check the bug report system to see if it's something known?

  • 0 Votes
    1 Posts
    800 Views
    No one has replied
  • 0 Votes
    2 Posts
    1k Views
    S

    I worked it out. It's because QOpenGLWidget renders into an FBO under the hood, so all calls get redirected to it, and I guess Qt doesn't do any special mapping of FBO color attachments to display buffers that would let GL_BACK_LEFT and friends keep working seamlessly.
    The solution was to use a QOpenGLWindow instead, and I'm hoping that QWidget::createWindowContainer() will let me continue to work as expected.
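
    A minimal sketch of that approach, assuming a plain QOpenGLWindow and a host widget (names are illustrative):

    #include <QApplication>
    #include <QOpenGLWindow>
    #include <QVBoxLayout>
    #include <QWidget>

    int main(int argc, char **argv)
    {
        QApplication app(argc, argv);

        // QOpenGLWindow renders to a native surface rather than an FBO,
        // so buffers such as GL_BACK_LEFT remain addressable.
        QOpenGLWindow *glWindow = new QOpenGLWindow;

        QWidget host;
        QVBoxLayout *layout = new QVBoxLayout(&host);
        layout->addWidget(QWidget::createWindowContainer(glWindow, &host));
        host.resize(640, 480);
        host.show();

        return app.exec();
    }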

  • 0 Votes
    1 Posts
    2k Views
    No one has replied
  • 0 Votes
    2 Posts
    2k Views
    SGaist

    Hi and welcome to devnet,

    Did you try a QStackedLayout with its stackingMode set to QStackedLayout::StackAll? A sketch follows below.

    Hope it helps
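
    A minimal sketch of the idea, with illustrative names (container, glWidget, overlay):

    QStackedLayout *layout = new QStackedLayout(container);
    layout->setStackingMode(QStackedLayout::StackAll); // all widgets stay visible
    layout->addWidget(glWidget);
    layout->addWidget(overlay);
    layout->setCurrentWidget(overlay); // the current widget is raised on top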

  • 0 Votes
    18 Posts
    6k Views
    jsulm

    @kickoune103 said in error after start example qt5.8 for raspberrypi:

    -v, it's used before.

    Then please post the actual configure output (at least the part where fontconfig is being tested).

  • 0 Votes
    13 Posts
    5k Views
    M

    @Devopia53 I tested on Qt 5.8/Win7 today, with the same incorrect result. But today I discovered a difference in the overlapping behavior on my two machines: on the machine with a dedicated video card, the fully transparent widget shows up black on a white OpenGL context :D, whereas the other machine with an integrated video card shows no color (as it should). So GUI transparency on top of an OpenGL context is very platform dependent. A workaround is to render the widget with OpenGL directly, but that is very inconvenient. I think Trolltech must solve this problem in the future.

  • 0 Votes
    8 Posts
    3k Views
    kshegunov

    @NRUB

    I was just expecting that QObject will be destroyed while QApplication is finished

    Why? Qt doesn't track all objects; it guarantees only that the parent will delete the child, nothing more. So if you don't give a QObject a parent, it will just float out there and nothing will delete it automatically, thus it will leak.

    is it possible to use QApplication instance as a parent for QObjects

    It is, because QCoreApplication (and consequently all its derived classes) is already a QObject. That's one of the reasons it's also sometimes referred to as the "root QObject".

    when my QWidget has no parent then it is alive as long as app is working and it is managed and deleted by app.

    You can't give a QWidget instance a QObject parent, but you can connect the QCoreApplication::aboutToQuit() signal to the QObject::deleteLater() slot, and this would solve your leakage problems.
    For example:

    int main(int argc, char ** argv)
    {
        QApplication app(argc, argv);

        QWidget * window = new QWidget();
        QObject::connect(&app, &QCoreApplication::aboutToQuit, window, &QObject::deleteLater);

        window->show();
        return QApplication::exec();
    }

    OS is removing leftovers when app is finishing

    Which means that the claimed memory will be released by the OS, but the problem here is that you actually need the heap-allocated objects' destructors to be invoked, which the OS will not do.

    which as we know works like a bridge between real addressing space and page addressing space which is translated by OS

    Yes, we know that, but there are two issues with this. Firstly, there may not be any difference between physical and logical addressing, which is common when the OS is run without paging; I run my Linux without a swap partition, for example. And secondly, paging has next to nothing to do with the problem here.

    in practice memory is not leaking when apps are destroyed

    Even if this were true, and it is for most OSes, not taking care to clean up your resources leads to all kinds of funky behavior. To take the simplest and most obvious example: a driver holding a global resource is not deinitialized, and the global resource is left locked for the next program that wants to use it. Then what, should we restart the whole system?

    Just leaving the OS to clean up, especially in such obvious circumstances, is in my opinion a lazy approach to programming that does more harm than good in the long run. Not only do a bit of discipline and a handful of good practices not hurt, but on the contrary, they will spare you the trouble of digging through deep stack traces unnecessarily ...

    Kind regards.

  • 0 Votes
    8 Posts
    6k Views
    beecksche

    @saket
    I think the driver of your graphics card isn't able to understand the GLSL code. Since I see you're using an Intel graphics card: I've had a lot of problems with my GLSL code on several machines with Intel graphics.

    Try changing the attribute and varying qualifiers to in/out:

    static const char *vertexShaderSource =
        "#version 140\n"
        "in vec4 posAttr;\n"
        "in vec4 colAttr;\n"
        "out vec4 col;\n"
        "uniform mat4 matrix;\n"
        "void main() {\n"
        "    col = vec4(1, 0, 0, 1);\n"
        "    gl_Position = matrix * posAttr;\n"
        "}\n";

    static const char *fragmentShaderSource =
        "#version 140\n"
        "in vec4 col;\n"
        "out vec4 fragcol;\n"
        "void main() {\n"
        "    //gl_FragColor = col;\n"
        "    fragcol = col;\n"
        "}\n";

    And try to get the info log from the shader after compilation (https://www.opengl.org/wiki/Shader_Compilation#Shader_error_handling):

    GLuint shader = glCreateShader(...);

    // Get strings for glShaderSource.
    glShaderSource(shader, ...);
    glCompileShader(shader);

    GLint isCompiled = 0;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &isCompiled);
    if (isCompiled == GL_FALSE)
    {
        GLint maxLength = 0;
        glGetShaderiv(shader, GL_INFO_LOG_LENGTH, &maxLength);

        // The maxLength includes the NUL terminator.
        std::vector<GLchar> errorLog(maxLength);
        glGetShaderInfoLog(shader, maxLength, &maxLength, &errorLog[0]);

        // Provide the info log in whatever manner you deem best.
        // Exit with failure.
        glDeleteShader(shader); // Don't leak the shader.
        return;
    }
    // Shader compilation was successful.
  • 0 Votes
    3 Posts
    2k Views
    SGaist

    Hi,

    What kind of cameras are you going to connect? Are they already supported by GStreamer?

  • 0 Votes
    2 Posts
    948 Views
    Fidchells_Eye

    You'll need to make the second GL widget's context current before using it.
    Each time you render or touch any OpenGL setup/parameters, make sure the right context is current for that widget first.
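
    A minimal sketch of what that means in practice, assuming two QOpenGLWidget pointers (glWidget1 and glWidget2 are illustrative names):

    glWidget1->makeCurrent();
    // ... GL calls that should affect glWidget1's context ...
    glWidget1->doneCurrent();

    glWidget2->makeCurrent();
    // ... GL calls that should affect glWidget2's context ...
    glWidget2->doneCurrent();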