GLSL programming...



  • I have a QGLWidget with a GLSL program that uses the RGBA values in an RGBA32F texture as a vertex buffer, reading the vertices with a texel fetch in the vertex shader. For the most part, everything works. The issue I am having is in the paintGL function. Here is what I think I should be using, based on the Qt Cube example:
    @
    // BIND OUR VERTEX BUFFER SO THAT THE GRAPHICS CARD CALLS
    // THE VERTEX SHADER FOR EACH AND EVERY POINT IN THE SCAN
    glBindBuffer(GL_ARRAY_BUFFER, vboIds[0]);

    // Tell OpenGL programmable pipeline how to locate vertex position data
    int vertexLocation = program.attributeLocation("qt_vertex");
    program.enableAttributeArray(vertexLocation);
    glVertexAttribPointer(vertexLocation, 2, GL_SHORT, GL_FALSE, 2 * sizeof(short), (const void *)(0));
    
    // BIND THE INDEX BUFFER SO THAT THE DRAW ELEMENTS COMMAND KNOWS HOW TO CONNECT VERTICES
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, vboIds[1]);
    
    // BIND THE SCAN'S XYZA COORDINATES AS TEXTURES
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, texture);
    
    // TELL THE SHADER'S SAMPLER WHICH TEXTURE UNIT TO READ FROM
    program.setUniformValue("qt_texture", 0);
    
    // DRAW THE ELEMENTS
    glDrawElements(GL_TRIANGLES, numInds, GL_UNSIGNED_INT, 0);
    @

    However, this is not working. I have to replace the lines:
    @
    // Tell OpenGL programmable pipeline how to locate vertex position data
    int vertexLocation = program.attributeLocation("qt_vertex");
    program.enableAttributeArray(vertexLocation);
    glVertexAttribPointer(vertexLocation, 2, GL_SHORT, GL_FALSE, 2 * sizeof(short), (const void *)(0));
    @
    with
    @
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(2, GL_SHORT, 2 * sizeof(short), 0);
    @
    to get it to work. Does anyone know why this would be the case?
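
    For context, the initializeGL setup that the paintGL snippet above assumes would look roughly like the following. This is a minimal sketch, not the poster's actual code: cols, rows, xyzaData, vertexData, indexData, and numVerts are placeholders, while vboIds, texture, and numInds come from the snippet.
    @
    // CREATE THE RGBA32F TEXTURE THAT HOLDS ONE XYZA TEXEL PER VERTEX
    glGenTextures(1, &texture);
    glBindTexture(GL_TEXTURE_2D, texture);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F, cols, rows, 0, GL_RGBA, GL_FLOAT, xyzaData);

    // CREATE THE VERTEX BUFFER (TWO SHORTS PER VERTEX) AND THE INDEX BUFFER
    glGenBuffers(2, vboIds);
    glBindBuffer(GL_ARRAY_BUFFER, vboIds[0]);
    glBufferData(GL_ARRAY_BUFFER, numVerts * 2 * sizeof(short), vertexData, GL_STATIC_DRAW);

    // THE INDICES ARE GLuint TO MATCH GL_UNSIGNED_INT IN glDrawElements
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, vboIds[1]);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER, numInds * sizeof(GLuint), indexData, GL_STATIC_DRAW);
    @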



  • Okay, I found my mistake. In the vertex shader, I have the following:

    @
    #extension GL_EXT_gpu_shader4 : require

    uniform sampler2D qt_texture; // THIS TEXTURE HOLDS THE XYZ+TEXTURE COORDINATES
    uniform mat4 qt_projection; // THIS MATRIX IS THE PROJECTION MATRIX
    uniform int qt_rows; // THIS INTEGER HOLDS THE HEIGHT OF THE TEXTURES
    uniform int qt_cols; // THIS INTEGER HOLDS THE WIDTH OF THE TEXTURES
    uniform int qt_mode;

    attribute vec2 qt_vertex; // THIS VECTOR HOLDS THE K=1 DFT COEFFICIENTS

    varying float qt_magnitude; // THIS OUTPUT VALUE HOLDS THE TEXTURE COORDINATE
    varying vec3 qt_normal;

    void main(void)
    {
        // FIGURE OUT WHICH VERTEX THIS IS USING THE VERTEX ID
        int row = gl_VertexID / qt_cols;
        int col = gl_VertexID % qt_cols;
    @
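
    The snippet stops just before the fetch itself, but presumably it continues along these lines. This is only a sketch: texelFetch2D is the fetch that GL_EXT_gpu_shader4 provides for a sampler2D, and putting XYZ in .xyz with the texture coordinate in .w is an assumption based on the comments above.
    @
    // FETCH THIS VERTEX'S XYZA TEXEL DIRECTLY, WITHOUT FILTERING
    vec4 xyza = texelFetch2D(qt_texture, ivec2(col, row), 0);
    gl_Position = qt_projection * vec4(xyza.xyz, 1.0);
    qt_magnitude = xyza.w;
    @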

    But I'm not actually doing anything with qt_vertex. I declared it thinking it would receive the incoming vertices, but since the fetch goes through the texture instead, I never read qt_vertex anywhere in the shader. The compiler therefore eliminated the unused attribute from the linked program. So when my paintGL function tried to connect to it, there was nothing to connect to, and attributeLocation("qt_vertex") was returning -1.
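
    One way to make this failure loud instead of silent: QGLShaderProgram::attributeLocation() returns -1 for any attribute the compiler has eliminated, so paintGL can check the location before enabling anything. A sketch using the names from the snippets above:
    @
    // attributeLocation() RETURNS -1 WHEN THE ATTRIBUTE WAS OPTIMIZED OUT;
    // ENABLING AN ARRAY AT THAT INDEX IS AN ERROR, SO SKIP THE SETUP INSTEAD
    int vertexLocation = program.attributeLocation("qt_vertex");
    if (vertexLocation != -1) {
        program.enableAttributeArray(vertexLocation);
        glVertexAttribPointer(vertexLocation, 2, GL_SHORT, GL_FALSE, 2 * sizeof(short), (const void *)(0));
    } else {
        qWarning("qt_vertex is inactive; was it optimized out of the shader?");
    }
    @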

    My fix was to include a do-nothing reference to the qt_vertex variable in main(), so it now reads:
    @
    void main(void)
    {
        // WE NEED TO ACTUALLY USE QT_VERTEX FOR IT TO CONNECT TO THE OPENGL CODE
        qt_vertex;

        // FIGURE OUT WHICH VERTEX THIS IS USING THE VERTEX ID
        int row = gl_VertexID / qt_cols;
        int col = gl_VertexID % qt_cols;
    @

    Now everything works fine.
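
    A slightly sturdier variant of the same fix is to keep the dummy reference (so qt_vertex stays active) but bind the attribute's location yourself before linking instead of querying it afterwards. This is a sketch: the shader file paths and the choice of index 0 are mine, not from the original code, and QGLShaderProgram::bindAttributeLocation() has no effect if called after link().
    @
    // PIN qt_vertex TO INDEX 0 BEFORE LINKING; THE SHADER MUST STILL
    // REFERENCE qt_vertex OR THE COMPILER WILL ELIMINATE IT ANYWAY
    program.addShaderFromSourceFile(QGLShader::Vertex, ":/vertex.glsl");
    program.addShaderFromSourceFile(QGLShader::Fragment, ":/fragment.glsl");
    program.bindAttributeLocation("qt_vertex", 0);
    program.link();

    // paintGL CAN THEN USE THE KNOWN INDEX DIRECTLY
    program.enableAttributeArray(0);
    glVertexAttribPointer(0, 2, GL_SHORT, GL_FALSE, 2 * sizeof(short), (const void *)(0));
    @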

