
Textures and Android device

  • Textures on my Nexus 7 Android device aren't displayed at all. The older Qt samples work fine, but none of them are based on the new QOpenGL implementation while also displaying textures, AFAIK. Everything works fine whenever I build for my Linux system with the format set to OpenGL 3.3 Core. I've tried asking for help previously in this forum without any reply, so I don't really know what to do.

    I'd be very grateful if anyone could link or show me some sample code that displays a textured quad and also works on an Android device, because I've tried so much up until this point that I feel I'm losing my mind.

    Basically, everything works without any issue when I deploy on my computer, but in the Android build the textures are displayed as if I'd chosen an incorrect internal format when calling glTexImage2D(..).
    Calls to glGetError only return GL_NO_ERROR, and the shaders link without issue.

    Here's some sample code I'm using:
    @void Texture::create(const QString &imageFile, GLFuncs* funcs)
    {
        m_image = QImage(imageFile);
        int bpc = m_image.bitPlaneCount();
        funcs->glGenTextures(1, &m_boundTextureId);
        funcs->glBindTexture(GL_TEXTURE_2D, m_boundTextureId);

        GLenum format;
        GLenum internalFormat = bpc == 24 ? GL_RGB : GL_RGBA;

    #ifdef QT_OPENGL_ES_2
        format = internalFormat;
    #else
        format = bpc == 24 ? GL_BGR : GL_BGRA;
    #endif

        funcs->glTexImage2D(GL_TEXTURE_2D, 0, internalFormat, m_image.width(), m_image.height(), 0, format, GL_UNSIGNED_BYTE, m_image.bits());
        funcs->glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    }@

    By purposely mismatching internalFormat and format I can reproduce a similar result in the PC build; here's a screenshot of roughly how it looks on my Android device:

  • Small addendum:

    The reason I'm trying to draw textures with my own texture class is that QOpenGLTexture produces error 1280 (GL_INVALID_ENUM) at the 2nd assert, using the following code:
    @ assert(m_funcs->glGetError() == GL_NO_ERROR);
    m_texGround = new QOpenGLTexture(QImage(":/Textures/android.png"), QOpenGLTexture::DontGenerateMipMaps);
    assert(m_funcs->glGetError() == GL_NO_ERROR);@

    This isn't giving me any issues on a desktop build. On a project that I'm currently working on, I'm using all the new QOpenGL stuff and it's been working without any issues at all.

    I'd be happy to create a minimalistic project and upload it if anyone would be kind enough to take a peek. I'd ask on Stack Overflow, but figured this is more of an issue with Qt on Android than with OpenGL itself. I'm inclined to think the texture isn't really found, or some such.

  • You should convert the QImage to a known format. Right now m_image could be in any format, most of which are incompatible with GL_RGBA/GL_UNSIGNED_BYTE. On desktop OpenGL it might work since you are most likely getting an RGB32 or ARGB32 QImage which, on little-endian machines, is what GL_BGRA expects. The safe, generic solution is to do m_image = QImage(...).convertToFormat(QImage::Format_RGBA8888) and then always use GL_RGBA/GL_UNSIGNED_BYTE.

  • Thank you for your response, that conversion is really useful. Unfortunately it does not solve the issue. I took a picture with my phone and uploaded it to show the result more accurately.

    Desktop looks like this:
    Nexus 7 looks like this:

    This is my revised code after your insight:
    @void Texture::create(const QString &imageFile, GLFuncs* funcs)
    {
        m_image = QImage(imageFile).convertToFormat(QImage::Format_RGBA8888);
        funcs->glGenTextures(1, &m_boundTextureId);
        funcs->glBindTexture(GL_TEXTURE_2D, m_boundTextureId);
        funcs->glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, m_image.width(), m_image.height(), 0, GL_RGBA, GL_UNSIGNED_BYTE, m_image.bits());
    }@

  • Hi, I always use the function QGLWidget::convertToGLFormat to convert the QImage to a valid texture format. I don't know if it makes any difference compared to QImage::Format_RGBA8888, but if you look at the code, it is sadly implemented inside QGLWidget rather than simply in the QImage class. At least it works like this, even on Android and iOS!

    My code for the conversion:
    @QImage image = ...
    QImage texture = QGLWidget::convertToGLFormat(image);

    GLuint textureId;
    glGenTextures(1, &textureId);
    glBindTexture(GL_TEXTURE_2D, textureId);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, texture.width(), texture.height(), 0, GL_RGBA, GL_UNSIGNED_BYTE, texture.constBits());@

  • Thank you for your response. I've noticed this is what is done in the Texturing sample, which works fine on my Android device. I'd really like to use the new QWindow-based approach, but I may go back to QGLWidget if I cannot figure out the issue.

  • No, don't get me wrong, I use this approach with QML on Android; you don't have to use a QGLWidget, just call the static method QGLWidget::convertToGLFormat, which returns a QImage you can then use with OpenGL textures. Sorry for the confusion, you just have to include QGLWidget, and possibly the OpenGL module of course, but only for this one function. :)

  • Oh, I see! It doesn't change anything, but it's nice to know that at least this part works for you on Android/iOS as it must mean the problem is somewhere else in my code.

  • Ok, so I'm not entirely sure why it happened, but I know what part caused the issue. Also, Agocs's tip on format conversion works great as perhaps a cleaner alternative to the static QGLWidget function mentioned by Xander84.

    For testing purposes I created a number of primitive shapes that shared the same VertexArrayObject and VertexBuffer and used two shader programs, one for colored and one for textured. The issue seems to have been caused by setting the attribute buffers in the incorrect order or something. For reference, here's my initialization code. Please understand that it's just quick and dirty, neither production-worthy nor optimized:

    @ m_vao.create();
    if(!m_vertexbuffer.bind()) assert(false && "could not bind");
    const int VertexSizePos = sizeof(vec3);
    const int VertexSizeTexCoord = sizeof(vec2);
    const int VertexAttribSize = VertexSizePos + VertexSizeTexCoord;
    for(int i = 0; i < 36; i++) {
        m_vertexbuffer.write(i*VertexAttribSize, &m_cube.vertices_[i], VertexSizePos);
        m_vertexbuffer.write(i*VertexAttribSize+VertexSizePos, &m_cube.texCoords_[i%6], VertexSizeTexCoord);
    }
    int currSize = 36*VertexAttribSize;
    for(int i = 0; i < 6; i++) {
        m_vertexbuffer.write(currSize+i*VertexAttribSize, &m_plane.vertices_[i], VertexSizePos);
        m_vertexbuffer.write(currSize+i*VertexAttribSize+VertexSizePos, &m_plane.texCoords_[i%6], VertexSizeTexCoord);
    }
    m_progColored->setAttributeBuffer("inPosition", GL_FLOAT, 0, 3, VertexAttribSize);

    m_progTextured->setAttributeBuffer("inPosition", GL_FLOAT, 0, 3, VertexAttribSize);
    m_progTextured->setAttributeBuffer("inCoord", GL_FLOAT, VertexSizePos, 2, VertexAttribSize);@

    The issue was caused by changing the order such that the following code
    @ m_progColored->bind();
    m_progColored->setAttributeBuffer("inPosition", GL_FLOAT, 0, 3, VertexAttribSize);@
    was done after the texture program block.
    I guess what I really should have done was just keep to one shader program, or use a separate vertex buffer for each.

    Thank you both so much for pointing me in the right direction. Qt/Mobile is new to me, and I assumed it was an issue due to misconfiguration of an Android setting or restriction or some such.
