
QImage::setColorTable for RGB16 does not work?



  • Hi,
    I have an issue related to setting a color table on RGB16 images. It looks like setColorTable does not take effect.
    I have created a dummy (256x256 px) image and a dummy color map. I expect to see an image containing "vertical bars" defined by the color map, but I get something completely different.
    Short example:

            //create data
            const unsigned short dimension = 256;
            unsigned short* fake_data = new unsigned short[dimension * dimension];
    
            //create dummy color table
            QVector<QRgb> fake_color_table(dimension*dimension);
            for (int i = 0; i < dimension*dimension; ++i)
                fake_color_table[i] = i % 2 ? 0xFFFF0000 : 0xFF0000FF;
    
            //fill data
            for (unsigned short f = 0; f < dimension; ++f) {
                for (unsigned short b = 0; b < dimension; ++b) {
                    unsigned short index = f*dimension + b;
                    fake_data[index] = index; // qConvertRgb32To16(color_table[index]);
                }
            }
    
            QImage image(reinterpret_cast<unsigned char*>(fake_data), dimension, dimension, sizeof(unsigned short) * dimension, QImage::Format_RGB16);
            image.setColorTable(fake_color_table);

  • Lifetime Qt Champion

    Hi
    As far as I understand, setColorTable only works with QImage::Format_Indexed8 and monochrome formats; see
    http://doc.qt.io/qt-5/qimage.html#image-information
    A minimal Indexed8 sketch is shown below.
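
    For illustration, a minimal sketch of the same "vertical bars" test using QImage::Format_Indexed8, where setColorTable does apply (the function name and the red/blue table are illustrative, not from the original post):

            #include <QImage>
            #include <QVector>

            // Build a 256x256 indexed image whose pixels are indices into a
            // 256-entry color table of alternating red/blue entries.
            QImage makeIndexedBars()
            {
                const int dimension = 256;

                // Indexed8 stores one byte per pixel, so the table has at most 256 entries.
                QVector<QRgb> colorTable(dimension);
                for (int i = 0; i < dimension; ++i)
                    colorTable[i] = (i % 2) ? 0xFFFF0000 : 0xFF0000FF;

                QImage image(dimension, dimension, QImage::Format_Indexed8);
                image.setColorTable(colorTable);

                // Each pixel value is an index into the color table; alternating
                // columns give the expected vertical red/blue bars.
                for (int y = 0; y < dimension; ++y)
                    for (int x = 0; x < dimension; ++x)
                        image.setPixel(x, y, x);

                return image;
            }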



  • @mrjj Thanks. I started developing too fast and missed (skipped ;p) the in-depth reading beforehand... sorry



  • @jakub_czana It's probably worth mentioning that indexed color modes aren't very widely supported any more, so using an indexed mode and a color table may result in much worse performance than just displaying a normal RGB32 image on modern systems, with the extra cost of having to manage the color tables manually. What exactly are you trying to do?



  • @wrosecrans Thanks for the clarification. This was a kind of exceptional case. I had a 16-bit image and a small color map which gives me a 32-bit ARGB value for each pixel. Performance was not crucial here, so I simply created a 32-bit data buffer, filled it with the desired ARGB values and created a QImage (ARGB32) out of it, roughly as sketched below. I just tried to use setColorTable but it did not work, so I asked here. Too soon, I guess, as I could have read more first...
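
    For reference, a minimal sketch of that workaround (the function name and parameters are illustrative, not from the original code), assuming the 16-bit values index into a small table of ARGB colors:

            #include <QImage>
            #include <QVector>

            // Expand 16-bit index data into a 32-bit ARGB buffer using a small color map.
            QImage expandToArgb32(const unsigned short *data, int width, int height,
                                  const QVector<QRgb> &colorMap)
            {
                QImage image(width, height, QImage::Format_ARGB32);
                for (int y = 0; y < height; ++y) {
                    QRgb *line = reinterpret_cast<QRgb *>(image.scanLine(y));
                    for (int x = 0; x < width; ++x) {
                        const unsigned short value = data[y * width + x];
                        // Fall back to opaque black if a value has no entry in the map.
                        line[x] = value < colorMap.size() ? colorMap[value] : 0xFF000000;
                    }
                }
                return image;
            }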



  • @wrosecrans I forgot to mention that the code I posted is just an example. It was a way for me to verify that I see the colors from the given color table on the screen.



  • @jakub_czana What format was the 16-bit image in? It's very unusual to have a 16-bit indexed image. I don't think I've ever run across something like that. Most 16-bit formats will already be some sort of RGB. You'd need a 65536-entry color table for a 16-bit indexed image. Even just packed 8-bit RGB values would make the color table as large as a whole 512x384 8-bit image, which is why I am still curious about what use case originally inspired your question.

    If you have a single channel of 16-bit data that you want to colorize for display, you can do that with a small LUT rather than a full color table. Something like viewing an X-ray in false color might use that sort of approach. You might want to look into a library like OpenColorIO for LUT handling if that's closer to your underlying use case.
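
    If it helps, a rough sketch of that LUT idea (the function name and the blue-to-red ramp are made up for illustration; OpenColorIO would handle real LUT files properly):

            #include <QColor>
            #include <QImage>
            #include <QVector>

            // Map a single channel of 16-bit samples to false color through a
            // 256-entry LUT; the high byte of each sample selects the LUT entry.
            QImage falseColor16(const unsigned short *samples, int width, int height)
            {
                QVector<QRgb> lut(256);
                for (int i = 0; i < 256; ++i)
                    lut[i] = QColor::fromHsv(240 - i * 240 / 255, 255, 255).rgb(); // blue -> red ramp

                QImage image(width, height, QImage::Format_RGB32);
                for (int y = 0; y < height; ++y) {
                    QRgb *line = reinterpret_cast<QRgb *>(image.scanLine(y));
                    for (int x = 0; x < width; ++x)
                        line[x] = lut[samples[y * width + x] >> 8];
                }
                return image;
            }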

