Converting a QImage to a QByteArray and convert it back into an image
-
Hi everyone!
I need to convert a QImage to a QByteArray in order to send it through the network.
I tried to do it without the network part (i.e., convert the QImage to a QByteArray, and the QByteArray back to a new QImage), but my app crashes every time. Here's my code:
QVideoFrame vf(frame); // First, we need to convert this video frame into an image
vf.map(QAbstractVideoBuffer::ReadOnly);
QImage::Format imgFormat = QVideoFrame::imageFormatFromPixelFormat(vf.pixelFormat());
QImage img = QImage(vf.bits(), vf.width(), vf.height(), QVideoFrame::imageFormatFromPixelFormat(vf.pixelFormat()));
vf.unmap();

QSize imgSize = img.size();
QByteArray arr = QByteArray::fromRawData((const char*)img.bits(), img.byteCount());
if (arr.isEmpty() || arr.isNull())
    qDebug("empty");

// Let's suppose the QByteArray has been received by a client and we want to build an image from it
m_lastFrame = QImage((const uchar*)arr.data(), imgSize.width(), imgSize.height(), imgFormat);
m_label->setPixmap(QPixmap::fromImage(m_lastFrame));
What did I do wrong?
Also, is it possible to convert a QVideoFrame straight into a QByteArray and then do the reverse? Any help would be greatly appreciated!
-
AFAIU you are creating an image and storing it in a byte array using the image properties here:
QByteArray arr = QByteArray::fromRawData((const char*)img.bits(), img.byteCount());
This produces a byte array of a certain size. You are then restoring your QImage here:
m_lastFrame = QImage((const uchar*)arr.data(), imgSize.width(), imgSize.height(), imgFormat);
Are you sure that the properties stored in imgSize, which you are using for the restoration, match the actual size of the byte array?
Best is to use the debugger and check where the actual crash is happening.
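For example, a quick sanity check (just a sketch, reusing the variables from your snippet and assuming Format_RGB32, i.e. 4 bytes per pixel) would tell you whether the sizes line up:
// Sketch: does the byte array hold enough data for the image we are about to build?
const int expectedBytes = imgSize.width() * imgSize.height() * 4; // 4 bytes/pixel for Format_RGB32
qDebug() << "arr.size() =" << arr.size() << ", expected =" << expectedBytes;
if (arr.size() < expectedBytes)
    qWarning() << "Byte array is smaller than the image needs -- reconstructing would read past its end";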
-
@tomatoketchup said in Converting a QImage to a QByteArray and convert it back into an image:
my app crashes every time.
Which line does it crash on? Does your stack trace provide any hints?
QImage::Format imgFormat = QVideoFrame::imageFormatFromPixelFormat(vf.pixelFormat());
What is the value of imgFormat?
m_lastFrame = QImage((const uchar*)arr.data(), imgSize.width(), imgSize.height(), imgFormat);
You can use QImage::fromData() to convert a QByteArray straight into a QImage, without specifying sizes or using data pointers. (I'm guessing your crash is due to a wrong pointer or a wrong size parameter.)
Even better, you can also use QDataStream to convert a QImage directly into a QByteArray and back. If you are sending the image over a raw TCP connection, you can even use QDataStream to send the QImage into the QTcpSocket. However, I think using standard web protocols (using QByteArray) is more robust.
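For the QDataStream route, something like this (just a sketch; img is the QImage you obtained from the video frame) round-trips the whole image through a QByteArray, size and format included:
QByteArray bytes;
QDataStream out(&bytes, QIODevice::WriteOnly);
out << img;               // QImage has built-in QDataStream operators

// ...send `bytes` over the network; on the receiving side:
QDataStream in(bytes);    // read-only stream over the received byte array
QImage restored;
in >> restored;           // width, height and format are restored along with the pixels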
Also, is it possible to convert a QVideoFrame straight into a QByteArray and then do the reverse?
I don't think so. The QVideoFrame API does not know anything about QByteArray: http://doc.qt.io/qt-5/qvideoframe.html#map
Interestingly though, the QAbstractVideoBuffer docs mention constructing a QVideoFrame taking a QByteArray. Maybe the functionality was planned but not yet implemented.
-
@JKSH Thank you so much for your reply!
It seems like the program crashes on this line:
m_label->setPixmap(QPixmap::fromImage(m_lastFrame));
And what's even weirder is that, instead of the previous line, the following works:
m_label->setPixmap(QPixmap::fromImage(m_lastFrame).transformed(QTransform().rotate(180)).scaled(m_label->size(), Qt::KeepAspectRatio));
What's happening?
The value of imgFormat is QImage::Format_RGB32.
I tried using the QDataStream but my program becomes unresponsive.
Here's what I did:
QByteArray a;
QDataStream ds(&a, QIODevice::ReadWrite);
ds << img;
ds >> m_lastFrame; // now we write it to the new QImage
Is QDataStream fast enough to keep up with the video frames? (this code is called each time I receive a videoFrameProbed(QVideoFrame) signal)
Interestingly though, the QAbstractVideoBuffer docs mention constructing a QVideoFrame taking a QByteArray. Maybe the functionality was planned but not yet implemented.
I think it could be very useful and so much faster; we wouldn't have to convert the frames into images and back.
EDIT: By the way, do you know why the image I get from the QVideoFrame is upside down?
Here's a quick example:
QVideoFrame vf(frame);
vf.map(QAbstractVideoBuffer::ReadOnly);
QImage::Format imgFormat = QVideoFrame::imageFormatFromPixelFormat(vf.pixelFormat());
QImage img = QImage(vf.bits(), vf.width(), vf.height(), QVideoFrame::imageFormatFromPixelFormat(vf.pixelFormat()));
vf.unmap();
m_label->setPixmap(QPixmap::fromImage(m_lastFrame));
The displayed image in the QLabel is upside down, I have to apply a QTransform to flip it. Why?
Also, the QLabel is in a QGridLayout (I don't know if you need this information though).
-
Also, it works fine without the QByteArray. The following line works:
m_lastFrame = QImage(img.bits(), imgSize.width(), imgSize.height(), imgFormat); // img is copied
But I need to store the bytes of the image in an array so I can send the image over a TCP or UDP connection.
-
@tomatoketchup said in Converting a QImage to a QByteArray and convert it back into an image:
It seems like the program crashes on this line:
m_label->setPixmap(QPixmap::fromImage(m_lastFrame));
And what's even weirder is that, instead of the previous line, the following works:
m_label->setPixmap(QPixmap::fromImage(m_lastFrame).transformed(QTransform().rotate(180)).scaled(m_label->size(), Qt::KeepAspectRatio));
What's happening?
Sounds like memory corruption is occurring somewhere. These issues are a pain to track down.
What happens if you first create a deep copy of the QImage that you obtained from the QVideoFrame?
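For example (just a sketch of what I mean):
QImage img = QImage(vf.bits(), vf.width(), vf.height(),
                    QVideoFrame::imageFormatFromPixelFormat(vf.pixelFormat())).copy();
vf.unmap(); // the copy owns its own pixel data, so it stays valid after unmapping the frame
A QImage constructed directly on vf.bits() only references the mapped frame buffer; calling copy() detaches it from that buffer.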
I tried using the QDataStream but my program becomes unresponsive.
Unresponsive when you write it into the QDataStream? Or when you read it back out of the QDataStream?
Is QDataStream fast enough to keep up with the video frames? (this code is called each time I receive a videoFrameProbed(QVideoFrame) signal)
I don't know. It depends on multiple factors, like your frame rate, frame size, and CPU strength. Benchmark to find out.
(My initial gut feeling is that converting a video frame to an image then to a pixmap then displaying in a label is quite CPU-intensive and inefficient, but I don't have any solid data to prove either way)
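A simple way to benchmark (just a rough sketch) is to wrap the per-frame conversion in a QElapsedTimer:
QElapsedTimer timer;
timer.start();
// ... run the QVideoFrame -> QImage -> QByteArray conversion under test here ...
qDebug() << "conversion took" << timer.elapsed() << "ms"; // compare against your frame interval (e.g. ~33 ms at 30 fps)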
EDIT: By the way, do you know why the image I get from the QVideoFrame is upside down?
...
The displayed image in the QLabel is upside down, I have to apply a QTransform to flip it. Why?
No idea, sorry.
The best place to ask about the technical internals of Qt is the Interest mailing list (you need to subscribe first). Qt engineers are active on that list.
Also, the QLabel is in a QGridLayout (I don't know if you need this information though).
I don't think this matters. Nonetheless, you can simplify your test code by using a standalone QLabel.
@tomatoketchup said in Converting a QImage to a QByteArray and convert it back into an image:
But I need to store the bytes of the image in an array so I can send the image over a TCP or UDP connection.
QDataStream can send a QImage straight into a QTcpSocket or QUdpSocket.
Try it with a QImage that you load from a file. This way, you can investigate options while figuring out why the crashes happen.
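On the sending side that could look something like this (only a sketch; `socket` stands for an already-connected QTcpSocket of yours, and "test.png" is a hypothetical file name):
QImage testImage("test.png"); // any image file works for experimenting
QDataStream out(socket);      // the stream writes directly into the connected socket
out << testImage;             // serializes the image, including its size and format
For a real application you would normally also prepend the payload size (or use some other framing) so the receiver knows when a complete image has arrived.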
-
Problem fixed!
The image data has to be written into the buffer returned by the QImage::bits() method.
Here's the solution:
auto sz = img.byteCount();
uchar* a = new uchar[sz];
memcpy(a, img.bits(), sz);
m_lastFrame = QImage(imgSize.width(), imgSize.height(), imgFormat);
memcpy(m_lastFrame.bits(), a, sz); // We write the image data into the new QImage
delete[] a;
And with a QDataStream:
QByteArray arr;
QDataStream ds(&arr, QIODevice::ReadWrite);
ds.writeRawData((const char*)img.bits(), img.byteCount());
ds.device()->seek(0);
m_lastFrame = QImage(imgSize.width(), imgSize.height(), imgFormat);
ds.readRawData((char*)m_lastFrame.bits(), img.byteCount()); // We read the data directly into the image buffer
Thank you so much for your help, JKSH and koahnig!!