Unsolved: Trying to capture a QImage of a video frame
-
I have been trying, unsuccessfully, to capture images of video frames played by a QMediaPlayer. I want to store the copied images in a QList<QImage*>. Attached is the application used in this effort. The app is a composite of code found at http://www.omg-it.works/how-to-grab-video-frames-directly-from-qcamera and the VideoWidget example from Qt. The FrameGrabber function present(const QVideoFrame &frame) is called when a frame is ready for display. It might be possible to get the data from the frame by copying the data at the bits() pointer into a QByteArray. This could then be retained in a QList<QByteArray*>, and the images could be constructed from the byte arrays. I have been unsuccessful at doing this.
FrameGrabberTwo.pro :

QT += core gui multimedia multimediawidgets

greaterThan(QT_MAJOR_VERSION, 4): QT += widgets

TARGET = FrameGrabberTwo
TEMPLATE = app

SOURCES += main.cpp \
    videoplayer.cpp \
    framegrabber.cpp

HEADERS += \
    videoplayer.h \
    framegrabber.h
framegrabber.h :
#ifndef FRAMEGRABBER_H
#define FRAMEGRABBER_H

#include <QAbstractVideoSurface>
#include <QList>
#include <QDebug>

class FrameGrabber : public QAbstractVideoSurface
{
    Q_OBJECT
public:
    explicit FrameGrabber(QObject *parent = 0);
    ~FrameGrabber();
    QList<QVideoFrame::PixelFormat> supportedPixelFormats(QAbstractVideoBuffer::HandleType handleType) const;
    bool present(const QVideoFrame &frame);

signals:
    void frameAvailable(QImage frame);

public slots:
};

#endif // FRAMEGRABBER_H
videoplayer.h :
#ifndef VIDEOPLAYER_H
#define VIDEOPLAYER_H

#include <QMediaPlayer>
#include <QMediaPlayerControl>
#include "framegrabber.h"
#include <QtGui/QMovie>
#include <QtWidgets/QWidget>
#include <QDebug>

QT_BEGIN_NAMESPACE
class QAbstractButton;
class QSlider;
class QLabel;
QT_END_NAMESPACE

class VideoPlayer : public QWidget
{
    Q_OBJECT
public:
    VideoPlayer(QWidget *parent = 0);
    ~VideoPlayer();
    FrameGrabber* grabber;
    int counter = 0;

public slots:
    void openFile();
    void play();

private slots:
    void mediaStateChanged(QMediaPlayer::State state);
    void positionChanged(qint64 position);
    void durationChanged(qint64 duration);
    void setPosition(int position);
    void handleError();
    void handleFrame(QImage);

private:
    void paintEvent(QPaintEvent * event);
    QMediaPlayer mediaPlayer;
    QAbstractButton *playButton;
    QSlider *positionSlider;
    QLabel *errorLabel;
    QImage frameImage;
    quint8 myred, mygreen, myblue;
};

#endif
main.cpp :
#include "videoplayer.h"
#include <QApplication>

int main(int argc, char *argv[])
{
    QApplication a(argc, argv);
    VideoPlayer* player = new VideoPlayer();
    player->setGeometry(300,300,640,480);
    player->show();
    int result = a.exec();
    delete player;
    return result;
}
framegrabber.cpp :
#include "framegrabber.h"

//---------- FrameGrabber ----------
FrameGrabber::FrameGrabber(QObject *parent) : QAbstractVideoSurface(parent)
{
    qDebug()<<"CONSTRUCTOR FrameGrabber";
}

//---------- ~FrameGrabber ----------
FrameGrabber::~FrameGrabber()
{
    qDebug()<<"DESTRUCTOR FrameGrabber";
}

//---------- supportedPixelFormats ----------
QList<QVideoFrame::PixelFormat> FrameGrabber::supportedPixelFormats(QAbstractVideoBuffer::HandleType handleType) const
{
    Q_UNUSED(handleType);
    return QList<QVideoFrame::PixelFormat>()
        << QVideoFrame::Format_ARGB32
        << QVideoFrame::Format_ARGB32_Premultiplied
        << QVideoFrame::Format_RGB32
        << QVideoFrame::Format_RGB24
        << QVideoFrame::Format_RGB565
        << QVideoFrame::Format_RGB555
        << QVideoFrame::Format_ARGB8565_Premultiplied
        << QVideoFrame::Format_BGRA32
        << QVideoFrame::Format_BGRA32_Premultiplied
        << QVideoFrame::Format_BGR32
        << QVideoFrame::Format_BGR24
        << QVideoFrame::Format_BGR565
        << QVideoFrame::Format_BGR555
        << QVideoFrame::Format_BGRA5658_Premultiplied
        << QVideoFrame::Format_AYUV444
        << QVideoFrame::Format_AYUV444_Premultiplied
        << QVideoFrame::Format_YUV444
        << QVideoFrame::Format_YUV420P
        << QVideoFrame::Format_YV12
        << QVideoFrame::Format_UYVY
        << QVideoFrame::Format_YUYV
        << QVideoFrame::Format_NV12
        << QVideoFrame::Format_NV21
        << QVideoFrame::Format_IMC1
        << QVideoFrame::Format_IMC2
        << QVideoFrame::Format_IMC3
        << QVideoFrame::Format_IMC4
        << QVideoFrame::Format_Y8
        << QVideoFrame::Format_Y16
        << QVideoFrame::Format_Jpeg
        << QVideoFrame::Format_CameraRaw
        << QVideoFrame::Format_AdobeDng;
}

//---------- present ----------
bool FrameGrabber::present(const QVideoFrame &frame)
{
    if (frame.isValid())
    {
        QVideoFrame cloneFrame(frame);
        cloneFrame.map(QAbstractVideoBuffer::ReadOnly);
        const QImage image(cloneFrame.bits(), cloneFrame.width(), cloneFrame.height(),
                           QVideoFrame::imageFormatFromPixelFormat(cloneFrame.pixelFormat()));
        //It appears that image is shallow, and does not actually generate a
        //new QImage.
        emit frameAvailable(image);
        //It might be possible to copy the bits() into a QByteArray and make it the parameter of
        //frameAvailable. It then would be used by the slot to construct an image. I tried this but
        //I obviously screwed it up. It might be possible to use the QVideoProbe signal
        //videoFrameProbed(const QVideoFrame &frame) but I have not tried it.
        cloneFrame.unmap();
        return true;
    }
    return false;
}
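[Editorial note: the shallow-image problem described in the comments above can be avoided by taking a deep copy of the QImage before the frame is unmapped. A sketch of an alternative present() body, assuming the pixel format maps to a valid QImage format (if it does not, imageFormatFromPixelFormat() returns QImage::Format_Invalid and a colour conversion is needed instead):]

```cpp
bool FrameGrabber::present(const QVideoFrame &frame)
{
    if (!frame.isValid())
        return false;

    QVideoFrame cloneFrame(frame);
    cloneFrame.map(QAbstractVideoBuffer::ReadOnly);

    const QImage::Format imgFormat =
        QVideoFrame::imageFormatFromPixelFormat(cloneFrame.pixelFormat());
    if (imgFormat != QImage::Format_Invalid) {
        // This QImage constructor only wraps the mapped buffer; copy()
        // forces a deep copy so the image stays valid after unmap().
        QImage image(cloneFrame.bits(),
                     cloneFrame.width(),
                     cloneFrame.height(),
                     cloneFrame.bytesPerLine(),
                     imgFormat);
        emit frameAvailable(image.copy());
    }
    cloneFrame.unmap();
    return true;
}
```

Note the bytesPerLine() argument: video buffers are often padded per scanline, so the width-only QImage constructor can shear the picture.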
videoplayer.cpp :
#include "videoplayer.h" #include <QtWidgets> #include <qvideowidget.h> #include <qvideosurfaceformat.h> VideoPlayer::VideoPlayer(QWidget *parent) : QWidget(parent) , mediaPlayer(0, QMediaPlayer::VideoSurface) , playButton(0) , positionSlider(0) , errorLabel(0) { qDebug()<<"CONSTRUCTOR VideoPlayer"; setAttribute(Qt::WA_QuitOnClose); grabber = new FrameGrabber(this); QVideoWidget *videoWidget = new QVideoWidget; //---------- QAbstractButton *openButton = new QPushButton(tr("Open...")); connect(openButton, SIGNAL(clicked()), this, SLOT(openFile())); playButton = new QPushButton; playButton->setEnabled(false); playButton->setIcon(style()->standardIcon(QStyle::SP_MediaPlay)); connect(playButton, SIGNAL(clicked()), this, SLOT(play())); positionSlider = new QSlider(Qt::Horizontal); positionSlider->setRange(0, 0); connect(positionSlider, SIGNAL(sliderMoved(int)), this, SLOT(setPosition(int))); errorLabel = new QLabel; errorLabel->setSizePolicy(QSizePolicy::Preferred, QSizePolicy::Maximum); QBoxLayout *controlLayout = new QHBoxLayout; controlLayout->setMargin(0); controlLayout->addWidget(openButton); controlLayout->addWidget(playButton); controlLayout->addWidget(positionSlider); QBoxLayout *layout = new QVBoxLayout; layout->addWidget(videoWidget); layout->addLayout(controlLayout); layout->addWidget(errorLabel); setLayout(layout); //---------- //By uncommenting this line, the video is played in the VideoPlayer window. 
//mediaPlayer.setVideoOutput(videoWidget); //This line sends mediaPlayer video output to the grabber mediaPlayer.setVideoOutput(grabber); //Playback at 1/3 speed to give time for frame capture mediaPlayer.setPlaybackRate(0.33); connect(&mediaPlayer, SIGNAL(stateChanged(QMediaPlayer::State)), this, SLOT(mediaStateChanged(QMediaPlayer::State))); connect(&mediaPlayer, SIGNAL(positionChanged(qint64)), this, SLOT(positionChanged(qint64))); connect(&mediaPlayer, SIGNAL(durationChanged(qint64)), this, SLOT(durationChanged(qint64))); connect(&mediaPlayer, SIGNAL(error(QMediaPlayer::Error)), this, SLOT(handleError())); //---------- //When the grabber has a video frame, it emits the frameAvailable signal. The VideoPlayer //catches the signal in the handleFrame slot. connect(grabber, &FrameGrabber::frameAvailable, this, &VideoPlayer::handleFrame); //---------- myred = 255; mygreen = 126; myblue = 0; } //---------- ~VideoPlayer ---------- VideoPlayer::~VideoPlayer() { qDebug()<<"DESTRUCTOR VideoPlayer"; } //---------- SLOT handleFrame ---------- void VideoPlayer::handleFrame(QImage image){ frameImage = image; if (counter++ == 3) qDebug()<<"width: "<<frameImage.width()<<" height: "<<frameImage.height(); //Code for List<QImage*>.append should be here. But, since frameImage has no data I have not //tried capturing the image here. //The update generates a call to paintEvent update(); } //---------- paintEvent ---------- void VideoPlayer::paintEvent(QPaintEvent * event){ QPainter painter(this); //This does not work. I wanted to draw the image just to see if it was intact at this point. painter.drawImage(QPoint(0,0),frameImage); //---------- myred += 3; mygreen += 5; myblue += 13; //This is just to show that paintEvent is being called and painter is working. 
painter.fillRect(QRect(20,20,30,30), QColor(myred, mygreen, myblue)); } //---------- openFile ---------- void VideoPlayer::openFile() { errorLabel->setText(""); QString fileName = QFileDialog::getOpenFileName(this, tr("Open Movie"),QDir::homePath()); if (!fileName.isEmpty()) { mediaPlayer.setMedia(QUrl::fromLocalFile(fileName)); playButton->setEnabled(true); } } //---------- play ---------- void VideoPlayer::play() { switch(mediaPlayer.state()) { case QMediaPlayer::PlayingState: mediaPlayer.pause(); break; default: mediaPlayer.play(); break; } } //---------- mediaStateChanged ---------- void VideoPlayer::mediaStateChanged(QMediaPlayer::State state) { switch(state) { case QMediaPlayer::PlayingState: playButton->setIcon(style()->standardIcon(QStyle::SP_MediaPause)); break; default: playButton->setIcon(style()->standardIcon(QStyle::SP_MediaPlay)); break; } } //---------- positionChanged ---------- void VideoPlayer::positionChanged(qint64 position) { positionSlider->setValue(position); } //---------- durationChanged ---------- void VideoPlayer::durationChanged(qint64 duration) { positionSlider->setRange(0, duration); } //---------- setPosition ---------- void VideoPlayer::setPosition(int position) { mediaPlayer.setPosition(position); } //---------- handleError ---------- void VideoPlayer::handleError() { playButton->setEnabled(false); errorLabel->setText("Error: " + mediaPlayer.errorString()); }
-
In the FrameGrabber::present(const QVideoFrame &frame) function, I changed the last image(...) parameter to
QVideoFrame::imageFormatFromPixelFormat(QVideoFrame::Format_RGB32)
This briefly generates some image panels in
VideoPlayer::paintEvent(QPaintEvent * event)
Then the application crashes. But at least this is some progress!
-
Hi! The documentation says
For format related functionality, you just have to describe the pixel formats that you support (and the nearestFormat() function). For presentation related functionality, you have to implement the present() function, and the start() and stop() functions.
Note: You must call the base class implementation of start() and stop() in your implementation.
Did you implement start() and stop()?
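[Editorial note: for reference, a minimal pair of overrides that satisfies that requirement might look like this. This is only a sketch, with the surface-format checks reduced to the bare minimum:]

```cpp
bool FrameGrabber::start(const QVideoSurfaceFormat &format)
{
    if (format.pixelFormat() == QVideoFrame::Format_Invalid)
        return false;
    // The base class implementation records the format and marks the
    // surface as active, so it must be called.
    return QAbstractVideoSurface::start(format);
}

void FrameGrabber::stop()
{
    // Likewise, the base class marks the surface as inactive.
    QAbstractVideoSurface::stop();
}
```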
-
@Wieland
Thanks Wieland! I am going thru the documentation. I am new to this and am a slow learner, but, hopefully will have the capture application functional in a few days (or weeks). In the meantime, I made some changes to the code, and it does capture the weird tiled images in a List<QImage*>.In videoplayer.h I added the QList<QImage*> and removed frameImage :
Thanks Wieland! I am going through the documentation. I am new to this and am a slow learner, but hopefully I will have the capture application functional in a few days (or weeks). In the meantime, I made some changes to the code, and it does capture the weird tiled images in a QList<QImage*>. In videoplayer.h I added the QList<QImage*> and removed frameImage :

private:
    void paintEvent(QPaintEvent * event);
    QMediaPlayer mediaPlayer;
    QAbstractButton *playButton;
    QSlider *positionSlider;
    QLabel *errorLabel;
    QList<QImage*> imageList;
    quint8 myred, mygreen, myblue;
In videoplayer.cpp several changes were made. In the handleFrame slot, a new image is created with image as a parameter. The pointer to this image is appended to imageList. The images are still screwed up, but at least they do store OK in imageList. imageList will generate a nasty memory leak upon quitting, and I need to write code in ~VideoPlayer to deal with it.
The application still crashes, and the images are still messed up, but it is making nice progress.

//---------- SLOT handleFrame ----------
void VideoPlayer::handleFrame(QImage image){
    //frameImage = image;
    QImage* imagePtr = new QImage(image);
    imageList.append(imagePtr);
    counter++;
    if ((counter - 3)%30 == 0){
        qDebug()<<"width: "<<imageList.at(imageList.size()-1)->width()
                <<" height: "<<imageList.at(imageList.size()-1)->height();
        imageList.at(imageList.size()-1)->save("/home/ken/Desktop/TestFolder/Test"+QString::number(counter),"PNG");
    }//if
    //The update generates a call to paintEvent
    update();
}
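[Editorial note: for the leak mentioned above, the destructor only needs to delete every QImage* stored in the list; Qt's qDeleteAll() does exactly that. A sketch:]

```cpp
VideoPlayer::~VideoPlayer()
{
    qDebug() << "DESTRUCTOR VideoPlayer";
    qDeleteAll(imageList);   // delete each QImage* in the list
    imageList.clear();       // then drop the now-dangling pointers
}
```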
In paintEvent, I removed the call to
//painter.drawImage(QPoint(0,0),frameImage);

-
Hi,
You should check the format of your QVideoFrame; it might be using a format not supported by QImage. Second thing: why are you using a
QList<QImage *>
rather than QList<QImage>?
-
@SGaist I put
qDebug()<<"pixelFormat: "<<cloneFrame.pixelFormat();
in
bool FrameGrabber::present(const QVideoFrame &frame)
and got the line
pixelFormat: Format_YUV420P

I am really having a difficult time understanding this format stuff. I will keep digging through the documentation. As for the use of QList<QImage*> rather than QList<QImage>, it's just a proclivity; I am more comfortable with pointers.
-
Then you'll have to do a YUV to RGB conversion before saving the image.
-
@SGaist Searching the web yields one forum posting which looks like it might work for converting YUV data into RGB. Here it is verbatim:
Haha, I went through this 6 weeks ago, but I only needed the Y channel. The frame is not laid out like RGB. This is for legacy reasons. Black-and-white TV presented a black-and-white frame (Y); when color TV was added, it was added in a backwards-compatible way, with that data between the Y frames. So what you have are actually 3 images per frame: the width x height B&W Y channel, then a subsampled Cb and Cr. Cb and Cr are subsampled by a factor of two, meaning you have:

[Y (width*height)] [Cb (width/2*height/2)] [Cr (width/2*height/2)]

so your y_ is right, but u_ is at (width*height) + (y *(width/2)) + x/2 but v_ is at (width*height) + ((width*height)/4) + (y *(width/2)) + x/2

Yes, this means that U and V are used for 4 pixels, but the Y channel is pixel for pixel.

________________________________
From: Rayner Pupo <rpgomez at uci.cu>
To: interest at qt-project.org
Sent: Friday, March 14, 2014 11:29 AM
Subject: [Interest] QVideoFrame and YUV question

Hi, I'm trying to create a QImage from a QVideoFrame but my video has YUV420P format, so converting it is not easy for me at all. By using a snippet from the histogram class from the player example provided by Qt I was able to read each pixel on the frame, but my question is: how can I decompose Y, Cb and Cr values from a single uchar? This is how I'm iterating over the video frame bits.

if (videoFrame.pixelFormat() == QVideoFrame::Format_YUV420P)
{
    QImage nImage(videoFrame.width(), videoFrame.height(), QImage::Format_RGB32);
    uchar *b = videoFrame.bits();
    for (int y = 0; y < videoFrame.height(); y++) {
        uchar *lastPixel = b + videoFrame.width();
        int wIndex = 0;
        for (uchar *curPixel = b; curPixel < lastPixel; curPixel++) {
            double y_ = *curPixel; ????????
            double u_ = ???????????;
            double v_ = ???????????;
            r = y_ + 1.402 * v_;
            g = y_ - 0.344 * u_ - 0.714 * v_;
            b = y_ + u_ * 1.772;
            nImage.setPixel(wIndex, y, qRgb(int(r), int(g), int(b)));
            ++wIndex;
        }
        b += videoFrame.bytesPerLine();
    }
    return nImage;
}
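[Editorial note: putting the plane layout described above into code, here is a self-contained sketch of a planar YUV420P to RGB conversion, written in plain C++ with no Qt so it can be tested in isolation. Note that, unlike the snippet quoted above, the chroma samples must be centred by subtracting 128 before applying the coefficients; the function name yuv420pToRgb is just an illustrative choice.]

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Convert one tightly packed YUV420P buffer (Y plane, then U, then V)
// to packed RGB. 'out' receives 3 bytes (R, G, B) per pixel.
static void yuv420pToRgb(const uint8_t *src, int w, int h, std::vector<uint8_t> &out)
{
    const uint8_t *yPlane = src;
    const uint8_t *uPlane = src + w * h;                 // U follows the full-size Y plane
    const uint8_t *vPlane = uPlane + (w / 2) * (h / 2);  // V follows the quarter-size U plane
    out.resize(static_cast<size_t>(w) * h * 3);

    auto clamp = [](double v) -> uint8_t {
        return static_cast<uint8_t>(std::min(255.0, std::max(0.0, v)));
    };

    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            const double Y = yPlane[y * w + x];
            // Each chroma sample covers a 2x2 block of luma pixels,
            // and is centred around 128.
            const double U = uPlane[(y / 2) * (w / 2) + x / 2] - 128.0;
            const double V = vPlane[(y / 2) * (w / 2) + x / 2] - 128.0;
            uint8_t *px = &out[(static_cast<size_t>(y) * w + x) * 3];
            px[0] = clamp(Y + 1.402 * V);               // R
            px[1] = clamp(Y - 0.344 * U - 0.714 * V);   // G
            px[2] = clamp(Y + 1.772 * U);               // B
        }
    }
}
```

In a Qt surface, src would be the mapped cloneFrame.bits() pointer, and the resulting RGB triples could be copied into a QImage::Format_RGB32 image (beware of per-scanline padding if bytesPerLine() differs from the width).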
I have changed my testing program so that the mapped QVideoFrame.bits() are captured, and placed into a QByteArray*. This data is brought out as a parameter in an emitted signal, and it is available for conversion in the targeted slot. Currently, I am using this raw data to create a QImage, but this image is screwed up. It appears to have U and V coloration data at the top and Y data at the bottom. The new source is provided below in a single file. It is clunky and inelegant, but it should serve as a testbed for getting images from video frames. Hopefully, someone will take it and solve the problems and post the solution!
//================== FrameGrabberTwo.pro ===================

QT += core gui multimedia multimediawidgets
greaterThan(QT_MAJOR_VERSION, 4): QT += widgets
TARGET = FrameGrabberTwo
TEMPLATE = app
SOURCES += main.cpp \
    videoplayer.cpp \
    framegrabber.cpp
HEADERS += \
    videoplayer.h \
    framegrabber.h

//================== framegrabber.h ===================

#ifndef FRAMEGRABBER_H
#define FRAMEGRABBER_H

#include <QAbstractVideoSurface>
#include <QList>
#include <QDebug>
#include <QTime>

class FrameGrabber : public QAbstractVideoSurface
{
    Q_OBJECT
public:
    explicit FrameGrabber(QObject *parent = 0);
    ~FrameGrabber();
    QList<QVideoFrame::PixelFormat> supportedPixelFormats(QAbstractVideoBuffer::HandleType handleType) const;
    bool present(const QVideoFrame &frame);
    QTime timer;
    int oldTime;

signals:
    void frameAvailable(QImage frame);
    void hereAreBytes(QByteArray*, int, int, int);

public slots:
};

#endif // FRAMEGRABBER_H

//================== videoplayer.h ===================

class VideoPlayer : public QWidget
{
    Q_OBJECT
public:
    VideoPlayer(QWidget *parent = 0);
    ~VideoPlayer();
    FrameGrabber* grabber;
    int counter = 0;
    QString s;
    QRect rect;

public slots:
    void openFile();
    void playPause();

private slots:
    void mediaStateChanged(QMediaPlayer::State state);
    void handleError();
    void handleBytes(QByteArray*, int, int, int);
    void doClose();

private:
    void paintEvent(QPaintEvent * event);
    QMediaPlayer mediaPlayer;
    QPushButton *playButton;
    QLabel *errorLabel;
    QPushButton *closeButton;
    QPushButton *openButton;
    bool closeIt = false;
    QList<QByteArray*> bytesList;
    int numBytes;
    int vpWidth, vpHeight;
};

#endif

//================== main.cpp ===================

#include "videoplayer.h"
#include <QApplication>

int main(int argc, char *argv[])
{
    QApplication a(argc, argv);
    VideoPlayer* player = new VideoPlayer();
    player->setGeometry(300,300,480,300);
    player->show();
    int result = a.exec();
    delete player;
    return result;
}

//================== framegrabber.cpp ===================

#include "framegrabber.h"

//---------- FrameGrabber ----------
FrameGrabber::FrameGrabber(QObject *parent) : QAbstractVideoSurface(parent)
{
    qDebug()<<"CONSTRUCTOR FrameGrabber";
    timer.start();
}

//---------- ~FrameGrabber ----------
FrameGrabber::~FrameGrabber()
{
    qDebug()<<"DESTRUCTOR FrameGrabber";
    stop();
}

//---------- supportedPixelFormats ----------
QList<QVideoFrame::PixelFormat> FrameGrabber::supportedPixelFormats(QAbstractVideoBuffer::HandleType handleType) const
{
    Q_UNUSED(handleType);
    return QList<QVideoFrame::PixelFormat>()
        << QVideoFrame::Format_ARGB32
        << QVideoFrame::Format_ARGB32_Premultiplied
        << QVideoFrame::Format_RGB32
        << QVideoFrame::Format_RGB24
        << QVideoFrame::Format_RGB565
        << QVideoFrame::Format_RGB555
        << QVideoFrame::Format_ARGB8565_Premultiplied
        << QVideoFrame::Format_BGRA32
        << QVideoFrame::Format_BGRA32_Premultiplied
        << QVideoFrame::Format_BGR32
        << QVideoFrame::Format_BGR24
        << QVideoFrame::Format_BGR565
        << QVideoFrame::Format_BGR555
        << QVideoFrame::Format_BGRA5658_Premultiplied
        << QVideoFrame::Format_AYUV444
        << QVideoFrame::Format_AYUV444_Premultiplied
        << QVideoFrame::Format_YUV444
        << QVideoFrame::Format_YUV420P
        << QVideoFrame::Format_YV12
        << QVideoFrame::Format_UYVY
        << QVideoFrame::Format_YUYV
        << QVideoFrame::Format_NV12
        << QVideoFrame::Format_NV21
        << QVideoFrame::Format_IMC1
        << QVideoFrame::Format_IMC2
        << QVideoFrame::Format_IMC3
        << QVideoFrame::Format_IMC4
        << QVideoFrame::Format_Y8
        << QVideoFrame::Format_Y16
        << QVideoFrame::Format_Jpeg
        << QVideoFrame::Format_CameraRaw
        << QVideoFrame::Format_AdobeDng;
}

//---------- present ----------
bool FrameGrabber::present(const QVideoFrame &frame)
{
    if (frame.isValid())
    {
        QVideoFrame cloneFrame(frame);
        cloneFrame.map(QAbstractVideoBuffer::ReadOnly);
        qDebug()<<"\npixelFormat: "<<cloneFrame.pixelFormat();
        qDebug()<<"mappedBytes: "<<cloneFrame.mappedBytes();
        int numBytes = cloneFrame.mappedBytes();
        QByteArray* bytesPtr = new QByteArray((const char*)(cloneFrame.bits()), numBytes);
        int eTime = timer.elapsed();
        qDebug()<<"time between frames "<<eTime - oldTime;
        oldTime = eTime;
        emit hereAreBytes(bytesPtr, numBytes, cloneFrame.width(), cloneFrame.height());
        cloneFrame.unmap();
        return true;
    }
    return false;
}

//================== videoplayer.cpp ===================

#include "videoplayer.h"
#include <QtWidgets>
#include <qvideowidget.h>
#include <qvideosurfaceformat.h>

VideoPlayer::VideoPlayer(QWidget *parent)
    : QWidget(parent)
    , mediaPlayer(0, QMediaPlayer::VideoSurface)
    , playButton(0)
    , errorLabel(0)
{
    qDebug()<<"CONSTRUCTOR VideoPlayer";
    setAttribute(Qt::WA_QuitOnClose);
    grabber = new FrameGrabber(this);
    QVideoWidget *videoWidget = new QVideoWidget;
    videoWidget->setWindowTitle("Select a Video");

    //----------
    openButton = new QPushButton("Open...", this);
    connect(openButton, SIGNAL(clicked()), this, SLOT(openFile()));
    playButton = new QPushButton;
    playButton->setEnabled(false);
    playButton->setIcon(style()->standardIcon(QStyle::SP_MediaPlay));
    connect(playButton, SIGNAL(clicked()), this, SLOT(playPause()));
    closeButton = new QPushButton("Close", this);
    closeButton->setEnabled(true);
    errorLabel = new QLabel;
    errorLabel->setSizePolicy(QSizePolicy::Preferred, QSizePolicy::Maximum);
    QBoxLayout *controlLayout = new QHBoxLayout;
    controlLayout->setMargin(0);
    controlLayout->addWidget(openButton);
    controlLayout->addWidget(playButton);
    controlLayout->addWidget(closeButton);
    QBoxLayout *layout = new QVBoxLayout;
    layout->addWidget(videoWidget);
    layout->addLayout(controlLayout);
    layout->addWidget(errorLabel);
    setLayout(layout);
    rect = QRect(200, 150, 150, 30);

    //----------
    //By uncommenting this line, the video is played in the VideoPlayer window.
    //mediaPlayer.setVideoOutput(videoWidget);
    //This line sends mediaPlayer video output to the grabber
    mediaPlayer.setVideoOutput(grabber);
    //Playback at 1/10 speed to give time for frame capture
    mediaPlayer.setPlaybackRate(0.10);
    connect(&mediaPlayer, SIGNAL(stateChanged(QMediaPlayer::State)),
            this, SLOT(mediaStateChanged(QMediaPlayer::State)));
    connect(&mediaPlayer, SIGNAL(error(QMediaPlayer::Error)), this, SLOT(handleError()));
    connect(grabber, &FrameGrabber::hereAreBytes, this, &VideoPlayer::handleBytes);
    connect(closeButton, SIGNAL(clicked()), this, SLOT(doClose()));

    //----------
    setWindowFlags(Qt::CustomizeWindowHint | Qt::WindowTitleHint);
}

//---------- ~VideoPlayer ----------
VideoPlayer::~VideoPlayer()
{
    qDebug()<<"DESTRUCTOR VideoPlayer";
    if ( ! bytesList.isEmpty()){
        int testIndex = 5;
        QImage testImage = QImage((const uchar*)(bytesList.at(testIndex)), vpWidth, vpHeight,
                                  QImage::Format_RGB32, 0, 0);
        QString fileName = QFileDialog::getSaveFileName(this, "Save Image",
                               "/home/ken/Desktop/testImag.png", "Images (*.png *.xpm *.jpg)");
        testImage.save(fileName,"PNG");
        QTime aDelay;
        aDelay.start();
        while (aDelay.elapsed() < 2000){
            int dummy = 0;
            dummy++;
        }
    }//if

    //----------
    if ( ! bytesList.isEmpty()){
        int num = bytesList.count();
        for (int i=0; i<num; i++){
            delete bytesList.at(i);
        }//for
    }//if
}

//---------- SLOT handleBytes ----------
void VideoPlayer::handleBytes(QByteArray* bytes, int nBytes, int w, int h)
{
    bytesList.append(bytes);
    numBytes = nBytes;
    vpWidth = w;
    vpHeight = h;
    if (closeIt) {
        QCoreApplication::quit();
    }
    counter++;
    s.clear();
    s = "Frame: " + QString::number(counter);
    update();
}

//---------- SLOT doClose ----------
void VideoPlayer::doClose()
{
    qDebug()<<" doClose";
    if (mediaPlayer.state() == QMediaPlayer::StoppedState){
        QCoreApplication::quit();
    }
    if ((mediaPlayer.state() == QMediaPlayer::PlayingState) ||
        (mediaPlayer.state() == QMediaPlayer::PausedState)){
        mediaPlayer.stop();
    }
    closeIt = true;
}

//---------- paintEvent ----------
void VideoPlayer::paintEvent(QPaintEvent * event){
    QPainter painter(this);
    painter.eraseRect(rect);
    painter.drawText(rect, 0, s);
}

//---------- openFile ----------
void VideoPlayer::openFile()
{
    errorLabel->setText("");
    QString fileName = QFileDialog::getOpenFileName(this, "Open Movie", QDir::homePath());
    if (!fileName.isEmpty()) {
        mediaPlayer.setMedia(QUrl::fromLocalFile(fileName));
        playButton->setEnabled(true);
    }
}

//---------- SLOT playPause ----------
void VideoPlayer::playPause()
{
    switch(mediaPlayer.state()) {
    case QMediaPlayer::PlayingState:
        mediaPlayer.pause();
        break;
    default:
        mediaPlayer.play();
        break;
    }
}

//---------- mediaStateChanged ----------
void VideoPlayer::mediaStateChanged(QMediaPlayer::State state)
{
    switch(state) {
    case QMediaPlayer::PlayingState:
        playButton->setVisible(false);
        openButton->setVisible(false);
        break;
    default:
        playButton->setIcon(style()->standardIcon(QStyle::SP_MediaPlay));
        break;
    }
}

//---------- handleError ----------
void VideoPlayer::handleError()
{
    playButton->setEnabled(false);
    errorLabel->setText("Error: " + mediaPlayer.errorString());
}
-
This is a really nice project. But it gives me data in a PNG file and the format is YUV. Can anyone tell me how to save the frame data directly in RGB32 format?
-
Hi and welcome to devnet,
What kind of RGB32 format do you have in mind?
In any case, if you get YUV data then you need to do a colour space conversion before saving.
-
Thanks for the reply. I want the RGB32 pixel format, because I want to display my video on a TI EVM's frame buffer, which only allows the RGB 32-bit format.
Can't it give me RGB32 directly?
Because of which syntax or argument am I getting this YUV420? -
Your device might be providing images in that format. Qt Quick (since 5.7 or 5.8) is at least capable of showing these directly.
-
Thanks for the reply. I am new to Qt, but right now I am first making a project on my Linux desktop system. Then I will cross-compile it for the target board; that's a later part. But now I want an RGB frame image. What should I change in the code, and where? Please suggest something.
-
In this program I am changing the testIndex but I am still getting the same frame.
frame_capture.pro
#-------------------------------------------------
# Project created by QtCreator 2017-02-15T11:45:09
#-------------------------------------------------

QT += core gui multimedia multimediawidgets

greaterThan(QT_MAJOR_VERSION, 4): QT += widgets

TARGET = frame_capture
TEMPLATE = app

SOURCES += main.cpp \
    frame_capture.cpp \
    v_player.cpp

HEADERS += frame_capture.h \
    v_player.h

frame_capture.h
#ifndef FRAME_CAPTURE_H
#define FRAME_CAPTURE_H

#include <QAbstractVideoSurface>
#include <QList>
#include <QDebug>
#include <QTime>

namespace Ui {
class frame_capture;
}

class frame_capture : public QAbstractVideoSurface
{
    Q_OBJECT

public:
    explicit frame_capture(QObject *parent = 0);
    ~frame_capture();
    QList<QVideoFrame::PixelFormat> supportedPixelFormats(QAbstractVideoBuffer::HandleType handleType) const;
    bool present(const QVideoFrame &frame);
    QTime timer;
    int oldTime;

signals:
    void frameAvailable(QImage frame);
    void hereAreBytes(QByteArray*, int, int, int);
};

#endif // FRAME_CAPTURE_H
v_player.h
#ifndef V_PLAYER_H
#define V_PLAYER_H

#include "frame_capture.h"

#include <QObject>
#include <QMediaPlayer>
#include <QVideoWidget>
#include <QtWidgets>

class v_player : public QWidget
{
    Q_OBJECT
public:
    v_player(QWidget *parent = 0);
    ~v_player();
    frame_capture *capture;
    int counter = 0;
    QString s;
    QRect rect;

private slots:
    void handleBytes(QByteArray*, int, int, int);
    void doClose();

private:
    void paintEvent(QPaintEvent *event);
    QMediaPlayer mediaPlayer;
    QPushButton *closeButton;
    bool closeIt = false;
    QList<QByteArray*> bytesList;
    int numBytes;
    int vpWidth, vpHeight;
    QList<QImage*> imageList;
    quint8 myred, mygreen, myblue;
};

#endif // V_PLAYER_H

frame_capture.cpp
#include "frame_capture.h"
#include "v_player.h"

frame_capture::frame_capture(QObject *parent) : QAbstractVideoSurface(parent)
{
    qDebug()<<"CONSTRUCTOR";
    timer.start();
}

//---------- supportedPixelFormats ----------
QList<QVideoFrame::PixelFormat> frame_capture::supportedPixelFormats(QAbstractVideoBuffer::HandleType handleType) const
{
    Q_UNUSED(handleType);
    return QList<QVideoFrame::PixelFormat>()
<< QVideoFrame::Format_ARGB32
<< QVideoFrame::Format_ARGB32_Premultiplied
<< QVideoFrame::Format_RGB32
<< QVideoFrame::Format_RGB24
<< QVideoFrame::Format_RGB565
<< QVideoFrame::Format_RGB555
<< QVideoFrame::Format_ARGB8565_Premultiplied
<< QVideoFrame::Format_BGRA32
<< QVideoFrame::Format_BGRA32_Premultiplied
<< QVideoFrame::Format_BGR32
<< QVideoFrame::Format_BGR24
<< QVideoFrame::Format_BGR565
<< QVideoFrame::Format_BGR555
<< QVideoFrame::Format_BGRA5658_Premultiplied
<< QVideoFrame::Format_AYUV444
<< QVideoFrame::Format_AYUV444_Premultiplied
<< QVideoFrame::Format_YUV444
<< QVideoFrame::Format_YUV420P
<< QVideoFrame::Format_YV12
<< QVideoFrame::Format_UYVY
<< QVideoFrame::Format_YUYV
<< QVideoFrame::Format_NV12
<< QVideoFrame::Format_NV21
<< QVideoFrame::Format_IMC1
<< QVideoFrame::Format_IMC2
<< QVideoFrame::Format_IMC3
<< QVideoFrame::Format_IMC4
<< QVideoFrame::Format_Y8
<< QVideoFrame::Format_Y16
<< QVideoFrame::Format_Jpeg
<< QVideoFrame::Format_CameraRaw
<< QVideoFrame::Format_AdobeDng;
}

//---------- present ----------
bool frame_capture::present(const QVideoFrame &frame)
{
    if (frame.isValid())
    {
        QVideoFrame cloneFrame(frame);
        cloneFrame.map(QAbstractVideoBuffer::WriteOnly);
        cloneFrame.pixelFormatFromImageFormat(QImage::Format_RGB32);
        int numBytes = cloneFrame.mappedBytes();
        QByteArray* bytesPtr = new QByteArray((const char*)(cloneFrame.bits()), numBytes);
        emit hereAreBytes(bytesPtr, numBytes, cloneFrame.width(), cloneFrame.height());
        cloneFrame.unmap();
        return true;
        qDebug()<<"\npixelFormat: "<<cloneFrame.pixelFormat();
        qDebug()<<"mappedBytes: "<<cloneFrame.mappedBytes();
    }
    return false;
}

frame_capture::~frame_capture()
{
    qDebug()<<"DESTRUCTOR";
}

main.cpp
#include <QApplication>
#include "v_player.cpp"

int main(int argc, char *argv[])
{
    QApplication a(argc, argv);
    v_player * player = new v_player();
    player->setGeometry(300,300,480,300);
    player->show();
    int result = a.exec();
    delete player;
    return result;
}

v_player.cpp
#include "v_player.h"

v_player::v_player(QWidget *parent) : QWidget(parent) , mediaPlayer(0, QMediaPlayer::VideoSurface)
{
    qDebug()<<"CONSTRUCTOR VideoPlayer";
    setAttribute(Qt::WA_QuitOnClose);
    capture = new frame_capture(this);
    QVideoWidget *videoWidget = new QVideoWidget;
    videoWidget->setWindowTitle("Select a Video");
    closeButton = new QPushButton("Close", this);
    closeButton->setEnabled(true);
    QBoxLayout *controlLayout = new QHBoxLayout;
    controlLayout->setMargin(0);
    controlLayout->addWidget(closeButton);
    controlLayout->addWidget(videoWidget);
    setLayout(controlLayout);
    rect = QRect(002, 25, 150, 30);
    //This line sends mediaPlayer video output to the grabber
    mediaPlayer.setVideoOutput(capture);
    //By uncommenting this line, the video is played in the VideoPlayer window.
    //mediaPlayer.setVideoOutput(videoWidget);
    //Playback at 1/10 speed to give time for frame capture
    mediaPlayer.setPlaybackRate(0.1);
    QString fileName = "/home/einfochips/Downloads/Countdown_Timer_Background _10_Seconds.mpg";
    if (!fileName.isEmpty()) {
        mediaPlayer.setMedia(QUrl::fromLocalFile(fileName));
        //playButton->setEnabled(true);
        mediaPlayer.play();
    }
    connect(capture, &frame_capture::hereAreBytes, this, &v_player::handleBytes);
    connect(closeButton, SIGNAL(clicked()), this, SLOT(doClose()));
    setWindowFlags(Qt::CustomizeWindowHint | Qt::WindowTitleHint);
}

//---------- ~VideoPlayer ----------
v_player::~v_player()
{
    qDebug()<<"DESTRUCTOR VideoPlayer";
    if ( ! bytesList.isEmpty())
    {
        int testIndex = 150;
        QImage testImage = QImage((const uchar*)(bytesList.at(testIndex)), vpWidth, vpHeight,
                                  QImage::Format_RGB32, 0, 0);
        QString fileName = QFileDialog::getSaveFileName(this, "Save Image",
            "/home/einfochips/sitara/AM335x/Qt5.7.0/prog/frame_capture/build-frame_capture-Desktop_Qt_5_7_0_GCC_64bit-Debug/frame/testimage.png");
        testImage.save(fileName,"PNG");
        QTime aDelay;
        aDelay.start();
        while (aDelay.elapsed() < 500)
        {
            int dummy = 0;
            dummy++;
        }
    }
    if ( ! bytesList.isEmpty())
    {
        int num = bytesList.count();
        for (int i=0; i<num; i++)
            delete bytesList.at(i);
    }
}

//---------- SLOT handleBytes ----------
void v_player::handleBytes(QByteArray* bytes, int nBytes, int w, int h)
{
    bytesList.append(bytes);
    numBytes = nBytes;
    vpWidth = w;
    vpHeight = h;
    if (closeIt)
        QCoreApplication::quit();
    counter++;
    s.clear();
    s = "Frame: " + QString::number(counter);
    qDebug() << counter << endl;
    update();
}

//---------- SLOT doClose ----------
void v_player::doClose()
{
    qDebug()<<" doClose";
    if (mediaPlayer.state() == QMediaPlayer::StoppedState)
        QCoreApplication::quit();
    if ((mediaPlayer.state() == QMediaPlayer::PlayingState) || (mediaPlayer.state() == QMediaPlayer::PausedState))
        mediaPlayer.stop();
    closeIt = true;
}

//---------- paintEvent ----------
void v_player::paintEvent(QPaintEvent * event)
{
    QPainter painter(this);
    painter.eraseRect(rect);
    painter.drawText(rect, 0, s);
}

-
Please, are you expecting us to analyse your complete application to find what index you may or may not modify?
-
No sir, not really. The basic code is the same as posted previously in this thread; this is just for reference. Can you directly suggest which API or function will change the pixel format of the frame to RGB and allow me to save RGB directly? Or do I need to add the conversion logic manually?
And the other thing I am not getting is why this gives me the same frame, or I would say the first frame only... -
It depends on the output you'll be using; e.g. with OpenGL, a shader will be used for that conversion.
Note that your grabber looks strange: you return all the types rather than just the one you would like to have, and then you map the video frame as write-only although you only read from it.
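[Editorial note: concretely, the two remarks above could be addressed as in the sketch below, assuming you only want RGB32 frames from the backend. Also note that QVideoFrame::pixelFormatFromImageFormat() is just a static lookup; calling it does not convert the frame, and passing the QByteArray pointer itself to the QImage constructor (as the posted destructor does) hands QImage the wrong address; the array's constData() would be needed.]

```cpp
QList<QVideoFrame::PixelFormat> frame_capture::supportedPixelFormats(
        QAbstractVideoBuffer::HandleType handleType) const
{
    Q_UNUSED(handleType);
    // Advertise only the format you actually want; a backend that can
    // deliver it will then hand you RGB32 frames directly.
    return QList<QVideoFrame::PixelFormat>() << QVideoFrame::Format_RGB32;
}

bool frame_capture::present(const QVideoFrame &frame)
{
    if (!frame.isValid())
        return false;
    QVideoFrame cloneFrame(frame);
    // Map read-only: the pixel data is only read, never written.
    cloneFrame.map(QAbstractVideoBuffer::ReadOnly);
    QByteArray *bytesPtr = new QByteArray((const char*)cloneFrame.bits(),
                                          cloneFrame.mappedBytes());
    emit hereAreBytes(bytesPtr, cloneFrame.mappedBytes(),
                      cloneFrame.width(), cloneFrame.height());
    cloneFrame.unmap();
    return true;
}
```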
-
Thank you for the reply. I tried as you said, but didn't get any change in the output. What do I do now?
OK, if you can suggest anything which can change the frame output (regardless of its format) rather than it being the same all the time at any instant I capture a frame. The output stays the same and is equal to the first frame. For instance, if I at least get different pictures for different frames, then for the format I'll convert manually, just for now. -
For converting YUV to RGB, Qt has a private API, qt_imageFromVideoFrame(), which can convert a QVideoFrame to a QImage in RGB format (I can't remember the exact RGB format it converts to).
P.S. It can't handle texture frames.
You could try it first and see whether it solves your problem.
To enable the private API, you have to add this line in your .pro file
QT += multimedia-private
And include this header file
#include "private/qvideoframe_p.h"
Remarks: this is a private API. Use it at your own risk.
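[Editorial note: a minimal sketch of how that private helper could be used inside present(), under the caveats above; the surrounding class is the FrameGrabber from earlier in this thread.]

```cpp
// In the .pro file: QT += multimedia-private
#include "private/qvideoframe_p.h"

bool FrameGrabber::present(const QVideoFrame &frame)
{
    if (!frame.isValid())
        return false;
    // qt_imageFromVideoFrame() performs the YUV-to-RGB conversion
    // internally (but cannot handle texture-backed frames).
    const QImage image = qt_imageFromVideoFrame(frame);
    if (!image.isNull())
        emit frameAvailable(image.copy()); // deep copy, detached from the frame data
    return true;
}
```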
-
Thank you for the reply. I tried it, but it is not working. I called it after my clone-frame code (in framegrabber.cpp, in the present() function). Is that the right position? I am new to Qt and don't have much idea about it.