
Blackscreen when using void QMediaPlayer::setVideoOutput(const QVector<QAbstractVideoSurface *> &surfaces)



  • Hi Guys,

    currently I want to show 4 videos on 4 different displays. The videos should be synchronized. Therefore I created a QVideoWidget for every display, stored in

          QList<QVideoWidget*> vws = {}; 
    

    Then I created the QVector with the QAbstractVideoSurfaces

    QVector<QAbstractVideoSurface*> *surfaces = new QVector<QAbstractVideoSurface*>;

    for(int i = 0; i < qApp->screens().count(); i++){
        if( i != displayToUse_NotBlocking){
            VideoSurface *vs = new VideoSurface(vws.at(i)->videoSurface());

            QVideoSurfaceFormat format(qApp->screens().at(i)->geometry().size(), QVideoFrame::Format_RGB32);

            vws.at(i)->show();
            vws.at(i)->showFullScreen();

            vs->start(format);
            surfaces->append(vs);

            qDebug() << vs->surfaceFormat();
        }
    }
    

    Here VideoSurface is the class subclassing QAbstractVideoSurface:

    #ifndef VIDEOSURFACE_H
    #define VIDEOSURFACE_H
    
    #include <QtMultimedia>
    #include <QAbstractVideoSurface>
    #include <QVideoFrame>
    #include <QAbstractVideoBuffer>
    #include <QLabel>
    #include <QApplication>
    class VideoSurface : public QAbstractVideoSurface
    {
    public:
    
    VideoSurface(QObject *parent = nullptr) : QAbstractVideoSurface(parent) {}
    
    QList<QVideoFrame::PixelFormat> supportedPixelFormats(
            QAbstractVideoBuffer::HandleType handleType) const override
        {
            if (handleType == QAbstractVideoBuffer::NoHandle) {
                QList<QVideoFrame::PixelFormat> *list = new QList<QVideoFrame::PixelFormat>;
                list->append(QVideoFrame::Format_YUV420P);
                list->append(QVideoFrame::Format_RGB24);
                list->append(QVideoFrame::Format_BGR32);
                list->append(QVideoFrame::Format_BGR24);
                list->append(QVideoFrame::Format_RGB32);
    
                return *list;
            } else {
                return QList<QVideoFrame::PixelFormat>();
            }
        }
    
        // nothing is done with the image in here
    bool present(const QVideoFrame &frame) override
    {
        qDebug() << frame.pixelFormat();

        return true;
    }
};

#endif // VIDEOSURFACE_H
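    Since present() returns true without drawing anything, nothing ever reaches the screen through this surface. For reference, a present() that painted the frame itself would have to map the frame and convert it to a QImage first. A minimal sketch, assuming a CPU-mappable format such as Format_RGB32 and a hypothetical QLabel *label as the render target (neither is part of the class above):

    ```cpp
    // Sketch only, not the original class: maps the frame, wraps the pixels
    // in a QImage, and shows them on a (hypothetical) QLabel *label member.
    bool present(const QVideoFrame &frame) override
    {
        QVideoFrame copy(frame);                     // shallow copy, so we can map it
        if (!copy.map(QAbstractVideoBuffer::ReadOnly))
            return false;

        const QImage::Format imgFormat =
                QVideoFrame::imageFormatFromPixelFormat(copy.pixelFormat());
        if (imgFormat == QImage::Format_Invalid) {
            copy.unmap();
            return false;
        }

        // Deep-copy the image before unmapping, since the mapped bits go away.
        QImage image(copy.bits(), copy.width(), copy.height(),
                     copy.bytesPerLine(), imgFormat);
        label->setPixmap(QPixmap::fromImage(image.copy()));

        copy.unmap();
        return true;
    }
    ```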
    

    Then I added the output to a previously created QMediaPlayer and started the video.

    globalplayer->setVideoOutput(*surfaces);
    globalplayer->setMedia(QUrl::fromLocalFile("MyPath"));

    qDebug() << "isVideoAvailable: " << globalplayer->isVideoAvailable();
    qDebug() << "mediaStatus: " << globalplayer->mediaStatus();
    globalplayer->play();
    

    Now I only see a black screen, although a frame obviously arrives in the present() function (the qDebug output there is Format_RGB32).
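    In cases like this it can help to hook up the player's diagnostic signals before play(), so decoder failures are reported instead of silently producing a black screen. A sketch, assuming the QMediaPlayer *globalplayer from the snippet above:

    ```cpp
    // Sketch: report player errors and status changes instead of failing silently.
    // Assumes the QMediaPlayer *globalplayer from the snippet above.
    QObject::connect(globalplayer,
                     QOverload<QMediaPlayer::Error>::of(&QMediaPlayer::error),
                     [](QMediaPlayer::Error error) {
        qDebug() << "player error:" << error;
    });
    QObject::connect(globalplayer, &QMediaPlayer::mediaStatusChanged,
                     [](QMediaPlayer::MediaStatus status) {
        qDebug() << "media status:" << status;
    });
    ```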

    Does anyone have an idea what I'm doing wrong? Do I have to process the frame? If I only use globalplayer->setVideoOutput() with a single QVideoWidget, everything works fine.

    Kind regards,

    Kobe

    PS: If there is a way to show 4 different videos in sync, please let me know


  • Lifetime Qt Champion

    Hi and welcome to devnet,

    Any chance that your video returns YUV images? If memory serves well, that format is currently only rendered in QtQuick.

    On a side note, there's no need to allocate QList objects on the heap.
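    For example, the supportedPixelFormats() from the snippet above can simply return the list by value. A sketch of the same function, with no heap allocation and no leak:

    ```cpp
    // Same supportedPixelFormats() as above, with the QList as a plain value.
    QList<QVideoFrame::PixelFormat> supportedPixelFormats(
            QAbstractVideoBuffer::HandleType handleType) const override
    {
        if (handleType != QAbstractVideoBuffer::NoHandle)
            return {};

        return { QVideoFrame::Format_YUV420P,
                 QVideoFrame::Format_RGB24,
                 QVideoFrame::Format_BGR32,
                 QVideoFrame::Format_BGR24,
                 QVideoFrame::Format_RGB32 };
    }
    ```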



  • Hi thanks for your reply!!

    I converted the video using ffmpeg into YUV420p, which unfortunately doesn't work (isVideoAvailable: false, DirectShowPlayerService::doRender: Unresolved error code 0x80040266 ()). Do you know another way to play videos in sync? How about using QThreads or QtConcurrent?


  • Lifetime Qt Champion

    Neither will help.

    As written before, besides the platform limitations, to the best of my knowledge you can only get YUV in QtQuick.

    Are you doing any pre-processing with ffmpeg besides converting to YUV?

    Because depending on your file, getting the exact same frame at the same time for different videos will be highly complicated. Video files are not film reels where you always have 24/25 images per second. Depending on the format, compression, video content, etc., you will have a varying number of full frames, the rest being "created on the fly" by the decoder.



  • @SGaist No, I'm just converting. Thanks for your help!



  • Hi, why do you write your own VideoSurface class instead of just using vws.at(i)->videoSurface()?



    @Bonnie I tried this before and it didn't work. So I thought I might have to implement supportedPixelFormats() and present().



  • @kk17baba
    It works for me though. Can you post the code from when you tried that?
    You shouldn't write your own surface if you want to use QVideoWidget; otherwise you would also have to write your own widget for displaying the video (which is far more complicated).
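    A sketch of that approach, reusing vws, displayToUse_NotBlocking and globalplayer from the first post: each widget's built-in surface goes straight to the player, with no custom subclass and no manual start()/format handling.

    ```cpp
    // Sketch: pass the QVideoWidgets' own surfaces to the player.
    // vws, displayToUse_NotBlocking and globalplayer are assumed to be the
    // variables from the first post.
    QVector<QAbstractVideoSurface *> surfaces;
    for (int i = 0; i < qApp->screens().count(); ++i) {
        if (i == displayToUse_NotBlocking)
            continue;
        vws.at(i)->showFullScreen();
        surfaces.append(vws.at(i)->videoSurface()); // no start(), QVideoWidget handles it
    }
    globalplayer->setVideoOutput(surfaces);
    ```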



    I already gave up on this approach... Now I'm generating a QVideoWidget which is as wide and high as the full virtual desktop and I'm just playing one video. Before that, I generate a video which is exactly the width and height of the virtual desktop.

