
How to process multiple cameras/videos with one filter in QML/Qt?



  • I have multiple Camera/video sources and I want to process them all concurrently with a single filter (QAbstractVideoFilter). I also need to show the processed frame results separately, which means my filter's run() must know which Camera/video each frame came from. How can I implement this? I think I have to pass a parameter to the filter from each VideoOutput, something like the code below, but how do I do that?

    MediaPlayer {
        id: mediaPlayer1
        source: "video1.mp4"
    }
    VideoOutput {
        id: videoOutput1
        source: mediaPlayer1
        filters: [FilterFrame](parameterVideo1)
    }
    ————————
    MediaPlayer {
        id: mediaPlayer2
        source: "video2.mp4"
    }
    VideoOutput {
        id: videoOutput2
        source: mediaPlayer2
        filters: [FilterFrame](parameterVideo2)
    }
    

  • Qt Champions 2018

    You can't do that with a QAbstractVideoFilter.

    The solution would be to instantiate multiple QAbstractVideoFilter objects and link them to your own object that does the processing.

    MultipleVideoProcessor {
        id: videoProcessor
    }
    MediaPlayer {
        id: mediaPlayer1
        source: "video1.mp4"
    }
    VideoOutput {
        id: videoOutput1
        source: mediaPlayer1
        filters: CustomVideoFilter {
            processor: videoProcessor
            parameter: "video1"
        }
    }
    ————————
    MediaPlayer {
        id: mediaPlayer2
        source: "video2.mp4"
    }
    VideoOutput {
        id: videoOutput2
        source: mediaPlayer2
        filters: CustomVideoFilter {
            processor: videoProcessor
            parameter: "video2"
        }
    }
    

    The QVideoFilterRunnables created by the filters will just proxy the frame to the actual processor, giving it an additional parameter.
    Something like this:

    QVideoFrame CustomVideoFilterRunnable::run(QVideoFrame *input, const QVideoSurfaceFormat &surfaceFormat, QVideoFilterRunnable::RunFlags flags)
    {
        return m_processor->run(input, surfaceFormat, flags, m_parameter);
    }
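
    For completeness, the declaration of that runnable might look roughly like this; the constructor, the member names and the MultipleVideoProcessor type are assumptions for illustration, not Qt API:

    #include <QVideoFilterRunnable>
    #include <QVideoFrame>
    #include <QVideoSurfaceFormat>
    #include <QString>

    class MultipleVideoProcessor; // the shared processing object, sketched further below

    class CustomVideoFilterRunnable : public QVideoFilterRunnable
    {
    public:
        // The filter creates one runnable per VideoOutput and hands it the
        // shared processor plus the per-video parameter.
        CustomVideoFilterRunnable(MultipleVideoProcessor *processor, const QString &parameter)
            : m_processor(processor), m_parameter(parameter) {}

        QVideoFrame run(QVideoFrame *input, const QVideoSurfaceFormat &surfaceFormat,
                        QVideoFilterRunnable::RunFlags flags) override; // body shown above

    private:
        MultipleVideoProcessor *m_processor; // not owned, lives as long as the QML object
        QString m_parameter;                 // e.g. "video1" or "video2"
    };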
    


  • Something in your sample code is not clear to me. Please explain a little more: where can I access (parameter: "video1") in the filter class, and where and how do I implement the MultipleVideoProcessor class in relation to CustomVideoFilter? I need multiple threads to run each video in the filters. Thanks again.


  • Qt Champions 2018

    Please explain a little more: where can I access (parameter: "video1") in the filter class?

    You declare a new type in C++ inheriting from QAbstractVideoFilter, with a processor Q_PROPERTY whose type is your processor class, a parameter Q_PROPERTY (or several) with whatever type you need, and expose it to QML.
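
    A minimal sketch of such a filter type, reusing the processor/parameter names from the example above; the property names, the runnable constructor and the registration call are illustrative assumptions, not something mandated by Qt:

    #include <QAbstractVideoFilter>
    #include "multiplevideoprocessor.h"      // hypothetical header for the processor sketched below
    #include "customvideofilterrunnable.h"   // hypothetical header for the runnable sketched above

    class CustomVideoFilter : public QAbstractVideoFilter
    {
        Q_OBJECT
        // Shared processor instance, assigned from QML: processor: videoProcessor
        Q_PROPERTY(MultipleVideoProcessor *processor READ processor WRITE setProcessor NOTIFY processorChanged)
        // Per-filter tag, assigned from QML: parameter: "video1"
        Q_PROPERTY(QString parameter READ parameter WRITE setParameter NOTIFY parameterChanged)

    public:
        MultipleVideoProcessor *processor() const { return m_processor; }
        void setProcessor(MultipleVideoProcessor *p) { m_processor = p; emit processorChanged(); }

        QString parameter() const { return m_parameter; }
        void setParameter(const QString &p) { m_parameter = p; emit parameterChanged(); }

        // One runnable is created per VideoOutput; it receives the shared
        // processor and this filter's parameter (see the runnable above).
        QVideoFilterRunnable *createFilterRunnable() override
        {
            return new CustomVideoFilterRunnable(m_processor, m_parameter);
        }

    signals:
        void processorChanged();
        void parameterChanged();

    private:
        MultipleVideoProcessor *m_processor = nullptr;
        QString m_parameter;
    };

    Both types then have to be registered so QML can instantiate them, for example qmlRegisterType<CustomVideoFilter>("MyFilters", 1, 0, "CustomVideoFilter") and likewise for MultipleVideoProcessor (the module name and version here are arbitrary).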

    where and how do I implement the MultipleVideoProcessor class in relation to CustomVideoFilter? I need multiple threads to run each video in the filters.

    Like in my example, you declare a function similar to QVideoFilterRunnable::run, but with the additional parameters you need for each video.
    The threading is handled by the multimedia framework: the run methods of your different filters will be called from different threads. Just make sure your processor class is thread-safe.
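
    A possible shape for that processor class, with a mutex around the shared state because run() will be called concurrently from the filters' threads; the class name, members and the per-video bookkeeping are assumptions used only to illustrate the thread-safety point:

    #include <QObject>
    #include <QMutex>
    #include <QMutexLocker>
    #include <QHash>
    #include <QSize>
    #include <QVideoFrame>
    #include <QVideoSurfaceFormat>
    #include <QVideoFilterRunnable>

    class MultipleVideoProcessor : public QObject
    {
        Q_OBJECT
    public:
        explicit MultipleVideoProcessor(QObject *parent = nullptr) : QObject(parent) {}

        // Called concurrently by the runnables of the different filters.
        QVideoFrame run(QVideoFrame *input, const QVideoSurfaceFormat &surfaceFormat,
                        QVideoFilterRunnable::RunFlags flags, const QString &parameter)
        {
            Q_UNUSED(surfaceFormat);
            Q_UNUSED(flags);

            // ... per-frame processing of *input goes here ...

            // Anything shared between the videos must be protected.
            QMutexLocker locker(&m_mutex);
            m_lastFrameSize[parameter] = input->size(); // per-video state, keyed by "video1"/"video2"

            return *input; // hand the (possibly modified) frame back to the VideoOutput
        }

    private:
        QMutex m_mutex;
        QHash<QString, QSize> m_lastFrameSize;
    };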

