
Help in creating custom MediaPlayer component

  • I have been trying to create a custom player for iOS, but the surface creation proved quite complicated to understand. So I was wondering whether it's possible to just create a custom MediaPlayer that I pass to QML's VideoOutput component, so that it handles the video rendering for me.

    Something like this:

    import QtQuick 2.9
    import QtQuick.Window 2.3
    import QtQuick.Controls 2.2
    import QtMultimedia 5.9
    import QtQuick.Layouts 1.3
    //import MyCustomPlayer 1.0
    ApplicationWindow {
        visible: true
        visibility: Window.FullScreen
        // this would be replaced with my custom MediaPlayer defined in "MyCustomPlayer 1.0"
        MediaPlayer {
            id: player
            source: ""
            autoPlay: true
        }

        VideoOutput {
            id: videoOutput
            source: player
            anchors.fill: parent
        }
    }
    This way I would just pass my custom player to VideoOutput.

    Since the custom player should be quite similar to the original MediaPlayer itself, I was looking at the media player source code in QtMultimedia, trying to understand how the implementation works.

    I know how to make a basic component using qmlRegisterType, but what are the minimum steps to get the component working (receiving a video source and passing it to VideoOutput)?
    As far as I understand, the majority of the code for the iOS player is placed in the avfoundation folder.

    Or is it possible to include <QtMultimedia/qmediaplayer.h> and override some of its functionality to create the custom component?

    If anyone has done something similar, it would greatly help.
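    One way to approach this (a sketch only, assuming Qt 5, where VideoOutput's `source` accepts any object that exposes a `mediaObject` property of type `QMediaObject*`): wrap a stock QMediaPlayer in a QObject, expose that property, and register the wrapper with qmlRegisterType. The class name `MyCustomPlayer` and the DRM hook below are illustrative, not part of any Qt API.

    ```cpp
    // Sketch: "MyCustomPlayer" and its custom-source handling are hypothetical.
    #include <QMediaObject>
    #include <QMediaPlayer>
    #include <QObject>
    #include <QUrl>
    #include <QtQml/qqml.h>

    class MyCustomPlayer : public QObject
    {
        Q_OBJECT
        // In Qt 5, VideoOutput.source accepts any object exposing a
        // 'mediaObject' property of type QMediaObject*.
        Q_PROPERTY(QMediaObject *mediaObject READ mediaObject CONSTANT)
        Q_PROPERTY(QUrl source READ source WRITE setSource NOTIFY sourceChanged)

    public:
        explicit MyCustomPlayer(QObject *parent = nullptr)
            : QObject(parent), m_player(new QMediaPlayer(this)) {}

        QMediaObject *mediaObject() const { return m_player; }
        QUrl source() const { return m_source; }

        void setSource(const QUrl &url)
        {
            // Custom behaviour (e.g. rewriting the URL to point at a local
            // HLS proxy for DRM handling) would go here.
            m_source = url;
            m_player->setMedia(url);
            emit sourceChanged();
        }

    public slots:
        void play() { m_player->play(); }

    signals:
        void sourceChanged();

    private:
        QMediaPlayer *m_player;
        QUrl m_source;
    };

    // Registered once, e.g. in main():
    // qmlRegisterType<MyCustomPlayer>("MyCustomPlayer", 1, 0, "MyCustomPlayer");
    ```

    With that registered, the QML above should work with `MyCustomPlayer { id: player; ... }` in place of MediaPlayer and `VideoOutput { source: player }` unchanged. Note this only customizes behaviour around QMediaPlayer; the actual decoding still goes through the platform backend.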

  • Lifetime Qt Champion


    What would your custom player do?

  • @SGaist
    For the most part it would function the same as MediaPlayer, but I require additional functionality for Widevine DRM, so that would be the custom part.

    Even getting the original MediaPlayer as a separate component for my project would greatly help.

  • Lifetime Qt Champion

    At what level is that technology integrated? Shouldn't that be done through the OS frameworks?

  • It's at a lower level: it requires a localhost server that transforms MPD streams to HLS (native iOS does not support the MPD format), which are then sent to the player.

    But that's not the issue here; just getting a separate MediaPlayer-like component would solve all the integration problems.

  • Lifetime Qt Champion

    In that case, you should take a look at the plugins in QtMultimedia's sources. There are several implementations you can take inspiration from, integrating directly into the pipeline rather than having a custom item.

  • I assume that by integration into the pipeline you mean compiling QtMultimedia with my changes?
    If that's the case, then it's out of the question due to the complexity of the integration itself.
    I managed to get the native iOS player into a custom component, so I'll continue with that.

  • Lifetime Qt Champion

    No, I mean to make it a plugin so it can be loaded to play your custom type.

  • What would such an approach look like? So far I haven't done anything with plugins.
    If it's possible to modify the AVPlayer asset and delegate from the original MediaPlayer, I'd like to try it, but I would need some guidance on how to do it.

  • Lifetime Qt Champion

    Did you take a look at the QtMultimedia media player plugins ?

  • Yes I did, but I've never made anything like this, so there's not much I understand from there.

  • Lifetime Qt Champion

    Then study the GStreamer implementation, for example, and start from there.

    The pattern is to have a "session class" that does the work, and then provide all the interfaces you support, which will use said session class.
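    The session/control split described above can be sketched in plain C++. All names here (`PlayerSession`, `PlayerControl`, `demo`) are hypothetical; in a real QtMultimedia plugin these roles are played by a QMediaService subclass handing out control objects such as a QMediaPlayerControl implementation, all delegating to one session object that drives the platform player (AVPlayer on iOS).

    ```cpp
    #include <cassert>
    #include <string>

    // Plain-C++ sketch of the "session class" pattern; names are hypothetical.

    // The session does the actual work and holds the playback state.
    class PlayerSession {
    public:
        void load(const std::string &url) { m_url = url; m_state = "loaded"; }
        void play() { if (!m_url.empty()) m_state = "playing"; }
        void stop() { m_state = "stopped"; }
        const std::string &state() const { return m_state; }
    private:
        std::string m_url;
        std::string m_state = "idle";
    };

    // Each supported interface is a thin adapter forwarding to the session.
    class PlayerControl {
    public:
        explicit PlayerControl(PlayerSession *session) : m_session(session) {}
        void setMedia(const std::string &url) { m_session->load(url); }
        void play() { m_session->play(); }
    private:
        PlayerSession *m_session;
    };

    // Exercise the pattern: one session, with a control delegating to it.
    std::string demo() {
        PlayerSession session;
        PlayerControl control(&session);
        control.setMedia("http://localhost/stream.m3u8");
        control.play();
        return session.state();
    }

    int main() {
        assert(demo() == "playing");
        return 0;
    }
    ```

    The point of the split is that the session stays backend-specific while each Qt control interface remains a thin, mechanical adapter, so adding another supported interface doesn't duplicate playback logic.
    
    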