
Is it possible to have another process write to a QPixmap (when running on X11)? Or how else would you transfer video frames between processes?

  • We are trying to write a SkypeKit video client and the API unfortunately doesn't provide a built-in way for pushing decoded frames through the runtime to your client. Videohost can create its own window, but we want to get frames back to our QML application to draw all kinds of QML buttons on top.

    We are trying to create an invisible QPixmap and ask the video host to draw on it using X11 routines (on X11, QPixmap exposes a native handle via x11PictureHandle()). Unfortunately, along the way we run into BadDrawable errors, segmentation faults, and so on.

    Has anybody tried inter-process image transfer?
    Is it possible to ask one process to draw into an X11 window of another process?
    Or what would be another way of transferring the image?

  • I think you can use QSharedMemory.

  • I was thinking about it, just wasn't sure about two things:

    • Is it going to be fast enough for two video streams (incoming and preview)?

    • How complex is it to synchronize the updates? The video source is not a Qt app and its build process is too complex for me to bring Qt in, so we'd probably have to use system mutexes and make them cooperate with QMutexes on the receiving side, or something like that.

    Any advice on these?

  • Then how does your app ask the server to draw? Maybe that channel is already fast enough for the server to send your app the drawn QPixmap.

  • The reference videohost that I am basing my work on just creates its own X11 demo window (I think an OpenGL example exists as well) and paints to it. What I am trying to do is make it give me the rendered pixels instead, so that I can paint them within a QDeclarativeItem and put buttons and semi-transparent notifications on top.

    So we are exploring a couple of ways right now:

    1. Ask the server to paint to our own invisible window.
      That is the only reason for using QPixmap. Once the pixels are there, I can do whatever I want with them.

    2. Transmit rendered frames using interprocess communication (QSharedMemory?)

    Things are a little more complicated because the server is not using Qt and its build process is too complex for me to inject Qt into, so we have to use some POSIX-compliant, non-Qt-specific way of communicating.

  • I did something similar to your task a couple of years ago. As I understand it, you have access to the host's sources. You can create a TCP or UDP connection for communication, define some message types for requests and replies, and transfer TIFF images. It's a rather simple format, so it won't cause any difficulties.

  • Thanks for advice, p-himik.

    I do have access to the videohost's sources. They are too complex and non-Qt for me to attempt anything major, but I am able to make small modifications. I also have access to the Qt code for painting bits on the client from the picture format the host provides, so I don't even need the TIFF conversion; I just need to figure out how to transfer an array of bytes.

    As for TCP/UDP, do you think it can be fast enough for transferring two uncompressed full-HD streams, even on the same machine (Tegra 2 based)? I was thinking that shared memory could be faster and a little more tolerant of delays (the client can just skip some frames if it isn't fast enough).

  • UDP is slightly faster than TCP, but less reliable. If you plan to use the videohost from another machine, TCP is the best choice. And if not, QSharedMemory::nativeKey() can help you (though I haven't used it yet) interoperate with the POSIX API.
    Though I don't know most of the details, I think that using Qt code in the videohost's sources is the fastest and most reliable way. You don't need an event loop to use QSharedMemory. All you need is to link the videohost against the QtCore module and start using QSharedMemory.

  • Hmm, don't you need to integrate moc into the build somehow? Or is that only needed if you add Q_PROPERTY declarations yourself?

  • It is certainly possible to have one X11 process draw into a pixmap created by another by passing the Pixmap handle around, and I've written test code that does this. Unfortunately, for video drivers such as TV capture cards, I found that while the video4linux API specifies a Drawable, meaning both a Window handle and a Pixmap handle should work, in many cases only a Window handle is actually supported by the drivers. Since sharing pixmaps between processes requires additional work anyway to avoid race conditions, I doubt it is a good solution to your problem.
