Modify a live camera stream and send it to an external entity without using a GUI.
This is my first time using Qt and developing an app on a Unix system, so here goes.
I'm working on a Raspberry Pi 2 with Raspbian (Debian Wheezy) installed, and I need to modify a live camera stream by adding a simple overlay before sending it out through the composite output on the RPi2. I've used VideoOutput in QML to capture and modify the stream, but I don't know where to go from there, as I don't know whether what I want is even possible with Qt.
I do not want the camera stream to be part of an application window, since the point is to stream the modified camera video, not the application itself, if that makes any sense. In theory, the app would produce a data stream containing the modified camera video, which I could then send through composite out using GStreamer or something similar.
Is this possible using Qt?
If yes, where do I need to look?
If no, what should I use instead?
Please ask if something is unclear; I'm horrid at explaining things :/
Hi and welcome to devnet,
You should have a look at QtGStreamer for that task. With it you'll be able to build the stream pipeline you need without a GUI.
Note that this module is not part of Qt itself.
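To give a rough idea of what a GUI-less pipeline looks like there, here is a minimal sketch (my own, untested on a Pi; it assumes QtGStreamer's C++ bindings and uses the stock `textoverlay` element purely as a stand-in for whatever overlay you actually need):

```cpp
// Sketch: a headless capture -> overlay -> output pipeline built with
// QtGStreamer. "textoverlay" is only a placeholder for the real overlay.
#include <QCoreApplication>
#include <QGst/Init>
#include <QGst/Parse>
#include <QGst/Element>

int main(int argc, char *argv[])
{
    QCoreApplication app(argc, argv);
    QGst::init(&argc, &argv);

    // Parse::launch builds the pipeline from a gst-launch style string.
    QGst::ElementPtr pipeline = QGst::Parse::launch(
        "v4l2src device=/dev/video0 "
        "! ffmpegcolorspace "
        "! textoverlay text=overlay "
        "! autovideosink");

    pipeline->setState(QGst::StatePlaying);
    return app.exec();   // keep the (GUI-less) event loop running
}
```

The point is that nothing here needs an application window: the pipeline runs inside a plain QCoreApplication event loop.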
Hope it helps
I'm not sure if it's a bad idea, but I would try deriving from QAbstractVideoSurface, which gives you access to each frame. In QAbstractVideoSurface::present(const QVideoFrame &frame) it should be possible to add the overlay.
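For what it's worth, a bare-bones version of that idea could look like this (a sketch only, assuming Qt 5's QtMultimedia; the class name and the plain text overlay are mine, and pushing the modified frame onwards is left out):

```cpp
// Sketch: a video surface that stamps a text overlay on every frame.
#include <QAbstractVideoSurface>
#include <QVideoFrame>
#include <QPainter>

class OverlaySurface : public QAbstractVideoSurface
{
public:
    QList<QVideoFrame::PixelFormat> supportedPixelFormats(
            QAbstractVideoBuffer::HandleType type
                = QAbstractVideoBuffer::NoHandle) const override
    {
        Q_UNUSED(type);
        return QList<QVideoFrame::PixelFormat>()
                << QVideoFrame::Format_RGB32;
    }

    bool present(const QVideoFrame &frame) override
    {
        QVideoFrame copy(frame);      // shallow copy so we can map it
        if (!copy.map(QAbstractVideoBuffer::ReadWrite))
            return false;

        // Wrap the mapped bytes in a QImage and paint the overlay in place.
        QImage image(copy.bits(), copy.width(), copy.height(),
                     copy.bytesPerLine(), QImage::Format_RGB32);
        QPainter painter(&image);
        painter.setPen(Qt::white);
        painter.drawText(10, 20, QStringLiteral("overlay"));
        painter.end();

        copy.unmap();
        // From here the modified frame would be handed to the outgoing
        // pipeline (e.g. a GStreamer appsrc) -- not shown.
        return true;
    }
};
```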
How can you stream the video through composite out without an application window, though? Can you do it by transferring the image data into a special buffer?
I should probably have mentioned this, but what I wanted was to connect two screens to the RPi2. I then wanted to watch the modified camera stream on the screen connected to the composite port, and the Pi desktop on the other screen connected via HDMI.
However, driving screens on both the HDMI port and the composite port at the same time is supposedly not even possible on the Pi. An alternative would be connecting a second screen through USB, but I figured that is more than I have either the time or the need to do. Being able to attach two screens was more of a bonus.
The modified camera stream is to be consumed by another screen and another application as if it were a plain video stream, which is why I wanted as few elements attached to it as possible.
I haven't got as far as trying QtGStreamer yet, as I had Qt 5 installed on the Pi and QtGStreamer needs Qt 4 plus a bunch of other libs, which ended with the Pi going bonkers because I discovered the Qt 4 requirement just a tad too late. I formatted the SD card and have downloaded Qt 4 + QtGStreamer again; hopefully I'll be able to try it out tomorrow.
I believe I need to build a pipeline to the camera device and then create a surface where I attach the pipeline together with QML. Theoretically. Accessing each frame one by one sounds like something that could lower the FPS, but for all I know this is exactly what happens in the background anyway when a pipeline is attached to a surface via QML. I will look into it more.
I'm not sure how a surface would differ from a normal GUI application window, but I'm hoping the surface is merely something the video needs to be attached to. Sure, I could make a borderless GUI, but from the sound of it, a surface feels more like what I want.
I haven't read about it since yesterday, though, and that is a long time ago, so excuse the possibly incorrect assumptions I just made.
QtGStreamer builds with Qt 5 as well; I did it last year for the Pi.
But I assume you have to download it from a repository and compile it yourself? Downloading and compiling Qt 5 takes forever as it is, so if what I want can be done with simple apt-get calls and Qt 4, that's enough for me. I'll try it, and if it doesn't work I'll move on to Qt 5 again :)
Indeed, native compilation of Qt 5 is a bit long; it's better to cross-compile it.
So, is there any magic involved in getting a decent FPS? I'm aware this is more related to the QtGStreamer and RPi2 forums, but I've got nothing to lose trying here too, as I assume others have been here before me.
The code I used just to verify that it works is from https://github.com/shadeslayer/qtgstreamer/tree/master/examples/qmlplayer. I removed the OpenGL part, as it slowed the FPS down to 1-2, and I removed the buttons to make it simpler.
A tl;dr of the setup is:
- QtGStreamer 0.10
- GStreamer 0.10
- QML linked to a surface via a sink
- playbin2 pipeline
- streaming directly from the RPi camera on /dev/video0
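In code, the playbin2 part of that list boils down to something like this (a sketch; the element and property names are the GStreamer 0.10 ones, and the commented line only indicates where a QML-linked surface's sink would be attached):

```cpp
// Sketch: create playbin2 and point it at the camera device.
// Assumes QGst::init() has already been called.
#include <QGst/Init>
#include <QGst/ElementFactory>
#include <QGst/Element>

void setupPlaybin()
{
    QGst::ElementPtr playbin = QGst::ElementFactory::make("playbin2");

    // playbin2 resolves the source and decoding chain by itself from a URI.
    playbin->setProperty("uri", "v4l2:///dev/video0");

    // The QML-linked surface would be attached through playbin2's
    // video-sink property, e.g.:
    // playbin->setProperty("video-sink", surface->videoSink());

    playbin->setState(QGst::StatePlaying);
}
```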
I've been having a hard time finding out what element names would work when building a pipeline, especially since I'm using GStreamer 0.10, so I'm wondering whether the low FPS is caused by a mix of the element names, the pipeline properties, and the (Qt)GStreamer version?
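For reference, one way to narrow that down is an explicit GStreamer 0.10 pipeline that pins the caps, taking playbin2's auto-plugging out of the picture so the raw capture path can be checked on its own (a sketch; the element names are standard 0.10 ones, and `autovideosink` is a placeholder for the QML surface's sink):

```cpp
// Sketch: explicit GStreamer 0.10 pipeline with fixed resolution and
// framerate, bypassing playbin2's auto-plugging. Assumes QGst::init()
// has already been called.
#include <QGst/Init>
#include <QGst/Parse>
#include <QGst/Element>

void runExplicitPipeline()
{
    QGst::ElementPtr pipeline = QGst::Parse::launch(
        "v4l2src device=/dev/video0 "
        "! video/x-raw-yuv,width=640,height=480,framerate=30/1 "
        "! ffmpegcolorspace "
        "! autovideosink");
    pipeline->setState(QGst::StatePlaying);
}
```

If this holds 30 fps but the playbin2 version doesn't, the problem is in the auto-plugged chain or the sink rather than the capture itself.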
Also, does QtGStreamer 0.10 support black-and-white (grayscale) video?