Analog cameras, Qt and GStreamer
-
I'm trying to display video captured from some analog cameras (PAL video). They're controlled by a V4L2 driver which produces raw video buffers (NV12 format). I want to display the video using QML + Qt Quick. My system is a custom Linux built with Buildroot, running on a custom board with an ARM processor.
I've tried a simple program and also the declarative-camera example, but they're not working. At first, I got this output:
(app-test:221): GStreamer-WARNING **: Name 'video-encodebin' is not unique in bin 'camerabin', not adding
(app-test:221): GStreamer-WARNING **: Name 'videobin-filesink' is not unique in bin 'camerabin', not adding
(app-test:221): GStreamer-WARNING **: Name 'image-encodebin' is not unique in bin 'camerabin', not adding
(app-test:221): GStreamer-WARNING **: Name 'imagebin-filesink' is not unique in bin 'camerabin', not adding
(app-test:221): GStreamer-WARNING **: Name 'viewfinderbin-queue' is not unique in bin 'camerabin', not adding
CameraBin error: "Your GStreamer installation is missing a plug-in."
Running
gst-launch-1.0 camerabin
it complains about a missing Ogg muxer. So I enabled Ogg support in Buildroot, launched the program again, and now I get:

(app-test:241): GLib-GObject-CRITICAL **: g_object_unref: assertion 'G_IS_OBJECT (object)' failed
CameraBin error: "Device '/dev/video1' cannot capture in the specified format"
CameraBin error: "Internal data stream error."
But I wonder whether this kind of camera is even supported by the QML Camera item, or whether only digital cameras are.
I also need a bit more control over the cameras. For example, this V4L2 driver exposes multiple inputs you can choose from, and I need to select one. Can I do that from Qt?
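To be concrete about what I mean by "selecting an input": outside Qt it can be done with the plain V4L2 ioctls, roughly like this minimal sketch (the device path and input index are placeholders for my setup). The question is whether Qt Multimedia exposes this somehow.

/* Minimal sketch: selecting a V4L2 input with the standard ioctls.
 * "/dev/video1" and input index 1 are just placeholders. */
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>
#include <cstdio>

int main ()
{
    int fd = open ("/dev/video1", O_RDWR);
    if (fd < 0) {
        perror ("open");
        return 1;
    }

    int index = 1;                                 /* the input I want to select */
    if (ioctl (fd, VIDIOC_S_INPUT, &index) < 0)
        perror ("VIDIOC_S_INPUT");

    if (ioctl (fd, VIDIOC_G_INPUT, &index) == 0)   /* read back the active input */
        printf ("active input: %d\n", index);

    close (fd);
    return 0;
}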
I also need to ensure good performance and avoid the overhead of video conversions, so I'd rather use the raw format I get from the driver if possible. I've read somewhere that VideoOutput supports some raw formats, but I don't know which ones or how to use them. Does it support NV12? How can I get the NV12 video stream from the driver and show it in a VideoOutput element?
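From what I've read, VideoOutput can take as its source any object that exposes a videoSurface property, and the surface it hands back reports which pixel formats the backend accepts, so I imagine something along the lines of this minimal sketch (the class and method names are mine, and I don't know whether my platform's backend actually accepts NV12):

/* Minimal sketch (Qt 5): an object VideoOutput can use as its source and that
 * pushes raw NV12 QVideoFrames into the surface provided by VideoOutput.
 * Names are placeholders; NV12 support depends on the platform backend. */
#include <QObject>
#include <QAbstractVideoSurface>
#include <QVideoSurfaceFormat>
#include <QVideoFrame>

class Nv12Source : public QObject
{
    Q_OBJECT
    /* VideoOutput looks for a property named exactly "videoSurface" */
    Q_PROPERTY(QAbstractVideoSurface *videoSurface READ videoSurface WRITE setVideoSurface)

public:
    QAbstractVideoSurface *videoSurface() const { return m_surface; }

    void setVideoSurface(QAbstractVideoSurface *surface)
    {
        if (m_surface && m_surface->isActive())
            m_surface->stop();
        m_surface = surface;
    }

    /* Call this for every NV12 buffer coming from the driver,
     * already wrapped in a QVideoFrame. */
    void pushFrame(const QVideoFrame &frame)
    {
        if (!m_surface)
            return;

        if (!m_surface->isActive()) {
            /* Only start if the backend actually advertises NV12. */
            if (!m_surface->supportedPixelFormats().contains(QVideoFrame::Format_NV12))
                return;
            QVideoSurfaceFormat format(frame.size(), QVideoFrame::Format_NV12);
            if (!m_surface->start(format))
                return;
        }
        m_surface->present(frame);
    }

private:
    QAbstractVideoSurface *m_surface = nullptr;
};

On the QML side that object would be exposed as a context property and set as the source of the VideoOutput, but I may be missing something.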
I suspect that Qt Multimedia doesn't let me do all of this. So, do I have to use GStreamer directly? And how should I use it? I've seen that QtGStreamer exists but seems to be deprecated; can I still use it to output my video to a QML surface? Or do I have to use the new qmlglsink? In that case, how do I use it? I can't find any example.
Please consider that I'm just learning about all these things, so I would like the least complex solution that is still well supported for use with Qt Quick.
Thanks in advance
-
Hi. To manage V4L video cameras I'm using OpenCV; to manage RTSP video cameras and movie files (both reading and creating them) I'm using FFmpeg.
What exactly do you need to do? My approach is more complicated than using GStreamer, but it works very well and works everywhere.
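Roughly, the capture side with OpenCV is just this minimal sketch (the device index is only an example):

// Minimal OpenCV capture sketch; device index 1 is a placeholder
// (typically /dev/video1 on Linux).
#include <opencv2/opencv.hpp>

int main ()
{
    cv::VideoCapture cap (1);
    if (!cap.isOpened ())
        return 1;

    cv::Mat frame;
    while (cap.read (frame)) {
        // hand the frame over to your UI or processing code here
    }
    return 0;
}

Note that OpenCV hands you BGR frames by default, so this path does involve a format conversion.
-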
I'm sorry, I was testing some possible workarounds and forgot about this thread.
What I've finally done is use GStreamer directly (alone, not QtGStreamer) with the qmlglsink element. In my case I needed some fixes in GStreamer that are going to be included in version 1.15, but it's working.
About the Camera QML element, I still don't know whether it's supposed to work with analog cameras. I think it is, but I don't have it working yet because I think my driver is a bit buggy; I'll try to fix it.
I have tested this using the QML example from GStreamer, modifying the pipeline to use my cameras instead of videotestsrc.
This is the original example: https://cgit.freedesktop.org/gstreamer/gst-plugins-good/tree/tests/examples/qt/qmlsink?id=b03df1abf828514b0966db058388bf974ed0cf4f
And this is the pipeline I've used:
// Pipeline object
GstElement *pipeline = gst_pipeline_new (NULL);

// v4l2 source, selecting device /dev/video1
GstElement *src = gst_element_factory_make ("v4l2src", NULL);
g_object_set (src, "device", "/dev/video1", NULL);

// caps selection: format, width, height...
GstElement *capsfilter = gst_element_factory_make ("capsfilter", NULL);
GstCaps *caps = gst_caps_new_simple ("video/x-raw",
    "format", G_TYPE_STRING, "NV12",
    "width", G_TYPE_INT, 720,
    "height", G_TYPE_INT, 576,
    "framerate", GST_TYPE_FRACTION, 5, 1,
    NULL);
g_object_set (capsfilter, "caps", caps, NULL);

// upload to a GL texture (as qmlglsink expects; this also keeps the video
// processing on the GPU)
GstElement *glupload = gst_element_factory_make ("glupload", NULL);

// convert from the original NV12 format to a format accepted by qmlglsink (RGBA)
GstElement *glcolorconvert = gst_element_factory_make ("glcolorconvert", NULL);

// QML element sink
/* the plugin must be loaded before loading the qml file to register the
 * GstGLVideoItem qml item */
GstElement *sink = gst_element_factory_make ("qmlglsink", NULL);
g_object_set (sink, "sync", FALSE, NULL);

// add and link the elements in the pipeline
gst_bin_add_many (GST_BIN (pipeline), src, capsfilter, glupload, glcolorconvert, sink, NULL);
gst_element_link_many (src, capsfilter, glupload, glcolorconvert, sink, NULL);
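To complete the picture, here is roughly the glue code around that pipeline, following the upstream qmlsink example linked above. It assumes main.qml declares a GstGLVideoItem with objectName: "videoItem" (as in that example); the pipeline-building code from the previous snippet is just wrapped in a small helper here:

/* Minimal sketch of the surrounding glue code, based on the upstream qmlsink
 * example: load the QML, hand the GstGLVideoItem to qmlglsink, start playing. */
#include <gst/gst.h>
#include <QGuiApplication>
#include <QQmlApplicationEngine>
#include <QQuickItem>
#include <QQuickWindow>
#include <QUrl>

static GstElement *
build_pipeline (GstElement **sink_out)
{
  /* same elements as in the previous snippet, without the comments */
  GstElement *pipeline = gst_pipeline_new (NULL);
  GstElement *src = gst_element_factory_make ("v4l2src", NULL);
  GstElement *capsfilter = gst_element_factory_make ("capsfilter", NULL);
  GstElement *glupload = gst_element_factory_make ("glupload", NULL);
  GstElement *glcolorconvert = gst_element_factory_make ("glcolorconvert", NULL);
  GstElement *sink = gst_element_factory_make ("qmlglsink", NULL);

  GstCaps *caps = gst_caps_new_simple ("video/x-raw",
      "format", G_TYPE_STRING, "NV12",
      "width", G_TYPE_INT, 720,
      "height", G_TYPE_INT, 576,
      "framerate", GST_TYPE_FRACTION, 5, 1, NULL);

  g_object_set (src, "device", "/dev/video1", NULL);
  g_object_set (capsfilter, "caps", caps, NULL);
  g_object_set (sink, "sync", FALSE, NULL);
  gst_caps_unref (caps);

  gst_bin_add_many (GST_BIN (pipeline), src, capsfilter, glupload,
      glcolorconvert, sink, NULL);
  gst_element_link_many (src, capsfilter, glupload, glcolorconvert, sink, NULL);

  *sink_out = sink;
  return pipeline;
}

int main (int argc, char *argv[])
{
  int ret;

  gst_init (&argc, &argv);

  {
    QGuiApplication app (argc, argv);

    /* create the pipeline before loading the QML file, so the qmlglsink
     * plugin registers the GstGLVideoItem QML type */
    GstElement *sink = NULL;
    GstElement *pipeline = build_pipeline (&sink);

    QQmlApplicationEngine engine;
    engine.load (QUrl (QStringLiteral ("qrc:/main.qml")));

    /* find the GstGLVideoItem declared in QML and hand it to qmlglsink */
    QQuickWindow *rootObject =
        static_cast<QQuickWindow *> (engine.rootObjects ().first ());
    QQuickItem *videoItem = rootObject->findChild<QQuickItem *> ("videoItem");
    g_object_set (sink, "widget", videoItem, NULL);

    gst_element_set_state (pipeline, GST_STATE_PLAYING);

    ret = app.exec ();

    gst_element_set_state (pipeline, GST_STATE_NULL);
    gst_object_unref (pipeline);
  }

  gst_deinit ();

  return ret;
}

On the QML side, the only GStreamer-specific parts are the import of org.freedesktop.gstreamer.GLVideoItem and the GstGLVideoItem with objectName "videoItem", exactly as in the linked example.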
-
Hi @Pablo-J-Rogina, the question about the Camera element from QML still remains. I'd like to keep the post open for a few more days to see whether I solve it and/or anybody can give me some help with it. If not, I will mark it as solved in a few days.
-
Hi,
It might be a silly question, but are you sure your pipeline is running properly?
-
Hi @SGaist.
Sorry, I didn't see your reply. I've changed my forum settings to receive an email when someone replies to my posts.
I guess that the pipeline is not running properly, but I can't know, since I didn't build that pipeline myself. In my original post I was using the Camera QML element, and I have no control over the pipeline it tries to create.
In my later reply I show a pipeline made by me that is working. But it's still not possible to use the Camera QML element, which is supposed to handle all of this automatically.
-
Did you consider using the QtGStreamer module directly?
-
Do you mean the qmlglsink? Worth a try, yes.