Get frames from qml videoOutput and send to TcpSocket.
-
@SGaist We have an embedded device with a camera and a GUI application, and I want to view/control the camera's live view from my Android/iOS application. So each video frame from the embedded device should be transmitted to the mobile application.
QMediaPlayer is only for playing a media source, if I'm not wrong.
-
The link I provided shows you how to create a custom GStreamer pipeline where you can try to use a pipe so you can both show the stream on your device and have it streamed on the network.
-
Custom pipeline building in QMediaPlayer arrived in 5.12.2, as explained in the documentation.
However nothing stops you from using other means to create the pipeline.
-
@SGaist It's very hard to understand GStreamer. How can I pass the QML camera to GStreamer? Is there any example for streaming a live camera, with a sender and a receiver? I know this is a stupid question, but I'm not getting any pointers on where to start.
-
As I already wrote, don't do it in that order.
Stream your camera using dedicated software that will generate a real video stream.
Even if it's not simple, GStreamer lets you show the feed locally and send it over the network at the same time.
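As a rough sketch of that idea, GStreamer's `tee` element can split one feed into a local display branch and a network branch. This is a hypothetical command-line example, not code from the thread: `videotestsrc` stands in for the real camera source (e.g. `v4l2src`), and the host/port are placeholders.

```shell
# Hypothetical sketch: show a feed locally AND stream it over UDP at once.
# tee duplicates the stream; each branch gets its own queue so one branch
# cannot stall the other. Replace videotestsrc with your camera source.
gst-launch-1.0 -v videotestsrc ! videoconvert ! tee name=t \
    t. ! queue ! autovideosink \
    t. ! queue ! x264enc tune=zerolatency ! rtph264pay ! udpsink host=127.0.0.1 port=5000
```

The `queue` elements after the tee are important: without them, the two branches run in the same streaming thread and the pipeline can deadlock.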
-
Sample code of what?
You do not even say what solution you want to implement.
-
@sharath said in Get frames from qml videoOutput and send to TcpSocket.:
We have an embedded device with a camera and a GUI application, and I want to view/control the camera's live view from my Android/iOS application. So each video frame from the embedded device should be transmitted to the mobile application.
I want to use GStreamer to stream the video over the network (TCP). Please give some example code so that I can start.
-
Here is an example using the command line to stream a video feed from a Raspberry Pi. The streaming part works the same on a computer.
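A typical pipeline of that kind (a hypothetical sketch, not the exact example behind the link) pipes the Pi camera's H.264 output from `raspivid` into GStreamer for RTP packetisation; the resolution, host, and port below are placeholders:

```shell
# Hypothetical sketch: raspivid writes a raw H.264 stream to stdout (-o -),
# fdsrc reads it, h264parse frames it, and rtph264pay/udpsink send it as RTP.
# config-interval=1 resends SPS/PPS headers so late-joining receivers can decode.
raspivid -t 0 -w 1280 -h 720 -fps 25 -o - | \
    gst-launch-1.0 -v fdsrc ! h264parse ! rtph264pay config-interval=1 \
        ! udpsink host=192.168.1.10 port=5000
```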
-
Hello @SGaist, I just ran the following commands to understand GStreamer on the command line.
To send the current screen to the network:
gst-launch-1.0 -v ximagesrc use-damage=false xname=/usr/lib/torcs/torcs-bin ! videoconvert ! videoscale ! video/x-raw,format=I420,width=800,height=600,framerate=25/1 ! jpegenc ! rtpjpegpay ! udpsink host=127.0.0.1 port=5000
To receive:
gst-launch-1.0 udpsrc port=5000 ! application/x-rtp,encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink
Here it shares my current screen over the network using ximagesrc.
So how can I send only the current camera live view through GStreamer? What's the command for that?
-
Did you even try to search for "GStreamer camera source"?
-
Thank you @SGaist,
I tried the script below on the command line:
gst-launch-1.0 -v v4l2src device=/dev/video0 ! 'video/x-raw-yuv,width=640,height=480' ! x264enc pass=qual quantizer=20 tune=zerolatency ! rtph264pay ! udpsink host=127.0.0.1 port=1234
I got the error below:
(gst-launch-1.0:4420): GStreamer-WARNING **: 0.10-style raw video caps are being created. Should be video/x-raw,format=(string).. now.
WARNING: erroneous pipeline: could not link v4l2src0 to x264enc0
-
@sharath said in Get frames from qml videoOutput and send to TcpSocket.:
WARNING: erroneous pipeline: could not link v4l2src0 to x264enc0
The first hit for that search on DuckDuckGo gives several example pipelines to test with your own setup.
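To spell out what the warning points at: `video/x-raw-yuv` is GStreamer 0.10 caps syntax; in GStreamer 1.0 the media type is `video/x-raw`, so the caps filter matches nothing and the link to `x264enc` fails. A corrected sketch of the same pipeline (device, host, and port kept from the original command; untested against your hardware) might look like:

```shell
# Sketch with GStreamer 1.0 caps syntax: video/x-raw-yuv becomes video/x-raw,
# and videoconvert is added so whatever format v4l2src produces can be
# converted to something x264enc accepts.
gst-launch-1.0 -v v4l2src device=/dev/video0 \
    ! video/x-raw,width=640,height=480 \
    ! videoconvert \
    ! x264enc pass=qual quantizer=20 tune=zerolatency \
    ! rtph264pay \
    ! udpsink host=127.0.0.1 port=1234
```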