Get frames from qml videoOutput and send to TcpSocket.
-
Hi,
What OS are you on?
-
Then what about using tools already made for that?
For example, use GStreamer to stream your video and then use your application just as a visualizer. -
Why is it constrained to your application only?
If you have a recent enough version of Qt, you might be able to use QMediaPlayer.
Otherwise, you can also build the pipeline yourself using the GStreamer API.
-
wrote on 2 Feb 2020, 04:27 last edited by sharath 2 Feb 2020, 04:28
@SGaist We have an embedded device which has a camera and a GUI application, and I want to view/control the live view of that camera from my Android/iOS application. So each video frame from my embedded device should be transmitted to the mobile application.
QMediaPlayer only plays a media source, if I'm not wrong.
-
The link I provided shows you how to create a custom GStreamer pipeline where you can try to use a pipe so you can both show the stream on your device and have it streamed on the network.
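To make the "pipe" idea concrete, here is a minimal sketch of such a pipeline using GStreamer's tee element. The device path, host address, and port are assumptions for illustration; it also assumes the v4l2 and x264 GStreamer plugins are installed:

```shell
# Read the camera, split the stream with tee: one branch displays it
# locally, the other H.264-encodes it and sends RTP over UDP.
gst-launch-1.0 v4l2src device=/dev/video0 ! videoconvert ! \
    tee name=t \
    t. ! queue ! autovideosink \
    t. ! queue ! x264enc tune=zerolatency ! rtph264pay ! \
         udpsink host=192.168.1.10 port=5000
```

The queue elements after each tee branch are important: without them the two branches can deadlock while negotiating buffers.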
-
Custom pipeline building in QMediaPlayer came in 5.12.2, as explained in the documentation.
However, nothing stops you from using other means to create the pipeline.
-
wrote on 7 Feb 2020, 04:15 last edited by sharath 2 Jul 2020, 04:17
@SGaist It's very hard to understand GStreamer. How can I pass a QML camera to GStreamer? Is there any example of streaming a live camera, like a sender and a receiver? I know this is a stupid question, but I'm not getting any input on where to start.
-
As I already wrote, don't do that in that order.
Stream your camera using a dedicated software that will generate a real video stream.
Even if it's not simple, GStreamer allows you to show the feed locally and send it over the network at the same time.
-
-
Sample code of what?
You do not even say which solution you want to implement.
-
wrote on 14 Feb 2020, 04:31 last edited by
@sharath said in Get frames from qml videoOutput and send to TcpSocket.:
we have embedded device which has camera and GUI application and I want to view/control the liveview of that camera from my android/iOS application. So each video frames from my embedded device should transmit to mobile application.
I want to use GStreamer to stream the video over the network (TCP). Give some example code so that I can start.
-
Here you have an example using the command line to stream a video feed from a RaspberryPi. The streaming part will work the same on a computer.
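The linked example is not reproduced here, but on a RaspberryPi the camera is typically read with the rpicamsrc element (from the gst-rpicamsrc plugin; whether it is installed on your image is an assumption), which can deliver hardware-encoded H.264 directly. A sketch of a Pi-side sender along those lines, with a made-up receiver address:

```shell
# Hypothetical Pi-side sender: hardware H.264 from the Pi camera,
# packetized as RTP and sent over UDP to the receiver.
gst-launch-1.0 rpicamsrc bitrate=1000000 ! \
    'video/x-h264,width=640,height=480,framerate=25/1' ! \
    h264parse ! rtph264pay ! udpsink host=192.168.1.10 port=5000
```

On a regular computer only the source element changes (e.g. v4l2src plus a software encoder); the streaming part stays the same.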
-
wrote on 17 Feb 2020, 07:25 last edited by
Hello @SGaist, I am just running the following commands to understand GStreamer on the command line.
To send the current screen to the network:
gst-launch-1.0 -v ximagesrc use-damage=false xname=/usr/lib/torcs/torcs-bin ! videoconvert ! videoscale ! video/x-raw,format=I420,width=800,height=600,framerate=25/1 ! jpegenc ! rtpjpegpay ! udpsink host=127.0.0.1 port=5000
To receive:
gst-launch-1.0 udpsrc port=5000 ! application/x-rtp,encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink
Here it is sharing my current screen to the network using ximagesrc.
So how can I send only the current camera live view through GStreamer? What is the command for that? -
Did you even try to search for "GStreamer camera source"?
-
wrote on 18 Feb 2020, 06:54 last edited by sharath
Thank you @SGaist,
I tried the script below on the command line:
gst-launch-1.0 -v v4l2src device=/dev/video0 ! 'video/x-raw-yuv,width=640,height=480' ! x264enc pass=qual quantizer=20 tune=zerolatency ! rtph264pay ! udpsink host=127.0.0.1 port=1234
I got the error below:
(gst-launch-1.0:4420): GStreamer-WARNING **: 0.10-style raw video caps are being created. Should be video/x-raw,format=(string).. now. WARNING: erroneous pipeline: could not link v4l2src0 to x264enc0
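The warning points at the caps string: video/x-raw-yuv is GStreamer 0.10 syntax, while gst-launch-1.0 expects the raw format in a format field, i.e. video/x-raw,format=I420. A corrected sketch of the same pipeline (device path, host, and port taken from the command above; a videoconvert is added so the camera's native format can be negotiated to I420, which is an assumption about your camera):

```shell
# GStreamer 1.x caps syntax: video/x-raw,format=... instead of video/x-raw-yuv
gst-launch-1.0 -v v4l2src device=/dev/video0 ! videoconvert ! \
    'video/x-raw,format=I420,width=640,height=480' ! \
    x264enc pass=qual quantizer=20 tune=zerolatency ! \
    rtph264pay ! udpsink host=127.0.0.1 port=1234
```

If linking still fails, running gst-launch-1.0 with -v (as above) prints the caps each element actually negotiates, which helps locate the mismatch.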