Get frames from qml videoOutput and send to TcpSocket.
-
Hi all,
I'm working on live view sharing between two devices using a TCP socket. In QML I have used the Camera type and VideoOutput:
Camera {
    id: camera
    cameraState: Camera.ActiveState
    deviceId: QtMultimedia.availableCameras[0].deviceId
    //position: Camera.FrontFace
    focus {
        //focusMode: CameraFocus.FocusContinuous
        focusPointMode: CameraFocus.FocusPointAuto
        //focusMode: CameraFocus.FocusMacro
    }
}

CFilter {
    id: filter
}

VideoOutput {
    id: videoOutput
    source: camera
    anchors.fill: parent
    autoOrientation: true
    fillMode: VideoOutput.Stretch
    filters: [ filter ]
}
CFilter is my C++ class:
class CFRunnable : public QVideoFilterRunnable
{
public:
    QVideoFrame run(QVideoFrame *input, const QVideoSurfaceFormat &surfaceFormat, RunFlags flags)
    {
        if (input->isValid()) {
            qDebug() << "frames are coming..." << input;
            QByteArray arr;
            QVideoFrame cloneFrame(*input);
            cloneFrame.map(QAbstractVideoBuffer::ReadOnly);
            const QImage image(cloneFrame.bits(),
                               cloneFrame.width(),
                               cloneFrame.height(),
                               QVideoFrame::imageFormatFromPixelFormat(cloneFrame.pixelFormat()));
            qDebug() << "format:" << QVideoFrame::imageFormatFromPixelFormat(cloneFrame.pixelFormat());
            // I'm getting an invalid format here -- how do I convert the YUYV format?
            cloneFrame.unmap();
            if (count == 1) {
                tcpserver.startServer();
            }
            tcpserver.writeBytesToClient(arr);
        }
        return *input;
    }

private:
    QAbstractVideoFilter *m_filter;
    QVideoFrame frame;
    TcpServer tcpserver;
    int count;
};
I need to convert the QVideoFrame to a QImage, then to a QByteArray, and send it over the TCP socket.
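Something like the following is what I have in mind (only a rough, untested sketch, assuming Qt 5; the YUYV branch is a manual conversion, since QVideoFrame::imageFormatFromPixelFormat() returns Format_Invalid for that pixel format):

#include <QBuffer>
#include <QImage>
#include <QVideoFrame>

// Sketch: convert a QVideoFrame to a QImage, then JPEG-compress the image
// into a QByteArray that can be handed to a QTcpSocket.
static QByteArray frameToBytes(QVideoFrame frame)
{
    if (!frame.map(QAbstractVideoBuffer::ReadOnly))
        return QByteArray();

    QImage image;
    const QImage::Format fmt =
        QVideoFrame::imageFormatFromPixelFormat(frame.pixelFormat());
    if (fmt != QImage::Format_Invalid) {
        // Directly representable formats (e.g. RGB32): wrap the mapped
        // buffer and deep-copy it before the frame is unmapped.
        image = QImage(frame.bits(), frame.width(), frame.height(),
                       frame.bytesPerLine(), fmt).copy();
    } else if (frame.pixelFormat() == QVideoFrame::Format_YUYV) {
        // YUYV packs two pixels into four bytes: Y0 U Y1 V.
        image = QImage(frame.width(), frame.height(), QImage::Format_RGB32);
        for (int y = 0; y < frame.height(); ++y) {
            const uchar *src = frame.bits() + y * frame.bytesPerLine();
            QRgb *dst = reinterpret_cast<QRgb *>(image.scanLine(y));
            for (int x = 0; x + 1 < frame.width(); x += 2, src += 4) {
                const int y0 = src[0], u = src[1] - 128;
                const int y1 = src[2], v = src[3] - 128;
                const auto c = [](double val) { return qBound(0, int(val), 255); };
                dst[x]     = qRgb(c(y0 + 1.402 * v), c(y0 - 0.344 * u - 0.714 * v), c(y0 + 1.772 * u));
                dst[x + 1] = qRgb(c(y1 + 1.402 * v), c(y1 - 0.344 * u - 0.714 * v), c(y1 + 1.772 * u));
            }
        }
    }
    frame.unmap();

    // Serialize as JPEG into a QByteArray, ready for QTcpSocket::write().
    QByteArray bytes;
    QBuffer buffer(&bytes);
    buffer.open(QIODevice::WriteOnly);
    image.save(&buffer, "JPEG");
    return bytes;
}

On the receiving side the bytes could be loaded back with QImage::fromData(), presumably with a small size header per frame so the receiver knows where one JPEG ends and the next begins.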
Please help me solve this problem.
Thanks in advance.
-
Hi,
What OS are you on?
-
Hi @SGaist,
Thanks for your reply.
Ubuntu 18.04.
-
Then what about using tools already made for that?
For example, use GStreamer to stream your video, and have your application act just as a visualizer.
-
@SGaist I don't have any knowledge of GStreamer. How can I use it in Qt? I have to make it possible within my application.
-
Why is it constrained to your application only?
If you have a recent enough version of Qt, you might be able to use QMediaPlayer.
Otherwise, you can also build the pipeline yourself using the GStreamer API.
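As a minimal sketch of the API route (not tied to Qt at all; videotestsrc is only a stand-in for your real camera source):

#include <gst/gst.h>

int main(int argc, char *argv[])
{
    gst_init(&argc, &argv);

    GError *error = nullptr;
    // Parse a textual pipeline description, exactly like gst-launch does.
    GstElement *pipeline = gst_parse_launch(
        "videotestsrc ! videoconvert ! autovideosink", &error);
    if (!pipeline) {
        g_printerr("Failed to build pipeline: %s\n", error->message);
        g_clear_error(&error);
        return 1;
    }

    gst_element_set_state(pipeline, GST_STATE_PLAYING);

    // Block until an error or end-of-stream message arrives on the bus.
    GstBus *bus = gst_element_get_bus(pipeline);
    GstMessage *msg = gst_bus_timed_pop_filtered(
        bus, GST_CLOCK_TIME_NONE,
        static_cast<GstMessageType>(GST_MESSAGE_ERROR | GST_MESSAGE_EOS));

    if (msg)
        gst_message_unref(msg);
    gst_object_unref(bus);
    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(pipeline);
    return 0;
}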
-
@SGaist We have an embedded device which has a camera and a GUI application, and I want to view/control the live view of that camera from my Android/iOS application. So each video frame from my embedded device should be transmitted to the mobile application.
QMediaPlayer is only for playing a media source, if I'm not wrong.
-
The link I provided shows you how to create a custom GStreamer pipeline where you can try to use a pipe so you can both show the stream on your device and have it streamed on the network.
-
@SGaist Thank you, I will look into it.
-
@SGaist Is gst-pipeline only supported after Qt 5.12?
-
Custom pipeline building in QMediaPlayer came in 5.12.2, as explained in the documentation.
However, nothing stops you from using other means to create the pipeline.
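For reference, the documented usage looks roughly like this (a sketch, assuming Qt >= 5.12.2 with the GStreamer backend of Qt Multimedia; videotestsrc is a placeholder):

#include <QMediaPlayer>
#include <QUrl>
#include <QVideoWidget>

// A sink element named "qtvideosink" in the gst-pipeline string gets
// rendered into the QVideoWidget that was set as the video output.
void playCustomPipeline()
{
    auto *videoWidget = new QVideoWidget;
    auto *player = new QMediaPlayer(nullptr, QMediaPlayer::StreamPlayback);
    player->setVideoOutput(videoWidget);
    videoWidget->show();
    player->setMedia(QUrl("gst-pipeline: videotestsrc ! xvimagesink name=qtvideosink"));
    player->play();
}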
-
@SGaist It's very hard to understand GStreamer. How can I pass the QML camera to GStreamer? Is there any example to stream a live camera, like sender and receiver? I know this is a stupid question, but I'm not getting any input on where to start.
-
As I already wrote, don't do it in that order.
Stream your camera using dedicated software that will generate a real video stream.
Even if it's not simple, GStreamer allows you to show the feed locally and send it over the network at the same time.
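As a rough sketch of that idea, reusing the player from the gst-pipeline example above (device path, host, port and encoder settings are placeholders, untested):

// Sketch: tee splits the camera feed; one branch renders locally into the
// widget, the other encodes to H.264 and streams it over UDP/RTP.
// /dev/video0, 192.168.1.10 and 5000 are placeholders for your setup.
player->setMedia(QUrl(
    "gst-pipeline: v4l2src device=/dev/video0 ! tee name=t "
    "t. ! queue ! videoconvert ! xvimagesink name=qtvideosink "
    "t. ! queue ! videoconvert ! x264enc tune=zerolatency ! rtph264pay "
    "! udpsink host=192.168.1.10 port=5000"));
player->play();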
-
@SGaist If you could provide some sample code, that would be really helpful.
-
Sample code of what?
You do not even say what solution you want to implement.
-
@sharath said in Get frames from qml videoOutput and send to TcpSocket.:
We have an embedded device which has a camera and a GUI application, and I want to view/control the live view of that camera from my Android/iOS application. So each video frame from my embedded device should be transmitted to the mobile application.
I want to use GStreamer to stream the video over the network (TCP). Please give some example code so that I can start.
-
Here you have an example using the command line to stream a video feed from a Raspberry Pi. The streaming part will work the same on a computer.
-
@SGaist Thank you, I will go through it.
-
Hello @SGaist, I just ran the following commands to understand GStreamer on the command line.
To send the current screen to the network:
gst-launch-1.0 -v ximagesrc use-damage=false xname=/usr/lib/torcs/torcs-bin ! videoconvert ! videoscale ! video/x-raw,format=I420,width=800,height=600,framerate=25/1 ! jpegenc ! rtpjpegpay ! udpsink host=127.0.0.1 port=5000
To receive:
gst-launch-1.0 udpsrc port=5000 ! application/x-rtp,encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink
Here it is sharing my current screen to the network using ximagesrc.
So how can I send only the current camera live view through GStreamer? What is the command for that?
-
Did you even try to search for "GStreamer camera source"?
-
Thank you @SGaist,
I tried the below pipeline on the command line:
gst-launch-1.0 -v v4l2src device=/dev/video0 ! 'video/x-raw-yuv,width=640,height=480' ! x264enc pass=qual quantizer=20 tune=zerolatency ! rtph264pay ! udpsink host=127.0.0.1 port=1234
and got the below error:
(gst-launch-1.0:4420): GStreamer-WARNING **: 0.10-style raw video caps are being created. Should be video/x-raw,format=(string).. now.
WARNING: erroneous pipeline: could not link v4l2src0 to x264enc0
-
@sharath said in Get frames from qml videoOutput and send to TcpSocket.:
WARNING: erroneous pipeline: could not link v4l2src0 to x264enc0
The first hit for that search on DuckDuckGo gives several example pipelines to test with your own setup.
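The warning itself is the hint: 'video/x-raw-yuv,...' is GStreamer 0.10 caps syntax, while gst-launch-1.0 expects video/x-raw,format=... caps. Something like this 1.0-style version should link (a sketch only; whether YUY2 is the right format depends on your camera, and videoconvert will negotiate what x264enc actually needs):

gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw,format=YUY2,width=640,height=480 ! videoconvert ! x264enc pass=qual quantizer=20 tune=zerolatency ! rtph264pay ! udpsink host=127.0.0.1 port=1234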