Get frames from QML VideoOutput and send to TcpSocket.
-
Hi all,
I'm working on live view sharing between two devices over a TCP socket. In QML I am using the Camera and VideoOutput types:
Camera {
    id: camera
    cameraState: Camera.ActiveState
    deviceId: QtMultimedia.availableCameras[0].deviceId
    focus {
        focusPointMode: CameraFocus.FocusPointAuto
    }
}

CFilter {
    id: filter
}

VideoOutput {
    id: videoOutput
    source: camera
    anchors.fill: parent
    autoOrientation: true
    fillMode: VideoOutput.Stretch
    filters: [ filter ]
}
CFilter is my C++ class:
#include <QVideoFilterRunnable>
#include <QVideoFrame>
#include <QVideoSurfaceFormat>
#include <QImage>
#include <QDebug>
#include "tcpserver.h" // my own TCP server class

class CFRunnable : public QVideoFilterRunnable
{
public:
    QVideoFrame run(QVideoFrame *input, const QVideoSurfaceFormat &surfaceFormat, RunFlags flags) override
    {
        if (input->isValid()) {
            qDebug() << "frames are coming..." << *input;

            // Map a copy of the frame so its pixel data can be read.
            QVideoFrame cloneFrame(*input);
            cloneFrame.map(QAbstractVideoBuffer::ReadOnly);
            const QImage image(cloneFrame.bits(),
                               cloneFrame.width(),
                               cloneFrame.height(),
                               QVideoFrame::imageFormatFromPixelFormat(cloneFrame.pixelFormat()));
            qDebug() << "format...:-" << QVideoFrame::imageFormatFromPixelFormat(cloneFrame.pixelFormat());
            // I'm getting an invalid format here -- how do I convert the YUYV format?
            cloneFrame.unmap();

            QByteArray arr; // still empty -- this is where the encoded image should go
            if (!serverStarted) {
                // Start the server once, on the first frame.
                tcpserver.startServer();
                serverStarted = true;
            }
            tcpserver.writeBytesToClient(arr);
        }
        return *input;
    }

private:
    QAbstractVideoFilter *m_filter = nullptr;
    QVideoFrame frame;
    TcpServer tcpserver;
    bool serverStarted = false;
};
I need to convert the QVideoFrame to a QImage, then to a QByteArray, and send it through the TCP socket.
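For the QImage-to-QByteArray step, I expect something like this to work once I have a valid QImage (untested sketch using QBuffer and QImage::save; the "JPEG" format and quality value are just placeholders):

#include <QBuffer>
#include <QImage>

QByteArray arr;
QBuffer buffer(&arr);
buffer.open(QIODevice::WriteOnly);
image.save(&buffer, "JPEG", 80);   // encode the frame into arr as JPEG
buffer.close();
tcpserver.writeBytesToClient(arr); // send the encoded bytes to the client

The real blocker is the step before that: imageFormatFromPixelFormat() returns an invalid format for the YUYV frames, so the QImage above is never valid in the first place.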
Please help me solve this problem.
Thanks in advance
-
Hi,
What OS are you on?
-
Then what about using tools already made for that?
For example, use GStreamer to stream your video, with your application acting just as a visualizer.
-
Why is it constrained to your application only?
If you have a recent enough version of Qt, you might be able to use QMediaPlayer.
Otherwise, you can also build the pipeline yourself using the GStreamer API.
-
@SGaist We have an embedded device with a camera and a GUI application, and I want to view/control the live view of that camera from my Android/iOS application. So every video frame from the embedded device should be transmitted to the mobile application.
QMediaPlayer is only for playing a media source, if I'm not wrong.
-
The link I provided shows how to create a custom GStreamer pipeline, where you can try using a pipe to both show the stream on your device and send it over the network.
-
Custom pipeline support in QMediaPlayer arrived in Qt 5.12.2, as explained in the documentation.
However, nothing stops you from using other means to create the pipeline.
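Roughly like this (a minimal sketch, assuming Qt >= 5.12.2 with the GStreamer backend; videotestsrc is just a stand-in for your real camera source):

#include <QMediaPlayer>
#include <QUrl>

// The "gst-pipeline:" URL scheme hands the description directly to GStreamer.
auto *player = new QMediaPlayer;
player->setMedia(QUrl("gst-pipeline: videotestsrc ! autovideosink"));
player->play();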
-
@SGaist It's very hard to understand GStreamer. How can I pass the QML camera to GStreamer? Is there any example of streaming a live camera, like a sender and a receiver? I know this is a stupid question, but I have no idea where to start.
-
As I already wrote, don't do it in that order.
Stream your camera using dedicated software that generates a real video stream.
Even if it's not simple, GStreamer allows you to show the feed locally and send it over the network at the same time.
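Something like this gst-launch sketch (untested; /dev/video0 and the host address are placeholders): the tee element splits the camera feed, one branch going to a local preview window and the other encoded and sent over the network:

gst-launch-1.0 -v v4l2src device=/dev/video0 ! tee name=t \
    t. ! queue ! videoconvert ! autovideosink \
    t. ! queue ! videoconvert ! jpegenc ! rtpjpegpay ! udpsink host=192.168.1.10 port=5000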
-
Sample code of what?
You have not even said which solution you want to implement.
-
@sharath said in Get frames from QML VideoOutput and send to TcpSocket:
We have an embedded device with a camera and a GUI application, and I want to view/control the live view of that camera from my Android/iOS application. So every video frame from the embedded device should be transmitted to the mobile application.
I want to use GStreamer to stream the video over the network (TCP). Please give some example code so that I can get started.
-
Hello @SGaist, I have been running the following commands to understand GStreamer on the command line.
To send the current screen to the network:
gst-launch-1.0 -v ximagesrc use-damage=false xname=/usr/lib/torcs/torcs-bin ! videoconvert ! videoscale ! video/x-raw,format=I420,width=800,height=600,framerate=25/1 ! jpegenc ! rtpjpegpay ! udpsink host=127.0.0.1 port=5000
To receive:
gst-launch-1.0 udpsrc port=5000 ! application/x-rtp,encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink
Here it is sharing my current screen to the network using ximagesrc.
So how can I send only the camera's live view through GStreamer? What's the command for that?
-
Did you even try to search for "GStreamer camera source"?
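On Linux the usual camera source is v4l2src; swapping it in for ximagesrc in your own sender pipeline should be enough (untested, and /dev/video0 is a placeholder for your camera device):

gst-launch-1.0 -v v4l2src device=/dev/video0 ! videoconvert ! videoscale ! video/x-raw,format=I420,width=800,height=600,framerate=25/1 ! jpegenc ! rtpjpegpay ! udpsink host=127.0.0.1 port=5000

The receiver command stays the same as the one you already have.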