Get frames from qml videoOutput and send to TcpSocket.
-
wrote on 31 Jan 2020, 14:35 last edited by aha_1980
Hi all,
I'm working on live-view sharing between two devices over a TCP socket. In QML I have used the Camera type and VideoOutput:
Camera {
    id: camera
    cameraState: Camera.ActiveState
    deviceId: QtMultimedia.availableCameras[0].deviceId
    focus {
        focusPointMode: CameraFocus.FocusPointAuto
    }
}

CFilter {
    id: filter
}

VideoOutput {
    id: videoOutput
    source: camera
    anchors.fill: parent
    autoOrientation: true
    fillMode: VideoOutput.Stretch
    filters: [ filter ]
}
CFilter is my C++ class:
class CFRunnable : public QVideoFilterRunnable
{
public:
    QVideoFrame run(QVideoFrame *input, const QVideoSurfaceFormat &surfaceFormat, RunFlags flags)
    {
        if (input->isValid()) {
            qDebug() << "frames are coming..." << input;

            QByteArray arr;
            QVideoFrame cloneFrame(*input);
            cloneFrame.map(QAbstractVideoBuffer::ReadOnly);
            const QImage image(cloneFrame.bits(),
                               cloneFrame.width(),
                               cloneFrame.height(),
                               QVideoFrame::imageFormatFromPixelFormat(cloneFrame.pixelFormat()));
            qDebug() << " format...:-" << QVideoFrame::imageFormatFromPixelFormat(cloneFrame.pixelFormat());
            // I'm getting an invalid format here: how do I convert the YUYV format?
            cloneFrame.unmap();

            if (count == 1)
                tcpserver.startServer();
            tcpserver.writeBytesToClient(arr);
        }
        return *input;
    }

private:
    QAbstractVideoFilter *m_filter;
    QVideoFrame frame;
    TcpServer tcpserver;
    int count;
};
I need to convert the QVideoFrame to a QImage, then to a QByteArray, and send it over the TCP socket.
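A minimal sketch of that conversion path, assuming the frames arrive as Format_YUYV (four bytes Y0 U Y1 V per pixel pair); imageFromYuyvFrame and frameToBytes are illustrative helper names, not Qt API:

#include <QBuffer>
#include <QImage>
#include <QVideoFrame>

static inline int clamp255(int v) { return v < 0 ? 0 : (v > 255 ? 255 : v); }

// Convert one already-mapped YUYV frame to RGB32 by hand, since
// QVideoFrame::imageFormatFromPixelFormat() has no mapping for it.
QImage imageFromYuyvFrame(const QVideoFrame &frame)
{
    const int w = frame.width();
    const int h = frame.height();
    const uchar *src = frame.bits();          // frame must already be mapped
    const int stride = frame.bytesPerLine();
    QImage image(w, h, QImage::Format_RGB32);

    for (int y = 0; y < h; ++y) {
        const uchar *line = src + y * stride;
        QRgb *dst = reinterpret_cast<QRgb *>(image.scanLine(y));
        for (int x = 0; x < w; x += 2) {
            // Four bytes describe two pixels: Y0 U Y1 V.
            const int y0 = line[2 * x];
            const int u  = line[2 * x + 1] - 128;
            const int y1 = line[2 * x + 2];
            const int v  = line[2 * x + 3] - 128;
            const int r = int(1.402 * v);
            const int g = int(-0.344 * u - 0.714 * v);
            const int b = int(1.772 * u);
            dst[x]     = qRgb(clamp255(y0 + r), clamp255(y0 + g), clamp255(y0 + b));
            dst[x + 1] = qRgb(clamp255(y1 + r), clamp255(y1 + g), clamp255(y1 + b));
        }
    }
    return image;
}

// Compress the image so the payload stays small enough for a TCP socket.
QByteArray frameToBytes(const QImage &image)
{
    QByteArray bytes;
    QBuffer buffer(&bytes);
    buffer.open(QIODevice::WriteOnly);
    image.save(&buffer, "JPEG", 80);
    return bytes;
}

With these, arr in run() could be filled via frameToBytes(imageFromYuyvFrame(cloneFrame)) before writeBytesToClient(arr) is called.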
Please help me to solve this problem.
Thanks in advance
-
Hi,
What OS are you on ?
-
Then what about using tools already made for that ?
For example, use GStreamer to stream your video and keep your application just as a visualizer.
-
Why is it constrained to your application only ?
If you have a recent enough version of Qt, you might be able to use QMediaPlayer.
Otherwise, you can also build the pipeline yourself using the GStreamer API.
-
wrote on 2 Feb 2020, 04:27 last edited by sharath 2 Feb 2020, 04:28
@SGaist We have an embedded device with a camera and a GUI application, and I want to view/control the camera's live view from my Android/iOS application. So every video frame from the embedded device should be transmitted to the mobile application.
QMediaPlayer only plays a media source, if I'm not wrong.
-
The link I provided shows you how to create a custom GStreamer pipeline where you can try to use a pipe so you can both show the stream on your device and have it streamed on the network.
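A sketch of that idea on the command line, assuming a V4L2 camera (device, host and port are placeholders): the tee element duplicates the feed into a local display branch and a network branch.

gst-launch-1.0 v4l2src device=/dev/video0 ! tee name=t \
    t. ! queue ! videoconvert ! autovideosink \
    t. ! queue ! videoconvert ! jpegenc ! rtpjpegpay ! udpsink host=192.168.1.20 port=5000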
-
Custom pipeline building in QMediaPlayer came in 5.12.2, as explained in the documentation.
However nothing stops you from using other means to create the pipeline.
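A minimal sketch of that route, assuming Qt 5.12.2 or later with the GStreamer backend; the pipeline string, host and port are illustrative:

#include <QMediaPlayer>
#include <QUrl>

void startSharedPipeline()
{
    // The gst-pipeline scheme hands a custom pipeline string to the
    // GStreamer backend (supported since Qt 5.12.2). A tee duplicates the
    // camera feed: one branch is shown locally, the other is streamed as
    // RTP/JPEG over UDP (host/port are placeholders).
    auto *player = new QMediaPlayer;
    player->setMedia(QUrl("gst-pipeline: v4l2src ! tee name=t "
                          "t. ! queue ! videoconvert ! autovideosink "
                          "t. ! queue ! videoconvert ! jpegenc ! rtpjpegpay "
                          "! udpsink host=192.168.1.20 port=5000"));
    player->play();
}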
-
wrote on 7 Feb 2020, 04:15 last edited by sharath 2 Jul 2020, 04:17
@SGaist It's very hard to understand GStreamer. How can I pass the QML camera to GStreamer? Is there any example to stream a live camera, like a sender and a receiver? I know this is a stupid question, but I'm not getting any input on where to start.
-
As I already wrote, don't do that in that order.
Stream your camera using a dedicated software that will generate a real video stream.
Even if it's not simple, GStreamer allows you to show the feed locally and send it over the network at the same time.
-
Sample code of what ?
You do not even say what solution you want to implement.
-
wrote on 14 Feb 2020, 04:31 last edited by
@sharath said in Get frames from qml videoOutput and send to TcpSocket.:

We have an embedded device with a camera and a GUI application, and I want to view/control the camera's live view from my Android/iOS application. So every video frame from the embedded device should be transmitted to the mobile application.
I want to use GStreamer to stream the video over the network (TCP). Please give some example code so that I can start.
-
Here you have an example using the command line to stream a video feed from a RaspberryPi. The streaming part will work the same on a computer.
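A sketch in the spirit of that example, assuming a Raspberry Pi with the raspivid tool (resolution, host and port below are placeholders): the camera's H.264 output is piped into a GStreamer pipeline that payloads it as RTP.

raspivid -t 0 -w 1280 -h 720 -fps 25 -o - | gst-launch-1.0 -v fdsrc ! h264parse ! rtph264pay config-interval=1 pt=96 ! udpsink host=192.168.1.20 port=5000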
-
wrote on 17 Feb 2020, 07:25 last edited by
Hello @SGaist, I just ran the following commands to understand GStreamer on the command line.
To send the current screen to the network:
gst-launch-1.0 -v ximagesrc use-damage=false xname=/usr/lib/torcs/torcs-bin ! videoconvert ! videoscale ! video/x-raw,format=I420,width=800,height=600,framerate=25/1 ! jpegenc ! rtpjpegpay ! udpsink host=127.0.0.1 port=5000
To receive:
gst-launch-1.0 udpsrc port=5000 ! application/x-rtp,encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink
Here it shares my current screen over the network using ximagesrc.
So how can I send only the current camera live view through GStreamer? What is the command for that?
-
Did you even try to search for "GStreamer camera source" ?
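Such a search should turn up v4l2src on Linux. As a hedged illustration, swapping it in for ximagesrc (the device path is an assumption) leaves the rest of the sender pipeline above unchanged:

gst-launch-1.0 -v v4l2src device=/dev/video0 ! videoconvert ! videoscale ! video/x-raw,format=I420,width=800,height=600,framerate=25/1 ! jpegenc ! rtpjpegpay ! udpsink host=127.0.0.1 port=5000

The receive command from above stays the same.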