Streaming Video Out
I found out how to read a movie and show it in a widget with QMediaPlayer. I added it to a QGraphicsScene to put an overlay over the video.
Now I would like to send this video (with the overlay) to a broadcast address, but I don't know how I can get the streaming video data to push it into a socket.
Can you tell me which class I can use to do it?
Hi and welcome to devnet,
Hope it helps
@Bouriquo You can get the data from the file with QFile and its read() method, then use a QTcpServer to send the data and a QTcpSocket to receive it.
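A minimal sketch of that idea, with class and slot names of my own invention (not from the thread): read the file in chunks with QFile::read() and write each chunk to a connected QTcpSocket.

```cpp
// Sketch: stream a file's raw bytes to TCP clients with QTcpServer.
// Assumes Qt 5; "FileStreamer" and the port number are illustrative.
#include <QTcpServer>
#include <QTcpSocket>
#include <QFile>

class FileStreamer : public QObject
{
    Q_OBJECT
public:
    explicit FileStreamer(const QString &path, QObject *parent = nullptr)
        : QObject(parent), m_file(path)
    {
        m_file.open(QIODevice::ReadOnly);
        m_server.listen(QHostAddress::Any, 45454);
        connect(&m_server, &QTcpServer::newConnection,
                this, &FileStreamer::sendFile);
    }

private slots:
    void sendFile()
    {
        QTcpSocket *client = m_server.nextPendingConnection();
        m_file.seek(0);
        while (!m_file.atEnd())
            client->write(m_file.read(64 * 1024));  // 64 KiB chunks
        client->disconnectFromHost();
    }

private:
    QTcpServer m_server;
    QFile m_file;
};
```

Note this only ships the container file's bytes as-is; it does not decode or re-encode the video.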
Thank you for your reply.
@SGaist Yes, I know it's not part of Qt Multimedia. But if you prefer: I have developed several applications, one that takes a stream and replays the same stream with QUdpSocket, another that plays a video with QMediaPlayer, and another that plays a video with a QPixmap overlay added.
But now I want to mix them and replay the input stream to the output stream with a pixmap overlay added. And I don't know how I can get the data to stream from the QGraphicsScene (video + pixmap).
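One way to get the combined pixels (video + overlay) out of the scene, sketched under the assumption that the scene already contains both the video item and the pixmap item, is to render the scene into a QImage for each frame and send the image bytes:

```cpp
// Sketch: flatten a QGraphicsScene (video item + pixmap overlay) into a
// QImage. Assumes Qt 5; the scene and socket are set up elsewhere.
#include <QGraphicsScene>
#include <QImage>
#include <QPainter>
#include <QUdpSocket>

QImage grabScene(QGraphicsScene *scene)
{
    QImage image(scene->sceneRect().size().toSize(), QImage::Format_RGB888);
    image.fill(Qt::black);
    QPainter painter(&image);
    scene->render(&painter);  // draws the video item and overlay together
    return image;
}

void sendFrame(QUdpSocket *udpSocket, QGraphicsScene *scene)
{
    const QImage image = grabScene(scene);
    // One datagram per frame only works for tiny frames; real code would
    // split the image into MTU-sized packets (or compress it first).
    udpSocket->writeDatagram(
        reinterpret_cast<const char *>(image.constBits()),
        int(image.sizeInBytes()),
        QHostAddress::Broadcast, 45454);
}
```

Re-rendering the whole scene per frame is CPU-heavy; it's a sketch of the data flow, not a performance recommendation.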
@raf924 Yes, streaming a video I have already figured out, but as I said before, I want to modify the frames with a pixmap overlay before sending them to the network.
@Bouriquo Oh, OK. Hmm, well, with the QAbstractVideoSurface class you can get QVideoFrames, from which you can get the bytes; then maybe from those you can make a pixmap, add your overlay (I have no idea how to do that), send that through the socket, and convert it back to a QVideoFrame which you can display on a QVideoSurface. I don't know how to do all that, though, or if it can be done at all, but where there's a will there's a way, right? Anyway, this might help: http://doc.qt.io/qt-5/videooverview.html
@raf924 OK, thanks, I will continue to search :)
It's very easy, but hard too.
I have a Qt video surveillance application. I'm using OpenCV to grab images from cameras, FFmpeg to store video, and my own Qt-based HTTP server to broadcast the video using H.264 or MJPEG.
If you want broadcast (one server to many clients), I suggest using UDP.
If you want to send more than one stream, I suggest using your own protocol similar to the AVI format (small packets with a header and data).
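The "small packets with a header and data" idea can be sketched in plain C++. The header fields below are my own choice (not a real AVI header): a fixed-size header carrying a stream id, a sequence number, and the payload size, followed by the payload bytes.

```cpp
// Sketch: a tiny AVI-like framing scheme (header + data) for multiplexing
// several streams over one socket. Field choices are illustrative only.
#include <cstdint>
#include <cstring>
#include <vector>

struct PacketHeader {
    uint16_t streamId;     // which stream this chunk belongs to
    uint32_t sequence;     // lets the receiver detect loss/reordering
    uint32_t payloadSize;  // number of data bytes after the header
};

std::vector<uint8_t> packChunk(const PacketHeader &h,
                               const uint8_t *data, uint32_t size)
{
    std::vector<uint8_t> packet(sizeof(PacketHeader) + size);
    PacketHeader header = h;
    header.payloadSize = size;
    std::memcpy(packet.data(), &header, sizeof header);
    std::memcpy(packet.data() + sizeof header, data, size);
    return packet;
}

bool unpackChunk(const std::vector<uint8_t> &packet,
                 PacketHeader &h, std::vector<uint8_t> &payload)
{
    if (packet.size() < sizeof(PacketHeader))
        return false;
    std::memcpy(&h, packet.data(), sizeof h);
    if (packet.size() != sizeof h + h.payloadSize)
        return false;
    payload.assign(packet.begin() + sizeof h, packet.end());
    return true;
}
```

For sending between machines you would also want fixed byte order (e.g. qToBigEndian) rather than raw struct memcpy, which only round-trips safely within one process.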
I tried to get QVideoFrames from QAbstractVideoSurface, but I can't get the QVideoFrame buffer to send it to my QUdpSocket. I just used QVideoFrame::bits(), but it seems to be empty.
@Bouriquo How did you go about getting it? Did you subclass QAbstractVideoSurface like in the link I gave you?
@raf924 Yes, I had already found this class.
writer = new QUdpSocket();

QList<QVideoFrame::PixelFormat> MyVideoSurface::supportedPixelFormats(
        QAbstractVideoBuffer::HandleType handleType) const
{
    // Return the formats you will support
    return QList<QVideoFrame::PixelFormat>() << QVideoFrame::Format_RGB565;
}

bool MyVideoSurface::present(const QVideoFrame &frame)
{
    QVideoFrame clone(frame);
    clone.map(QAbstractVideoBuffer::ReadOnly);
    qDebug() << clone.isMapped();
    QByteArray datagram(reinterpret_cast<const char *>(clone.bits()),
                        clone.mappedBytes());
    writer->writeDatagram(datagram.data(), datagram.size(),
                          QHostAddress::Broadcast, 45454);
    clone.unmap();
    return true;
}
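If bits() comes back empty, one thing worth checking (my assumption, not something confirmed in the thread) is whether map() actually succeeded: frames backed by a GPU handle type can fail to map. Once a frame is mapped, it can be wrapped in a QImage, painted over, and then sent:

```cpp
// Sketch: wrap a mapped QVideoFrame in a QImage so an overlay can be drawn
// before the bytes are sent. Assumes Qt 5 and a CPU-mappable frame.
#include <QVideoFrame>
#include <QImage>
#include <QPainter>
#include <QPixmap>

QImage frameWithOverlay(const QVideoFrame &frame, const QPixmap &overlay)
{
    QVideoFrame clone(frame);
    if (!clone.map(QAbstractVideoBuffer::ReadOnly))
        return QImage();  // GPU-backed frames may refuse to map

    const QImage::Format format =
        QVideoFrame::imageFormatFromPixelFormat(clone.pixelFormat());
    // Wrap the mapped buffer (no copy), then detach with copy() so the
    // frame can be unmapped safely before we paint.
    QImage wrapped(clone.bits(), clone.width(), clone.height(),
                   clone.bytesPerLine(), format);
    QImage result = wrapped.copy();
    clone.unmap();

    QPainter painter(&result);
    painter.drawPixmap(0, 0, overlay);
    return result;
}
```

The resulting QImage bytes can then go through the framing/UDP code above instead of the raw frame buffer.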
Out of curiosity, why not use something that's designed for that task? GStreamer lets you do everything you need: get the camera stream, add an overlay, and stream using a standard protocol; and if you use QtGStreamer you also have a widget or Qt Quick element to visualize it.
Yes, I know. But I would just like to try it myself :D
Is using GStreamer still the recommended solution for streaming out over the network?
EDIT: relevant -> recommended
Why wouldn't it be ?
What exactly do you want to do ?
I need a simple way to send frames from a camera over the network, while at the same time providing a real-time 30 FPS camera preview in a QML item.
Simple means as little coding as possible.
The way I'm trying to do it now is to derive a C++ class that exposes a "mediaObject" property and set it as the source of a VideoOutput.
The C++ class would catch the frames and save them into a buffer (and send them over the network).
Unfortunately, it's impossible (AFAIK) using just QML, because the imageCapture object inside Camera does not allow saving frames into buffers (only into files).
Then QtGStreamer is likely what you want.
But would it work if I created a class with a Q_PROPERTY mediaObject that holds a QCamera instance, and passed this class to the QML VideoOutput?
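The mediaObject idea can be sketched like this (hedged: it relies on VideoOutput's documented behaviour of accepting as a source any QObject that exposes a property named mediaObject; the class name is my own):

```cpp
// Sketch: a QObject holding a QCamera and exposing it through a
// "mediaObject" property so QML's VideoOutput can use it as a source.
#include <QCamera>
#include <QObject>

class CameraSource : public QObject
{
    Q_OBJECT
    // VideoOutput looks for a property with exactly this name.
    Q_PROPERTY(QObject *mediaObject READ mediaObject CONSTANT)
public:
    explicit CameraSource(QObject *parent = nullptr)
        : QObject(parent), m_camera(new QCamera(this)) {}

    QObject *mediaObject() { return m_camera; }
    QCamera *camera() { return m_camera; }

private:
    QCamera *m_camera;
};
```

Expose an instance to QML (e.g. as a context property) and then bind it: `VideoOutput { source: cameraSource }`. Grabbing the frames on the C++ side would still need a QAbstractVideoSurface or QVideoProbe alongside this.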
QCamera is a class to get a video stream or images from a camera. The QML VideoOutput type is designed to render video on screen.