Unsolved Streaming Video Out
-
@raf924 OK, thanks, I will continue to search :)
-
It's very easy but hard too.
I have a Qt video surveillance application. I'm using OpenCV to grab images from the cameras, FFmpeg to store the video, and my own Qt-based HTTP server to broadcast the video using H.264 or MJPEG.
If you want to broadcast (one server to many clients), I suggest using UDP.
If you want to send more than one stream, I suggest using your own protocol similar to the AVI format (small packets with a header and data).
-
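As an illustration of the "small packets with a header and data" idea above, here is a minimal sketch in plain C++. The field names and layout are my own invention for this example, not any standard format:

```cpp
#include <cassert>
#include <cstdint>
#include <cstring>
#include <vector>

// Hypothetical per-packet header: which stream the payload belongs to,
// a frame sequence number, and the payload size in bytes.
struct PacketHeader {
    uint16_t streamId;
    uint32_t frameNumber;
    uint32_t payloadSize;
};

// Serialize header + payload into one datagram-sized buffer.
std::vector<uint8_t> packPacket(const PacketHeader &h,
                                const std::vector<uint8_t> &payload) {
    std::vector<uint8_t> out(sizeof(PacketHeader) + payload.size());
    std::memcpy(out.data(), &h, sizeof h);
    std::memcpy(out.data() + sizeof h, payload.data(), payload.size());
    return out;
}

// Parse the header back out of a received datagram.
PacketHeader unpackHeader(const std::vector<uint8_t> &datagram) {
    PacketHeader h;
    std::memcpy(&h, datagram.data(), sizeof h);
    return h;
}
```

A real protocol would also need to agree on byte order between sender and receiver; this sketch only round-trips within one process.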
I tried to get a QVideoFrame from QAbstractVideoSurface, but I can't get the QVideoFrame buffer to send it to my QUdpSocket. I just used QVideoFrame::bits(), but it seems to be empty.
-
@Bouriquo How did you go about it? Did you subclass QAbstractVideoSurface as in the link I gave you?
-
@raf924 Yes, I had already found that class.
#include "myvideosurface.h"

MyVideoSurface::MyVideoSurface()
{
    writer = new QUdpSocket();
}

QList<QVideoFrame::PixelFormat> MyVideoSurface::supportedPixelFormats(QAbstractVideoBuffer::HandleType handleType) const
{
    Q_UNUSED(handleType);
    // Return the formats this surface supports.
    return QList<QVideoFrame::PixelFormat>() << QVideoFrame::Format_RGB565;
}

bool MyVideoSurface::present(const QVideoFrame &frame)
{
    // Copy the frame so it can be mapped read-only.
    QVideoFrame clone(frame);
    if (!clone.map(QAbstractVideoBuffer::ReadOnly))
        return false;
    writer->writeDatagram(reinterpret_cast<const char *>(clone.bits()),
                          clone.mappedBytes(), QHostAddress::Broadcast, 45454);
    clone.unmap();
    return true;
}
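Incidentally, one likely reason writeDatagram() fails here: a full RGB565 frame is far larger than the maximum UDP payload (65507 bytes), so a whole frame cannot fit in a single datagram and has to be split into chunks. A quick back-of-the-envelope check (plain C++; the chunk size is just an example):

```cpp
#include <cassert>
#include <cstddef>

// Maximum UDP payload: 65535 minus 20 bytes IP header and 8 bytes UDP header.
constexpr std::size_t kMaxUdpPayload = 65507;

// RGB565 uses 2 bytes per pixel.
constexpr std::size_t frameBytesRGB565(std::size_t w, std::size_t h) {
    return w * h * 2;
}

// How many datagrams of a given chunk size one frame needs (ceiling division).
constexpr std::size_t datagramsPerFrame(std::size_t frameBytes,
                                        std::size_t chunk) {
    return (frameBytes + chunk - 1) / chunk;
}
```

For 640x480 that is 614400 bytes per frame, roughly ten times the datagram limit, which is exactly where a small per-packet header (as suggested earlier in the thread) becomes necessary for reassembly.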
-
Out of curiosity, why not use something that's designed for that task? GStreamer lets you do everything you need: get the camera stream, add overlays, and stream using standard protocols, and if you use QtGStreamer you also get a widget or a QtQuick element to visualize it.
-
Yes, I know. But I just would like to try it myself :D.
-
Is using GStreamer still the recommended solution for streaming out over the network?
EDIT: relevant -> recommended
-
Why wouldn't it be?
-
What exactly do you want to do?
-
I need a simple way to send frames from the camera over the network, while at the same time providing a real-time 30 FPS camera preview in a QML widget.
Simple means as little coding as possible.
The way I'm trying to do it now is to derive a C++ class from QObject with a "mediaObject" property and expose it as a source for VideoOutput in QML.
The C++ class would catch the frames and save them into a buffer (and send them over the network). Unfortunately that's impossible (afaik) using just QML, because the imageCapture object inside Camera does not allow saving frames into buffers (only into files).
-
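The "save them into a buffer" part of the post above could be as simple as a small ring buffer that keeps the most recent frames. A minimal sketch in plain C++ (the class and method names are my own, not a Qt API):

```cpp
#include <cassert>
#include <cstdint>
#include <deque>
#include <vector>

// Hypothetical fixed-capacity frame buffer: keeps the newest N frames
// and drops the oldest when full, so memory stays bounded at 30 FPS
// even if the network sender falls behind.
class FrameBuffer {
public:
    explicit FrameBuffer(std::size_t capacity) : capacity_(capacity) {}

    void push(std::vector<uint8_t> frame) {
        if (frames_.size() == capacity_)
            frames_.pop_front();            // drop the oldest frame
        frames_.push_back(std::move(frame));
    }

    std::size_t size() const { return frames_.size(); }
    const std::vector<uint8_t> &oldest() const { return frames_.front(); }

private:
    std::size_t capacity_;
    std::deque<std::vector<uint8_t>> frames_;
};
```

Dropping old frames rather than blocking is usually the right trade-off for live video, where a late frame is worthless anyway.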
Then QtGStreamer is likely what you want.
-
But would it work if I created a class with a Q_PROPERTY mediaObject that holds a QCamera instance, and passed this class to the QML VideoOutput?
-
QCamera is a class for getting a video stream or images from a camera. The QML VideoOutput type is designed to render video onscreen.