How to Fix Low TCP Server Performance

  • In my project I continuously capture and send images with QTcpSocket. I chose TCP over UDP since I don't want images to get lost or corrupted.

    Each frame is 640x480 and I am sending 24 frames per second. But when I measure the throughput, instead of (640x480x24) ~7372 kB/s I get only ~2000-3000 kB/s. On top of that, the frames that arrive are sometimes completely blank.
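    For reference, the raw-throughput arithmetic (assuming 1 byte per pixel for Format_Grayscale8) can be written out explicitly; a small sketch, not part of the original code:

```cpp
#include <cassert>
#include <cstdint>

// Raw bandwidth needed for uncompressed video.
// bytesPerPixel = 1 for 8-bit grayscale.
constexpr std::int64_t frameBytes(int w, int h, int bytesPerPixel) {
    return static_cast<std::int64_t>(w) * h * bytesPerPixel;
}

constexpr std::int64_t bytesPerSecond(int w, int h, int bytesPerPixel, int fps) {
    return frameBytes(w, h, bytesPerPixel) * fps;
}
```

    640x480 grayscale at 24 fps needs 7,372,800 bytes/s (the "~7372" figure, read as kB/s), so an observed 2000-3000 kB/s means well under half the frames' worth of data is arriving each second.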

    How can I fix this problem?


    #include "tcpsender.h"
    #include "ui_tcpsender.h"
    #include <QtWidgets>
    #include <QtNetwork>
    #include <QtCore>
    #include <QDebug>
    #include <QBuffer>
    #include <QDataStream>
    #define XRES 640
    #define YRES 480
    TCPSender::TCPSender(QWidget *parent) :
        QDialog(parent),
        ui(new Ui::TCPSender)
    {
        ui->setupUi(this);
        statusLabel = new QLabel(tr("Ready to send frames on port 6667"));
        startButton = new QPushButton(tr("&Start"));
        auto quitButton = new QPushButton(tr("&Quit"));
        auto buttonBox = new QDialogButtonBox;
        buttonBox->addButton(startButton, QDialogButtonBox::ActionRole);
        buttonBox->addButton(quitButton, QDialogButtonBox::RejectRole);
        socket = new QTcpSocket(this);
        connect(startButton, &QPushButton::clicked, this, &TCPSender::startConnection);
        connect(quitButton, &QPushButton::clicked, this, &TCPSender::close);
        connect(socket, &QTcpSocket::connected, this, &TCPSender::startSending);
        connect(&timer, &QTimer::timeout, this, &TCPSender::sendFrame);
        auto mainLayout = new QVBoxLayout;
        mainLayout->addWidget(statusLabel);
        mainLayout->addWidget(buttonBox);
        setLayout(mainLayout);
        setWindowTitle(tr("Broadcast Sender"));
        camera = new Camera("/dev/video0", XRES, YRES);
        time = QTime::currentTime();
    }

    TCPSender::~TCPSender()
    {
        delete ui;
    }
    void TCPSender::startConnection()
    {
        if (socket->state() == QAbstractSocket::UnconnectedState)
            socket->connectToHost(ui->lineEdit->text(), 6667, QIODevice::WriteOnly);
    }

    void TCPSender::startSending()
    {
        qDebug() << "Timer start";
        timer.start(1000 / 24); // interval assumed from the stated 24 fps target
    }

    void TCPSender::sendFrame()
    {
        auto frame = camera->frame();
        // The pixel-data argument was missing in the original post;
        // frame.data is assumed from the RGBImage type shown further down.
        image = new QImage((const uchar *) frame.data, XRES, YRES, QImage::Format_RGB888);
        QImage im = image->convertToFormat(QImage::Format_Grayscale8);
        QByteArray ba;
        QBuffer buffer(&ba);
        buffer.open(QIODevice::WriteOnly);
        im.save(&buffer, "BMP");  // serialization format assumed; the original post omitted it
        socket->write(ba);        // the actual write was also missing in the original post
        qDebug() << "writing socket";
        int speed = time.msecsTo(QTime::currentTime());
        time = QTime::currentTime();
        speed = 1000 * 300 / speed; // ~300 kB per grayscale frame
        ui->label->setText(QString("%1 kB/s").arg(speed));
        delete image;
    }


    #include "reciever.h"
    #include <QBuffer>
    #include <QTcpSocket>
    #include <QImage>
    #include <QDebug>
    #include <iostream>
    #include <fstream>
    #include <string>
    #include <sstream>
    #include <unistd.h>
    Reciever::Reciever(QObject* parent): QTcpServer(parent)
    {
        connect(this, SIGNAL(newConnection()), this, SLOT(addConnection()));
    }

    void Reciever::addConnection()
    {
        QTcpSocket* connection = nextPendingConnection();
        // The original post never connected readyRead(), so receiveImage()
        // would not be called; the connection is assumed to belong here.
        connect(connection, SIGNAL(readyRead()), this, SLOT(receiveImage()));
        QBuffer* buffer = new QBuffer(this);
        buffers.insert(connection, buffer);
    }

    void Reciever::receiveImage()
    {
        QTcpSocket* socket = static_cast<QTcpSocket*>(sender());
        QBuffer* buffer = buffers.value(socket);
        // Read from the socket and append to the buffer
        qint64 bytes = buffer->write(socket->readAll());
        emit sendBuffer(buffer, bytes);
    }
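    A likely cause of the blank frames is that receiveImage() treats whatever readAll() returns as if it aligned with frame boundaries. TCP is a byte stream: a single read can contain half a frame or two frames glued together. A minimal length-prefixed reassembly sketch in plain C++ (the FrameAssembler name and the 4-byte big-endian prefix are illustrative, not from the original code):

```cpp
#include <cstdint>
#include <vector>

// Hypothetical reassembler for a length-prefixed stream: each frame is
// sent as a 4-byte big-endian length followed by the payload bytes.
class FrameAssembler {
public:
    // Feed whatever bytes arrived (e.g. the result of socket->readAll()).
    void feed(const char* data, std::size_t n) {
        buf_.insert(buf_.end(), data, data + n);
    }
    // Extract the next complete frame, if one has fully arrived.
    bool nextFrame(std::vector<char>& out) {
        if (buf_.size() < 4) return false;
        std::uint32_t len = (std::uint32_t(std::uint8_t(buf_[0])) << 24) |
                            (std::uint32_t(std::uint8_t(buf_[1])) << 16) |
                            (std::uint32_t(std::uint8_t(buf_[2])) << 8)  |
                             std::uint32_t(std::uint8_t(buf_[3]));
        if (buf_.size() < 4 + std::size_t(len)) return false; // frame incomplete
        out.assign(buf_.begin() + 4, buf_.begin() + 4 + len);
        buf_.erase(buf_.begin(), buf_.begin() + 4 + len);
        return true;
    }
private:
    std::vector<char> buf_;
};

// Matching sender side: prepend the big-endian length to the payload.
std::vector<char> withLengthPrefix(const std::vector<char>& payload) {
    std::vector<char> msg(4);
    std::uint32_t len = std::uint32_t(payload.size());
    msg[0] = char(len >> 24); msg[1] = char(len >> 16);
    msg[2] = char(len >> 8);  msg[3] = char(len);
    msg.insert(msg.end(), payload.begin(), payload.end());
    return msg;
}
```

    In Qt the same idea can be expressed with QDataStream read transactions, but the plain version shows the mechanism: never hand a frame to the consumer until all of its bytes have arrived.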

    Note: I've been working on this project for some time now. At first I used a multiple-connection server, which didn't have any blank-frame problem, but its performance was also low, which is why I switched to this design. For some reason, though, performance didn't improve much.

  • Unfortunately the answer to your real underlying problem is too complex for a simple answer. I'd suggest a lot of reading about network video streaming. TCP video transmission using uncompressed frame images isn't going to work well.

    Suggested reading:

    JPEG compression
    I/P frame video compression
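    As a toy illustration of the I/P idea in the reading list: a P-frame stores only the pixels that changed since the previous frame. This is not a real codec, just a sketch of the principle over raw grayscale buffers:

```cpp
#include <cstdint>
#include <vector>

// Toy "P-frame": a list of (pixel index, new value) pairs for the
// pixels that differ from the previous frame.
struct Delta {
    std::vector<std::uint32_t> idx;
    std::vector<std::uint8_t>  val;
};

Delta encodeDelta(const std::vector<std::uint8_t>& prev,
                  const std::vector<std::uint8_t>& cur) {
    Delta d;
    for (std::uint32_t i = 0; i < cur.size(); ++i)
        if (cur[i] != prev[i]) {
            d.idx.push_back(i);
            d.val.push_back(cur[i]);
        }
    return d;
}

// Reconstruct the current frame by patching the previous one in place.
void applyDelta(std::vector<std::uint8_t>& frame, const Delta& d) {
    for (std::size_t i = 0; i < d.idx.size(); ++i)
        frame[d.idx[i]] = d.val[i];
}
```

    For a mostly static scene, the delta is tiny compared with the 307,200 bytes of a full 640x480 grayscale frame, which is the intuition behind why I/P compression cuts bandwidth so dramatically.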

  • A more direct comment, in the form of a question: where does your Camera class come from, and what exactly is the behaviour of the frame() method? Does it block? Do you need to check the status of the returned frame?

  • @Kent-Dorfman My camera class uses v4l. Since I don't know much about v4l, I found it on GitHub. Here are the frame() and read_frame() methods.

    const RGBImage& Camera::frame(int timeout)
    {
        for (;;) {
            fd_set fds;
            struct timeval tv;
            int r;

            FD_ZERO(&fds);
            FD_SET(fd, &fds);

            /* Timeout. */
            tv.tv_sec = timeout;
            tv.tv_usec = 0;

            r = select(fd + 1, &fds, NULL, NULL, &tv);

            if (-1 == r) {
                if (EINTR == errno)
                    continue;
                throw runtime_error("select");
            }
            if (0 == r) {
                throw runtime_error(device + ": select timeout");
            }
            if (read_frame()) {
                return rgb_frame;
            }
            /* EAGAIN - continue select loop. */
        }
    }

    bool Camera::read_frame()
    {
        struct v4l2_buffer buf;

        memset(&buf, 0, sizeof(buf));
        buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buf.memory = V4L2_MEMORY_MMAP;

        if (-1 == xioctl(fd, VIDIOC_DQBUF, &buf)) {
            switch (errno) {
                case EAGAIN:
                    return false;
                case EIO:
                    /* Could ignore EIO, see spec. */
                    /* fall through */
                default:
                    throw runtime_error("VIDIOC_DQBUF");
            }
        }

        assert(buf.index < n_buffers);

        /* The remaining arguments were cut off in the original post;
           the destination buffer, dimensions, and YUYV stride (width * 2)
           are assumed here. */
        v4lconvert_yuyv_to_rgb24((unsigned char *) buffers[buf.index].data,
                                 (unsigned char *) rgb_frame.data,
                                 rgb_frame.width, rgb_frame.height,
                                 rgb_frame.width * 2);

        if (-1 == xioctl(fd, VIDIOC_QBUF, &buf))
            throw runtime_error("VIDIOC_QBUF");

        return true;
    }

    Unfortunately the answer to your real underlying problem is too complex for a simple answer. I'd suggest a lot of reading about network video streaming. TCP video transmission using uncompressed frame images isn't going to work well.

    Also, why don't uncompressed frame images work well? I tried compressing with ffmpeg, but there was a 2-second delay, which is something I don't want. I also need to be able to access each frame, since my friend will process every one of them.

    • The price you pay for reliable delivery in TCP is that you cannot guarantee timely delivery over the network.

    • Video streaming is almost always done via UDP, using TCP only as a control channel to describe the content of the UDP stream. See RTP/RTCP.

    • Key frames (complete images) are very expensive. That's why most modern codecs use I/P compression: a key frame (usually JPEG-derived), then a series of progressive frames that contain only the deltas (changes) from the previous frame. Some frames can be bidirectional (B) frames, in other words depending on a frame that occurs AFTER the B frame. Video is almost always queued and not delivered in real time over IP networks; some latency is always to be expected. 100-200 ms of latency is common even on local Ethernet networks.

    • Video is almost always encoded in a YUV colorspace because it is more efficient than RGB for temporal (time-varying) data.

    Finally, what is the default timeout value of the frame() method? If it is too short, that would explain the blank frames.
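    The timeout behaviour can be checked in isolation: select() returns 0 when nothing becomes readable within the interval, which in the Camera::frame() loop above turns into the "select timeout" exception. A minimal POSIX demonstration using an empty pipe (the helper name and timeout value are illustrative):

```cpp
#include <sys/select.h>
#include <unistd.h>

// Wait for fd to become readable; returns select()'s result.
// 0 means the timeout expired with no data, mirroring the camera loop.
int waitReadable(int fd, long timeout_usec) {
    fd_set fds;
    FD_ZERO(&fds);
    FD_SET(fd, &fds);
    struct timeval tv;
    tv.tv_sec = 0;
    tv.tv_usec = timeout_usec;
    return select(fd + 1, &fds, NULL, NULL, &tv);
}
```

    If the camera is slower than the configured timeout, every call lands in the 0-return branch, and a handler that swallows the exception and sends anyway would produce exactly the blank frames described.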

  • To add to the good points of @Kent-Dorfman, there are libraries that are already well established for streaming video on the network. You can do that using GStreamer, ffmpeg, VLC, etc.

    You should consider using a proven technology for such a task. That will also let you interoperate more easily with other media players.

  • @SGaist I tried ffmpeg and VLC; sadly, with those libraries there was a 2-second delay, and I want the stream to be real-time. Also, my friend needs access to each frame in order to process it. With Qt we can do that thanks to QByteArray and QBuffer.