Packet rate using QTcpSocket
I'm transferring data from a measurement server (Raspberry Pi 3) to a client (Windows 7) using a QTcpSocket.
The payload is a few bytes, sent approximately every 30 ms; each time I call write(data) on the QTcpSocket. The data reaches the client with an increasing delay, and Wireshark shows roughly one packet every 200 ms.
My setup and conditions:
A camera records a frame every 30 ms, and the server has to send position information for each frame. Both transfers go through one shared GBit network interface (and a switch). The camera consumes most of the bandwidth, but the VPN link to the server still shows no delay.
Do you have a hint on how to transfer the measurement data in real time (ideally once per frame)?
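For context, the sending side described above boils down to something like the following sketch; the class name and the currentPositionBytes() helper are hypothetical, not the poster's actual code:

```cpp
#include <QByteArray>
#include <QObject>
#include <QTcpSocket>
#include <QTimer>

// Hypothetical sketch of the sending side: one small write per camera frame.
class PositionSender : public QObject
{
    Q_OBJECT
public:
    explicit PositionSender(QTcpSocket *socket, QObject *parent = nullptr)
        : QObject(parent), m_socket(socket)
    {
        connect(&m_timer, &QTimer::timeout, this, &PositionSender::sendPosition);
        m_timer.start(30); // the camera delivers a frame every 30 ms
    }

private slots:
    void sendPosition()
    {
        // currentPositionBytes() is a hypothetical helper returning a few bytes
        // of position data for the latest frame.
        m_socket->write(currentPositionBytes());
    }

private:
    QByteArray currentPositionBytes();
    QTcpSocket *m_socket;
    QTimer m_timer;
};
```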
m.sue:
An increasing delay means that something needs more time on every call. Look for a structure, string, etc. that grows during the run; it could also be a memory leak. For example, I once had a string that was appended to on every call, so each call took longer than the last (a sketch of that pitfall is below).
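For illustration only, the kind of pitfall described above might look like this (the names are hypothetical):

```cpp
#include <QByteArray>
#include <QTcpSocket>

QByteArray buffer; // grows forever across calls

void onNewSample(QTcpSocket *socket, const QByteArray &sample)
{
    buffer.append(sample);
    socket->write(buffer); // bug: re-sends the entire history every call,
                           // so each send is larger and slower than the last;
                           // it should write only `sample`
}
```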
That is what I wanted to express: the increasing delay means there is a sending queue somewhere that is filled faster (every 30 ms) than it is drained (every 200 ms).
This assumed queue is not part of my own code. A poor workaround would be to reduce the fill rate by accumulating several measurements per write; the delay would still be 200 ms, and I don't know how that would behave under other conditions.
A good solution would be to know where this delay comes from and to reduce it.
I just learned about Nagle's algorithm and the QAbstractSocket::LowDelayOption socket option.
Now it works fine. Thanks.
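For anyone finding this later: a minimal sketch of that fix, assuming a connected QTcpSocket (the function name is mine, not from the thread):

```cpp
#include <QAbstractSocket>
#include <QTcpSocket>

// Disable Nagle's algorithm (this sets TCP_NODELAY on the native socket).
// Call this after the socket is connected, e.g. from the connected() signal,
// so the option is applied to the live socket descriptor.
void enableLowDelay(QTcpSocket *socket)
{
    socket->setSocketOption(QAbstractSocket::LowDelayOption, 1);
}
```

Nagle's algorithm holds back small segments until the previous one is acknowledged; combined with the receiver's delayed ACKs, this can batch small writes into intervals on the order of 200 ms, which matches the behaviour observed above.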