QTcpSocket: Setting LowDelayOption seems to have no effect?
bmarkwart last edited by bmarkwart
I have a Qt GUI application that uses QTcpSocket to send and receive TCP packets to and from a server. So far I've successfully made the TCP socket connections (there are two separate socket connections because there are two different message sets; both use the same IP address but two different port numbers) and sent and received packets. Most of the messages my application sends are triggered by push-buttons on the GUI's main window (one message is sent periodically using a QTimer that fires every 1667 ms).
The server has a FIFO (128 messages deep) and sends a specific message to my application when the FIFO is 1/2 full, 3/4 full, and full. It's tedious to test this functionality just by mashing the send button on the GUI, so I had the idea of loading a .csv file that could be pre-filled with what I want to send (the message has several configurable parameters). Each line gets read, turned into a message, and sent on the TCP socket.
From my main window I open a QFileDialog when a push-button on the GUI is clicked. When a .csv file is navigated to and selected, the function reads the file one line at a time, pulls out the individual parameters, fills a message with them, and sends it out on the socket. Each message is 28 bytes. This repeats until there are no lines left in the .csv file.
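The per-line handling looks roughly like this (a simplified sketch; the split helper and the commented-out 28-byte packing are illustrative stand-ins for my actual message layout and QTcpSocket::write() calls):

```cpp
#include <fstream>
#include <sstream>
#include <string>
#include <vector>

// Split one CSV line into its comma-separated fields.
std::vector<std::string> splitCsvLine(const std::string &line)
{
    std::vector<std::string> fields;
    std::stringstream ss(line);
    std::string field;
    while (std::getline(ss, field, ','))
        fields.push_back(field);
    return fields;
}

// Read the file line by line; each line becomes one 28-byte message.
// buildMessage() is a hypothetical placeholder for the real packing.
void sendCsvFile(const std::string &path)
{
    std::ifstream file(path);
    std::string line;
    while (std::getline(file, line)) {
        std::vector<std::string> fields = splitCsvLine(line);
        // QByteArray msg = buildMessage(fields);  // 28-byte message
        // socket->write(msg);                     // queued by QTcpSocket
        (void)fields;
    }
}
```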
What I'm seeing in Wireshark is that instead of a series of individual TCP packets, they are all being coalesced and sent as one large TCP packet.
When I first tested this I didn't know about the LowDelayOption, so when I found it in the QAbstractSocket documentation I thought, "Aha! That must be it! The solution to my problem!" But when I added it to my code it didn't seem to have any effect at all: the data is still sent as one large TCP packet. For each socket, I call setSocketOption() to set LowDelayOption to 1 in the slot that receives the socket's connected() signal. I thought maybe the setSocketOption() call wasn't working, so I checked by calling socketOption() to read LowDelayOption back, and it is 1.
Is there something else I need to be doing? Am I doing something wrong?
Thanks for your time and your help. If it matters, I'm developing this on Windows with Qt 5.9.1.