TCP SOCKET
-
I am basically following the code in the fortuneserver and fortuneclient examples that come with Qt.
On the server side I have the following:
@
QByteArray block;
QDataStream out(&block, QIODevice::WriteOnly);
out.setVersion(QDataStream::Qt_5_0);
out << (quint16)0;
out << dataStr;
out.device()->seek(0);
int temp = (block.size() - sizeof(quint16));
out << (quint16)(block.size() - sizeof(quint16));
connection->write(block);
connection->waitForBytesWritten();
@
Now, I have several client connections in a list, so I loop through the list and send the QString in dataStr to each connection. With 2 clients connected to the server the code appeared to work great. When I added a 3rd client, sometimes one of the clients doesn't receive the correct block size.
On my client I have:
@
QDataStream in(this->sock);
in.setVersion(QDataStream::Qt_5_0);
if (blockSize == 0) {
    if (this->sock->bytesAvailable() < (int)sizeof(quint16))
        return;
    in >> blockSize;
}
if (this->sock->bytesAvailable() < blockSize)
    return;
QString data(blockSize);
in >> data;
@
This code is identical to the sample fortuneclient. On my server I checked that the outgoing data looks good, so it doesn't appear to be a problem in the server code. The exact same data is sent to all 3 clients, but sometimes one of the clients receives bad data: the block size it reads is incorrect.
I just tried changing the code, replacing quint16 with quint64, and now I get a very large number for the incoming block size (in the millions, where it should be something like 76) for one of the clients (the rest of the clients are OK).
For example, 2 clients receive
bytes available : 84
blocksize : 76
But the 3rd client receives
bytes available : 86
blocksize : 13510798882111488???
The outgoing data from the server still shows as OK. What is going on? Does this have something to do with my development machine?
What is the possible cause of this, and how would I go about debugging it?
-
Where is your actual QTcpSocket setup code? I know it's probably simple, but post that as well and let us see how it's set up...
-
OK, I'll show my code too; it was also based on a Qt example.
SSLServer is just a class that inherits from QTcpServer.
When a client connects, the socket descriptor is given to the SSLServerConnection, which creates the SSL socket.
@
void SSLServer::incomingConnection(int socket)
{
    SSLServerConnection* sslc = new SSLServerConnection(&serverData, socket, serverLOG, this);
    connect(this, SIGNAL(sslserverClosing()), sslc, SLOT(connectionClosing()));
    if(!sslc->isReady()){
        sslc->connectionClosing();
        QString what("Error while trying to connect to the DB");
        LOGInfo newlog(who, what, LOGInfo::LOGERROR);
        emit sslLOG(newlog);
    }
}
@
@
class SSLServerConnection : public QSslSocket
{
    Q_OBJECT
    ...
    SSLServerConnection();
};
@
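I left the constructor body out; roughly, it adopts the socket descriptor passed in by SSLServer and starts the SSL handshake (it also sets up the DB connection that isReady() checks). A trimmed sketch of just that part, with placeholder parameter types since I didn't show the real ones:
@
// Sketch only, not my exact code: adopt the descriptor and start SSL.
SSLServerConnection::SSLServerConnection(ServerData *data, int descriptor,
                                         Logger *log, QObject *parent)
    : QSslSocket(parent)
{
    setSocketDescriptor(descriptor);   // take over the connection accepted by SSLServer
    // certificate / private key setup omitted
    startServerEncryption();           // begin the SSL handshake as the server side
}
@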
In the constructor, I connect the readyRead() signal to the readData() slot of my class.
@
SSLServerConnection::SSLServerConnection(...)
{
    connect(this, SIGNAL(readyRead()), SLOT(readData()));
}
@
And the readData():
@
union readincomingsize {
    char arraysize[8];
    quint64 size;
};

void SSLServerConnection::readData()
{
    quint64 available = bytesAvailable();
    if(!buffer && available >= 8){
        readincomingsize rr;
        char packSize[8];
        int readedbyte = read(rr.arraysize, sizeof(quint64));
        packetSize = rr.size;
        if(packetSize > 0xFFFFFFF){
            qDebug("ERRRRRRRRRRRRRRRRRORRRR : Read packet size %lld vs %lld && %d", offset, packetSize, readedbyte);
            return;
        }
        qDebug("Read packet size %lld vs %lld && %d", offset, packetSize, readedbyte);
        buffer = new char[packetSize];
        offset = 0;
    }
    if(bytesAvailable() < packetSize)
        return;
    ....
}
@
So, the problem is the same as great88's. Sometimes the value I read for packetSize is not what was sent by my client. For debugging purposes, I've written a qDebug of the packet size on my client side before sending the data.
@
void SSLClient::sendData(char* data, unsigned int count, quint64 size, SSLServerCommand cmd)
{
    quint64 totalSize = size + sizeof(SSLServerCommand) + sizeof(unsigned int);
    write(reinterpret_cast<char*>(&totalSize), sizeof(quint64));
    write(reinterpret_cast<char*>(&cmd), sizeof(SSLServerCommand));
    write(reinterpret_cast<char*>(&count), sizeof(unsigned int));
    if(count)
        write(data, size);
    flush();
    qDebug() << "From sendData, Total Packet Size" << totalSize;
}
@
...
From sendData, Total Packet Size 8
From sendData, Total Packet Size 20
From sendData, Total Packet Size ...
I've never read anything wrong on this side.
On the server side:
Read packet size 0 vs 8 && 8
Read packet size 0 vs 20 && 8
Read packet size 0 vs 20 && 8
ERRRRRRRRRRRRRRRRRORRRR : Read packet size 0 vs 5891597913088520759 && 8
Read packet size 0 vs 20 && 8
Paolo
-
If I am reading this code correctly then packetSize is a char array [8]...i.e. a pointer.
Then we are doing a test:
@if (packSize > 0xFFFFFFF)@
which looks wrong to me... I guess the pointer could start uninitialised (i.e. address = 0), but if/when initialised it could have ANY memory address assigned to it... therefore larger than 0xFFFFFFF.
Do I read that correctly?
Here is the snippet of code I was looking at:
@
void SSLServerConnection::readData()
{
    quint64 available = bytesAvailable();
    if(!buffer && available >= 8){
        readincomingsize rr;
        char packSize[8];
        int readedbyte = read(rr.arraysize, sizeof(quint64));
        packetSize = rr.size;
        if(packetSize > 0xFFFFFFF){
            qDebug("ERRRRRRRRRRRRRRRRRORRRR : Read packet size %lld vs %lld && %d", offset, packetSize, readedbyte);
            return;
        }
@
Me not being familiar with this code... why is packetSize a char array?
-
I forgot to remove packSize[8]; it was there for debugging and testing, to try to understand why I sometimes get a wrong packetSize value.
packSize is not packetSize: packetSize is a quint64 variable defined as a member of my class. Sorry for forgetting to show it.
I declare a char[8] array to read the size of my packet (a quint64, 8 bytes), thanks to the union
@readincomingsize rr;@
Then I read the 8 bytes into it and get the packet size as a quint64 value from the union's size member.
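In other words, the union just reinterprets the 8 raw bytes as a quint64. A minimal sketch of the same read without the union (readPacketSize is a made-up helper name, and it assumes, like my real code does, that client and server share the same byte order, since the client writes the value raw):
@
#include <QIODevice>
#include <QtGlobal>
#include <cstring>

// Sketch only: read the 8-byte size prefix and reinterpret it as a quint64.
bool readPacketSize(QIODevice *dev, quint64 &packetSize)
{
    char sizeBytes[sizeof(quint64)];
    if (dev->read(sizeBytes, sizeof(sizeBytes)) != (qint64)sizeof(sizeBytes))
        return false;                                        // short read
    std::memcpy(&packetSize, sizeBytes, sizeof(packetSize)); // raw reinterpretation
    return true;
}
@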
[quote author="code_fodder" date="1371742892"]Where is your actual QTcpSocket setup code?, I know its probably simple but post that as well let us see how its setup...[/quote]
OK, so in my server's ready_read(), when a player joins the game:
@
void MultiClientServer::ready_read()
{
    QTcpSocket *client = qobject_cast<QTcpSocket *>(sender());
    listOfPlayers.append(new Player("incoming string", client));
    // I passed the client socket to the Player constructor and added the Player to the list of Players
}
@
The Player constructor:
@
QTcpSocket *socket;

Player::Player(QString message, QTcpSocket *socket)
{
    this->socket = socket;
}

QTcpSocket * Player::getSocket()
{
    return socket;
}
@
Then when I want to send data to all Players in the list:
@
foreach(Player *p, listOfPlayers){
    QByteArray block;
    QDataStream out(&block, QIODevice::WriteOnly);
    out.setVersion(QDataStream::Qt_5_0);
    out << (quint64)0;
    out << dataStr;
    out.device()->seek(0);
    out << (quint64)(block.size() - sizeof(quint64));
    p->getSocket()->write(block);
    bool success = p->getSocket()->waitForBytesWritten();
}
@
-
[quote author="ChrisW67" date="1371767757"]How do you read the remainder of the packet? Reading more (or fewer) bytes than you should will leave you out-of-sync for the subsequent packet.[/quote]
Hmm, well currently I have QString data(blockSize);
so it should be reading exactly blockSize bytes of data. But maybe QString isn't interpreting blockSize correctly; I will check on that.
@
QDataStream in(this->sock);
in.setVersion(QDataStream::Qt_5_0);
if (blockSize == 0) {
    if (this->sock->bytesAvailable() < (int)sizeof(quint16))
        return;
    in >> blockSize;
}
if (this->sock->bytesAvailable() < blockSize)
    return;
QString data(blockSize);
in >> data;
@
-
waitForBytesWritten() waits until the bytes are written to the device or a timeout occurs, i.e. it makes no guarantees that all bytes have been sent (although typically this is true) or received. Bytes are sent in exactly the order they were added to the TCP socket.
No. One write != one read. Written blocks may be split or coalesced by the network stack in order to make the journey.
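As a rough sketch of what that implies for the reader (the class, slot, and member names here are invented; it assumes a quint16 size prefix and a blockSize member initialised to 0, as in your code), the receiving side has to loop and reset blockSize between blocks, because any number of complete or partial blocks may be waiting when readyRead() fires:
@
// Sketch only, not drop-in code: buffer until a whole block has arrived,
// consume it, then go around again in case another block is already queued.
void Client::onReadyRead()
{
    QDataStream in(sock);                 // sock is the connected QTcpSocket
    in.setVersion(QDataStream::Qt_5_0);

    while (true) {
        if (blockSize == 0) {
            if (sock->bytesAvailable() < (int)sizeof(quint16))
                return;                   // size prefix not fully here yet
            in >> blockSize;
        }
        if (sock->bytesAvailable() < blockSize)
            return;                       // payload not complete yet

        QString data;
        in >> data;                       // consumes exactly one serialised QString
        blockSize = 0;                    // ready for the next size prefix

        // ... handle data ...
    }
}
@
If the reader ever consumes more or fewer bytes than the prefix promises, or forgets to reset blockSize, it ends up out of step with the stream: the next "size" it reads is really payload bytes, which is where nonsense values come from.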
-
[quote author="ChrisW67" date="1371786433"]
No. One write != one read. Written blocks may be split or coalesced by the network stack in order to make the journey. [/quote]
OK, that's what I thought.
I think the correct way to do this on the client side is then (instead of in >> data):
@
QByteArray ba = socket->read(blockSize);
QString data = ba;
@
Problem is, the conversion from QByteArray to QString is not working for me. data is empty! How do I convert ba to a QString?
-
OK. SOLVED.
The correct client-side code is:
@
QDataStream in(this->sock);
in.setVersion(QDataStream::Qt_5_0);
if (blockSize == 0) {
    if (this->sock->bytesAvailable() < (int)sizeof(quint16))
        return;
    in >> blockSize;
}
if (this->sock->bytesAvailable() < blockSize)
    return;
QByteArray ba = this->sock->read(blockSize);
QDataStream in2(ba);
in2.setVersion(QDataStream::Qt_5_0);
QString data;
in2 >> data;
@
So basically, the block size is based on a QByteArray on the server. To read it correctly on the client side, use
QByteArray ba = this->sock->read(blockSize);
Since I need a QString in my code, I convert the byte array using a QDataStream.
-
Regarding QString, you can instantiate a QString directly from a QByteArray... the two are quite compatible. I am not on my code machine so I may get the syntax a little wrong, but something like this will work:
@
QByteArray ba = this->sock->read(blockSize);
QString qs(ba);
// or you can use one of the following:
qs.append(ba);
qs = ba;
@
Note that a QByteArray is auto-converted to Unicode when it is assigned to a QString, because QString IS Unicode.
-
[quote author="code_fodder" date="1371798513"]Regarding QString, you can instantiate QString directly from QByteArray... the two are quite compatible. I am not on my code-machine so I may get the syntax little wrong but something like this will work:
@
QByteArray ba = this->sock->read(blockSize);
QString qs(ba);
// or you can use one of the following:
qs.append(ba);
qs = ba;
@[/quote]
Yeah, I thought it should be simple like that, but it doesn't work; the QString ends up empty. On the server side QDataStream seems to insert a whole bunch of '\0' bytes into the QByteArray, so when it comes back out on the client, QString doesn't know how to convert it directly.
None of the following works in this case:
@
QString qs(ba);
qs.append(ba);
qs = ba;
@ -
Really??... I believe you, I am just surprised :o
I am out of the office until Monday :( but I will look at my code, because I do this sort of thing a lot... I'll get back to you on that.
-
[quote author="code_fodder" date="1371835819"]Really??,....I believe you, I am just surprised :o
Ill get back to you on that.[/quote]Yeah , so if my QString hello is "Hello" , in the the following code , the variable block contains
'\0','\0','\0','\0','\0','\0','\0','&','\0','H','\0','e','\0','l','\0','l','\0','o'
after hello is sent to out <<@
QByteArray block;
QDataStream out(&block, QIODevice::WriteOnly);
out.setVersion(QDataStream::Qt_5_0);
out << (quint32)0;
out << hello;
out.device()->seek(0);
out << (quint32)(block.size() - sizeof(quint32));
@
So on the client side QString doesn't know how to translate
'\0','\0','\0','\0','\0','\0','\0','&','\0','H','\0','e','\0','l','\0','l','\0','o'. Instead of using a QString on the server side I tried putting a QByteArray into out <<, but I get the same problem with multiple '\0' bytes being prefixed.
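For what it's worth, dumping the byte array as hex makes the structure easier to see than counting '\0' characters in the debugger; a minimal standalone sketch of that kind of check (just for inspection, not part of my real code):
@
#include <QByteArray>
#include <QDataStream>
#include <QDebug>
#include <QIODevice>
#include <QString>

int main()
{
    QString hello = QStringLiteral("Hello");

    QByteArray block;
    QDataStream out(&block, QIODevice::WriteOnly);
    out.setVersion(QDataStream::Qt_5_0);
    out << (quint32)0;                                  // placeholder for the size prefix
    out << hello;                                       // quint32 byte count + UTF-16 characters
    out.device()->seek(0);
    out << (quint32)(block.size() - sizeof(quint32));   // overwrite the placeholder

    qDebug() << block.toHex();  // prefix and length come first, then 0048 0065 ... for "Hello"
    return 0;
}
@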
-
Ah ok, I see what you are doing now...
So when you set up your QString, what is really happening is that you are creating a "Unicode" string (i.e. all values are now 16-bit and not 8-bit).
So the character 't' effectively becomes '\0','t'
'\0' --> '\0','\0', etc... Well, that is how you would view it through "8-bit eyes".
I have the feeling you want to take an 8-bit QByteArray and send it as a QString (Unicode), and then at the other end take the Unicode and turn it back into a QByteArray... is that about right?
This is how I turn a QString (Unicode) into a QByteArray (8-bit):
@
uint16_t* pUtf16 = (uint16_t *)(uncompressedMsg.data());
// Now move to the source string
while (*pUtf16++) {}
_source = QString::fromUtf16(pUtf16).toLatin1();
@
This code takes a uint16 pointer and points it to the data area that we are interested in (this may be your QString.data()). Then it moves past any null characters (the while loop), and then comes the funky line of code that was a real pain to figure out: it creates a QString from UTF-16 (Unicode) and then calls the "toLatin1()" function to convert it back to an 8-bit QByteArray.
Is that what you are trying to do?... Or maybe the other way around (8-bit to 16-bit Unicode)?
You have to remember that QString IS Unicode, and once you make it a QString you have to be careful how you use it.
Does that make any sense?
If you can avoid QString for conversion or data carrying then I would, and just stick to QByteArray where possible :)
Also, by using QString you may be doubling your data size without knowing about it :o
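If you do want to keep the wire format plain 8-bit, a rough sketch of what I mean (the helper names here are made up, and the receive side assumes you have already read the length prefix and then exactly that many bytes):
@
#include <QByteArray>
#include <QDataStream>
#include <QIODevice>
#include <QString>
#include <QTcpSocket>

// Sketch only: send the text as UTF-8 bytes with a quint32 length prefix,
// instead of letting QDataStream serialise the QString itself.
void sendUtf8(QTcpSocket *sock, const QString &text)
{
    const QByteArray payload = text.toUtf8();                // 8-bit on the wire
    QByteArray block;
    QDataStream out(&block, QIODevice::WriteOnly);
    out.setVersion(QDataStream::Qt_5_0);
    out << (quint32)payload.size();                          // just the byte count
    out.writeRawData(payload.constData(), payload.size());   // raw bytes, no extra framing
    sock->write(block);
}

// Sketch only: 'payload' is exactly the prefixed number of bytes, already read.
QString decodeUtf8(const QByteArray &payload)
{
    return QString::fromUtf8(payload);                       // no '\0' padding involved
}
@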