Print QByteArray to raw hex format



  • I have an application running in Qt with a TCP and a UDP client. I use a C# program as the server to retrieve a request from one of the clients and respond with a byte array. After a great deal of pain I finally got the communication between the Qt TCP/UDP client and the C# TCP/UDP server to work.

    But I have trouble printing the response as raw hex data in the client.

    The data I send from the server back to the client is these 10 bytes:

    @byte[] data = new byte[] { 0x00, 0x01, 0x0A, 0x0B, 0x10, 0x11, 0xA0, 0xB0, 0xAA, 0xBB };@

    In my client I want to print that data in the following format:

    @0x00 0x01 0x0A 0x0B 0x10 0x11 0xA0 0xB0 0xAA 0xBB@

    Or:

    @0x0 0x1 0xA 0xB 0x10 0x11 0xA0 0xB0 0xAA 0xBB@

    The leading zero nibble for bytes with a value less than 0x10 is not necessary for now.
    So I wrote a print method to handle this. It takes a QByteArray with the data:
    @void TransmissionCommandExecuter::ResolvePackage(const QByteArray * response)
    {
        qDebug() << "Response length: " << response->length();

        for(int i = 0; i < response->size(); i++)
        {
            unsigned int j = response->at(i);
            cout << "0x" << QString::number(j, 16).toStdString() << " ";
        }
    }@

    I print the length of the data so I know I received the correct number of bytes from the server: 10 bytes, as declared in C#.

    When the iterator in the for loop reaches 6 and points to the byte with value 0xA0, I don't get 0xA0. Instead the application interprets 0xA0 as 0xffffffa0.

    With IntelliSense in Qt Creator I can see the content of the response variable while debugging:
    @[0] 0 '\0'
    [1] 1
    [2] 10
    [3] 11
    [4] 16
    [5] 17
    [6] -96 / 160
    [7] -80 / 176
    [8] -86 / 170
    [9] -69 / 187@

    You can work out with the Windows calculator that -96 is 0xFFFFFFA0. So it is obviously two's complement stuff that is tricking me. But I can also see "/ 160", and 160 in decimal is 0xA0 in hexadecimal.

    This bugs me. I know I got the correct data from the server, but I can't print it the way I want. Any suggestions?



  • Did you try QByteArray::toHex()?
    @
    qDebug() << response->toHex();
    @
    in your ResolvePackage method.



  • I tried that before I wrote the thread and it didn't work.
    I ran it again and now it happens to work.

    Instead it gives me a 20-byte QByteArray with an element for each nibble. I guess I can concatenate each pair of elements into one byte and print that.

    Any other suggestions?



  • Well yeah, that is the whole purpose of the toHex method, I guess: it just converts every byte in your data to a hex value. I think it is safe to say it will always produce two hex digits per byte, so 0x05 will be "05" and never "5".

    But if you really need single characters it should also work on single bytes, which would save you from splitting the byte array.

    I didn't test it myself, but your first loop should work using QString::number? Maybe use the proper types. And why are you using toStdString? In my experience Qt doesn't work well with std::cout, because the output is delayed until you close the application.

    Well, I would do it like this:
    @
    QByteArray ba("test");
    for (char c : ba) {
    qDebug() << QString("0x%1").arg((int)c, 0, 16);
    }
    @

    output:
    @
    "0x74"
    "0x65"
    "0x73"
    "0x74"
    @

    (That is a C++11 range-based for loop, in case you are not familiar with that syntax.)
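
    To show the two-hex-digits-per-byte formatting discussed above without depending on Qt, here is a plain-C++ sketch. The helper name @toHexSpaced@ is my own invention, not part of any API; it goes through @unsigned char@ so bytes >= 0x80 keep their value instead of being sign extended:

    ```cpp
    #include <cstdio>
    #include <string>

    // Hypothetical helper: formats bytes as "0xNN 0xNN ...", always two hex
    // digits per byte (like QByteArray::toHex() pads with a leading zero).
    std::string toHexSpaced(const char* data, std::size_t n) {
        std::string out;
        char buf[8];
        for (std::size_t i = 0; i < n; ++i) {
            // Mask to unsigned char so e.g. 0xA0 does not become 0xFFFFFFA0.
            unsigned int b = static_cast<unsigned char>(data[i]);
            std::snprintf(buf, sizeof buf, "0x%02X ", b);
            out += buf;
        }
        if (!out.empty()) out.pop_back();  // drop trailing space
        return out;
    }

    int main() {
        const char bytes[] = { 0x00, 0x01, 0x0A, 0x0B,
                               0x10, 0x11, '\xA0', '\xB0', '\xAA', '\xBB' };
        std::printf("%s\n", toHexSpaced(bytes, sizeof bytes).c_str());
        // prints: 0x00 0x01 0x0A 0x0B 0x10 0x11 0xA0 0xB0 0xAA 0xBB
        return 0;
    }
    ```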



  • No, that didn't work either :( Still printing 0xFFFFFFA0...

    I managed to create a solution based on the toHex method as you suggested, so I will go with that. Thanks!



  • Where is it printing 0xFFFFFFA0? That is not a valid hex code for a char. What is the value of the char? It may be that you have invalid characters in the data you receive, or that they are not ASCII characters. A char can only hold ASCII, of course; if you have other binary data or Unicode characters it obviously won't work, because Unicode is not representable in a char. You need a QChar for that, because Unicode can be 16-bit, not just 8. It might be some other reason, but the QString::number method can't be the problem; your "data" is just not printable as a one-byte hex value then...



  • You are casting from char to int, and both are signed. If the char has its most significant bit set, it represents a negative number, which is dutifully sign extended to 32 bits when cast to int. So -96 decimal, i.e. 0xA0, is sign extended to 0xFFFFFFA0 in this case.

    Cast to unsigned char first (or mask with & 0xFF) to avoid the sign extension.
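
    The sign extension described above can be reproduced without Qt. A minimal plain-C++ sketch (the helper names are mine, not from the thread; note that whether plain @char@ is signed is implementation-defined, so the wrong value only appears on platforms with a signed char):

    ```cpp
    #include <cassert>

    // Wrong: a negative char is sign extended before the conversion to
    // unsigned, so 0xA0 becomes 0xFFFFFFA0 where char is signed.
    unsigned int hexValueWrong(char c) {
        return static_cast<unsigned int>(c);
    }

    // Right: go through unsigned char first, so the byte keeps its
    // value in the range 0..255.
    unsigned int hexValueRight(char c) {
        return static_cast<unsigned char>(c);
    }

    int main() {
        char c = static_cast<char>(0xA0);   // -96 where char is signed
        assert(hexValueRight(c) == 0xA0u);  // always 0xA0
        // hexValueWrong(c) is 0xFFFFFFA0 if char is signed, 0xA0 if unsigned
        return 0;
    }
    ```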

