listing values in QByte
-
Works fine for me
#include <QCoreApplication>
#include <QDebug>

int main(int argc, char* argv[]) {
    QCoreApplication app(argc, argv);
    int tmp = 42;
    qDebug() << Qt::bin << tmp;
    return 0;
}
Is there any way to do it with a QByteArray? It does not give me the output I expect.

void MainWindow::writeSerial() {
    QByteArray message;
    message.append(tester);
    qDebug() << "binary" << Qt::bin << tester;
    qDebug() << "supposed to be binary" << Qt::bin << message;
    if (m_serial->open(QIODevice::ReadWrite)) {
        m_serial->write(message);
        m_serial->waitForBytesWritten(-1);
        m_serial->close();
    }
}
output:
binary 10101010
supposed to be binary "\xAA"

The problem is that if I send 0xAA through the port, it fails to toggle the green LED on my MCU. The MCU responds to 0xFF by toggling the LED, and the same should happen with 0xAA, but 0xAA has no effect.
Both 0xFF and 0xAA toggle the green LED when sent from a terminal like HTerm or Tera Term.
That's why I want to see what is inside the QByteArray after the message has been appended.
-
@agmar said in listing values in QByte:
output:
binary 10101010
supposed to be binary "\xAA"

10101010 is binary, just as you asked for! \xAA is hex, not binary. So did you try, guess what, Qt::hex? It is all documented in https://doc.qt.io/qt-6/qt.html.
-
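To see what Qt::bin and Qt::hex do to the same byte, here is a minimal sketch in standard C++ (no Qt required; the helper names are illustrative, not Qt API). Qt::bin, like std::hex here, pads with nothing, which is why 0x55 shows only seven digits:

```cpp
#include <bitset>
#include <sstream>
#include <string>

// Render a value roughly the way Qt::hex renders an int:
// lowercase hex digits, no padding.
std::string toHexDigits(unsigned value) {
    std::ostringstream out;
    out << std::hex << value;          // e.g. 0xAA -> "aa"
    return out.str();
}

// Render a value roughly the way Qt::bin renders an int:
// binary digits with leading zeroes stripped (qDebug() does not pad).
std::string toBinDigits(unsigned value) {
    std::string bits = std::bitset<8>(value).to_string(); // "10101010"
    std::size_t first = bits.find('1');
    return first == std::string::npos ? "0" : bits.substr(first);
}
```

This reproduces the outputs seen in the thread: 0xAA gives "10101010", while 0x55 gives "1010101" with the leading zero dropped.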
I set the value of "tester" in hex to begin with, so I do not need to see the hex value inside the QByteArray; I need the binary value of the message once the "tester" value has been appended.
There seems to be something fishy going on with that array, because of the following output:
binary 1010101
supposed to be binary "U"

When I change the value of tester to 0x55, which should have the opposite bits of 0xAA, why is there a U in there, and why is the leading 0 of the binary missing?
Put simply, I want to see the contents of the QByteArray in binary once the unsigned char tester has been appended.
-
@agmar
I don't know what you are asking/thinking, and I don't think anyone else will.

binary 1010101
supposed to be binary "U"

Binary is digits in the range 0-1, just as hex is digits in the range 0-F. There is no such thing as binary "U". "Why is the first 0 of the binary not there?" What "first 0"? Do you write, say, the decimal number 99 as 099? No; why would you show leading zeroes? There is nothing "fishy" going on.
Oh, 0x55 might be the value for the letter U? qDebug() puts its own interpretation on how to show the characters in a QByteArray. If you want the contents of one shown as hex digits, why not use QByteArray QByteArray::toHex(char separator = '\0') const? Maybe that is what you are looking for.
-
@agmar said in listing values in QByte:
when i change the value of tester to 0x55, which should have the opposite bits of 0xAA , why is there an U in there

Because 0x55 is the ASCII code for the character U (https://www.asciitable.com).
Please learn how computers represent data. "I wish to see the binary value of the QByteArray in binary once it has the unsigned char tester appended": if you want to see binary (1010...), then print binary, as you already do with qDebug() << "binary" << Qt::bin << tester;
-
@jsulm
binary 1010101
supposed to be binary "U"

binary 11111111
supposed to be binary "\xFF"

I know how computers represent data, but do you notice that the same code sometimes gives a U when the stored value is 0x55, and gives the actual hex value when I replace tester with 0xFF? If it were consistent, it should give a y with two dots (ÿ). Also, the leading 0 is missing in the 0x55-to-binary conversion. That is why I want to know the value inside the array after the value of tester has been appended, not the value I append, because I already know that one is correct.
I am not sure; might this be because I used an unsigned char for the tester value?
-
@agmar
You are simply making assumptions about how qDebug() displays things to the user, and they are not the case.
Both I and @mpergand have now told you what to use for the representation you desire, so why not try it?

P.S. "makes the computer decide that sometimes it gives an U when the stored value is 0x55 and the actual hex value when i replace tester with 0xFF?"
If you really want to know: qDebug() << QByteArray will show those bytes which are printable ASCII characters (0x20-0x7E) as characters (e.g. 0x55 -> U), and those which are not as hex escapes (e.g. 0xFF -> \xFF). This is supposed to help the user, because QByteArray is used in many places where it actually holds printable characters, and Qt has a bit of ambivalence over whether QByteArray is really being used to hold characters rather than arbitrary binary bytes (e.g. see how it appends a \0 to the data stored in it).
-
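The quoting rule described above can be sketched in plain standard C++. This is an approximation of how qDebug() renders a QByteArray, not the actual Qt implementation:

```cpp
#include <cstdio>
#include <string>
#include <vector>

// Approximate qDebug()'s rendering of a QByteArray: printable ASCII
// (0x20..0x7E) comes out verbatim, anything else as a \xNN escape.
std::string debugRender(const std::vector<unsigned char>& bytes) {
    std::string out = "\"";
    for (unsigned char b : bytes) {
        if (b >= 0x20 && b <= 0x7E) {
            out.push_back(static_cast<char>(b));          // e.g. 0x55 -> U
        } else {
            char esc[5];
            std::snprintf(esc, sizeof esc, "\\x%02X", b); // e.g. 0xFF -> \xFF
        	out += esc;
        }
    }
    return out + "\"";
}
```

This explains both observations in the thread: 0x55 prints as "U" because it is in the printable range, while 0xAA and 0xFF print as "\xAA" and "\xFF" because they are not.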
@agmar said in listing values in QByte:
supposed to be binary "U"

What does this mean?!
As already explained, "U" is NOT binary. It is also NOT a valid hex digit. 1010101 is 0x55 in hex, not U.
-
@jsulm
fig1 (this is what I meant)
fig2

I did not imply that U was hex, not binary; the debug comment clearly says "supposed to be binary", indicating that it is not.
My confusion arises here: how come qDebug(), with the exact same code, gives:
- in fig1, a HEX representation, when the value inside tester is 0xAA
- in fig2, an ASCII character representation of the hex value, when tester is 0x55
Hopefully this is clearer now. Again, this is the same code in both instances, with only the value of tester changed.
-
@agmar said in listing values in QByte:
My confusion arises here, how come qDebug(), with the exact same code gives :
in fig 1, a HEX representation, when the value inside tester is 0xAA
in fig2, an ASCII character representation of the hex value when tester is 0x55

Did you read my previous answer, where I explained just this?
Have you also appreciated that you are asking about (a) the representation of a quint8 and (b) the representation of a byte in a QByteArray, and assuming qDebug() will output them the same, which it does not?
Finally, you now know what you need to use to get what you desire (always the hex representation, right?), don't you? So I am not sure what you are asking now.
-
@agmar OK, I misunderstood you, sorry!
I think you see this difference because 0xAA is extended ASCII, and I guess qDebug() prints its hex value, while 0x55 is ASCII and qDebug() prints it as an ASCII character.
What you should try: do not print the byte array as a whole; print each byte:

qDebug() << Qt::bin << message[0];
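One pitfall with message[0]: indexing yields a char, which is signed on most platforms, so a byte like 0xAA is seen as a negative number unless it is cast to unsigned char first. A small standard C++ illustration of the sign issue (helper names are hypothetical, purely for demonstration):

```cpp
#include <bitset>
#include <string>

// Promoting a raw char through a signed path may turn 0xAA into -86
// (implementation-defined); casting to unsigned char first always
// preserves the intended bit pattern, 170.
int promoteSigned(char c)   { return c; }
int promoteUnsigned(char c) { return static_cast<unsigned char>(c); }

// Safe way to inspect the bits of a byte taken from a buffer.
std::string bitsOf(char c) {
    return std::bitset<8>(static_cast<unsigned char>(c)).to_string();
}
```

So when printing individual bytes of the array, cast to unsigned char (or quint8) to avoid a misleading negative value.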
-
@JonB
Well... the funny thing is that when I did this before, it did give this weird-looking letter: ÿ. That implies that it can print 0xFF as an ASCII character, because if I use a hex-to-ASCII converter, that is exactly what I get. Identical; but maybe I did something differently, who knows... And from my last reply to jsulm, you can see that by casting the QByteArray values I get the correct aa output, so I think there is something wrong with the serial side. Which gives me an answer to the previous question you posed:
"Binary is digits in the range 0-1. Just as hex is digits in the range 0-F. There is no such thing as binary "U". "why is the first 0 of the binary not there?" what "first 0"? Do you write, say, the decimal number 99 as 099? No, why would you show leading zeroes?"

If it doesn't send that first digit, the data is corrupt, even though I did
m_serial->setDataBits(QSerialPort::Data8);
when setting up. It's easy to see I am lost, because I am new to this; I cannot assume that it actually sends EVERYTHING if it decides to show me 1010101 instead of the actual 01010101...
-
@agmar
You are now talking about the behaviour of QSerialPort::setDataBits(QSerialPort::DataBits dataBits). The default value is Data8, i.e. 8 data bits. qDebug() does not know you intend to use the value to set 8 data bits later on.
I still don't know what you actually want! If, for example, you want every byte printed as 8 binary digits, with leading zeroes (is this it??), you will have to convert that value into a string of the desired 8 characters and ask qDebug() to print that. For that, look at the various QString QString::arg(uint a, int fieldWidth = 0, int base = 10, QChar fillChar = u' ') const overloads, e.g. I think:

QString("%1").arg(message[0], 8, 2, QChar('0'));
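In Qt, QString::arg with fieldWidth 8, base 2, and fill '0' is the suggested route (noting the byte should be cast to unsigned first, or 0xAA arrives as a negative int). The equivalent zero-padded output in standard C++ is one std::bitset away, shown here as a sketch:

```cpp
#include <bitset>
#include <string>

// Always produce exactly 8 binary digits with leading zeroes,
// e.g. 0x55 -> "01010101" rather than "1010101".
std::string paddedBits(unsigned char byte) {
    return std::bitset<8>(byte).to_string();
}
```

This gives the "first 0" the thread keeps asking about: the zero was never missing from the byte, only from the unpadded textual representation.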
-
@JonB
I will take note; that is good to know. But I have now verified that the message is not corrupted by the missing leading 0s when I append it into the QByteArray, which is what I had hoped, since that would have been easier to fix. However, the problem now shifts to the output stage, which suggests there is something wrong with the serial port, because it sends 0xFF correctly but 0xAA does not get transmitted as it should be. That makes no sense, because I (once again) use the same code for both, so I need to figure out why Qt likes sending 0xFF but hates sending 0xAA... ._.
-
-
There never was going to be any "corruption".

"so i need to figure out why QT likes sending 0xFF , but hates sending 0xAA..."

This just is not the case.
Will leave you to it now.
-