
listing values in QByte

Solved — General and Desktop
25 Posts, 5 Posters, 3.5k Views
agmar wrote:

@JonB

I set the value of ''tester'' in hex to begin with (screenshot attached), so I do not need to see the hex value inside the QByteArray; I need the binary value of the message once it has the "tester" value appended.

There seems to be something fishy going on with that array, because of the following output:

binary 1010101
supposed to be binary "U"

When I change the value of tester to 0x55, which should have the opposite bits of 0xAA, why is there a U in there, and why is the first 0 of the binary not there?

Put simply, I wish to see the value of the QByteArray in binary once it has the unsigned char tester appended...

JonB wrote (#11):

@agmar
I don't know what you are asking/thinking, and I don't think anyone else will.

binary 1010101
supposed to be binary "U"

Binary is digits in the range 0-1, just as hex is digits in the range 0-F. There is no such thing as binary "U". As for "why is the first 0 of the binary not there?": what "first 0"? Do you write, say, the decimal number 99 as 099? No; why would you show leading zeroes?

There is nothing "fishy" going on.

Oh, 0x55 might be the value for the letter U? qDebug() puts its own interpretation on how to show the characters in a QByteArray. If you want the content of one shown as hex digits, why not use QByteArray QByteArray::toHex(char separator = '\0') const? Maybe that is what you are looking for.
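For reference, the behaviour of QByteArray::toHex(separator) can be sketched in plain standard C++ like this (no Qt here; toHexString is a made-up name for illustration, not a Qt API):

```cpp
#include <cassert>
#include <string>
#include <vector>

// Sketch of what a toHex(' ')-style dump produces: each byte becomes two
// lowercase hex digits, with the given separator between bytes.
std::string toHexString(const std::vector<unsigned char>& bytes, char sep) {
    static const char digits[] = "0123456789abcdef";
    std::string out;
    for (size_t i = 0; i < bytes.size(); ++i) {
        if (i > 0) out += sep;
        out += digits[bytes[i] >> 4];    // high nibble
        out += digits[bytes[i] & 0x0F];  // low nibble
    }
    return out;
}
```

With this, a buffer holding 0xAA then 0x55 dumps as "aa 55", with no dependence on whether the bytes happen to be printable characters.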


jsulm (Lifetime Qt Champion) wrote (#12):

@agmar said in listing values in QByte:

when i change the value of tester to 0x55, which should have the opposite bits of 0xAA , why is there an U in there

Because 0x55 is the ASCII code for the character U (https://www.asciitable.com).
Please learn how computers represent data.

"i wish to see the binary value of the QByteArray in binary once it has the unsigned char tester appended" - if you want to see binary (1010...) then print it in binary, as you already do with qDebug() << "binary" << Qt::bin << tester;

      https://forum.qt.io/topic/113070/qt-code-of-conduct

agmar wrote:

@Christian-Ehrlicher

Is there any way to do it with a QByteArray? It does not give me the output I expect:

void MainWindow::writeSerial()
{
    QByteArray message;

    message.append(tester);

    qDebug() << "binary" << Qt::bin << tester;
    qDebug() << "supposed to be binary" << Qt::bin << message;

    if (m_serial->open(QIODevice::ReadWrite)) {
        m_serial->write(message);
        m_serial->waitForBytesWritten(-1);
        m_serial->close();
    }
}

output:
binary 10101010
supposed to be binary "\xAA"

The problem is that if I send 0xAA through the port, it fails to trigger a green LED on my MCU. It responds to 0xFF by toggling the green LED, but the same should happen with 0xAA: with 0xFF the LED functions as expected, but 0xAA has no effect.

Both 0xFF and 0xAA succeed in toggling the green LED if used from a terminal like HTerm or Teraterm.

That's why I want to see what is inside the QByteArray after it has the message inside.

mpergand wrote (#13):

@agmar said in listing values in QByte:

is there anyway to do it with a QByteArray?

qDebug().noquote() << message.toHex(' ');
        

agmar wrote (#14):

@jsulm
binary 1010101
supposed to be binary "U"

binary 11111111
supposed to be binary "\xFF"

I know how computers represent data, but do you notice that the same code makes the computer decide that sometimes it gives a U when the stored value is 0x55, and the actual hex value when I replace tester with 0xFF? If it were consistent, it should give a y with two dots. Also, the 0 in front is missing on the 0x55-to-binary conversion. That is why I want to know the value inside the array after it has the value of tester appended, not the value I append, because I already know that is correct...

I am not sure; might this be because I used an unsigned char for the tester value?


JonB wrote (#15):

@agmar
You are simply making assumptions about how qDebug() displays things to the user which are not the case.
Both I and @mpergand have now told you what to use for the representation you desire, so why not try it?

P.S.

makes the computer decide that sometimes it gives an U when the stored value is 0x55 and the actual hex value when i replace tester with 0xFF?

If you really want to know: qDebug() << QByteArray will show those bytes which are printable ASCII characters (0x20-0x7E) as characters (e.g. 0x55 -> U) and those that are not as hex escapes (e.g. 0xFF -> \xFF). This is (supposed) to help the user, because QByteArray is used in many places where it actually holds printable characters, and Qt has a bit of schizophrenia over whether a QByteArray is really being used to hold characters rather than arbitrary binary bytes (e.g. see how it appends a \0 to data stored in it).
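The display rule described above (printable ASCII shown as characters, everything else as \xNN escapes) can be sketched in plain standard C++ as follows; renderBytes is a made-up name, and this is only an approximation of qDebug()'s actual formatting:

```cpp
#include <cassert>
#include <string>
#include <vector>

// Approximation of how a debug printer renders a byte buffer: bytes in the
// printable ASCII range (0x20..0x7E) appear as characters, all others as
// \xNN hex escapes, the whole thing wrapped in double quotes.
std::string renderBytes(const std::vector<unsigned char>& bytes) {
    static const char digits[] = "0123456789ABCDEF";
    std::string out = "\"";
    for (unsigned char b : bytes) {
        if (b >= 0x20 && b <= 0x7E) {
            out += static_cast<char>(b);  // printable: show the character
        } else {
            out += "\\x";                 // non-printable: show hex escape
            out += digits[b >> 4];
            out += digits[b & 0x0F];
        }
    }
    out += "\"";
    return out;
}
```

Under this rule a buffer holding 0x55 renders as "U" while one holding 0xAA renders as "\xAA", which matches the two outputs being discussed in this thread.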


jsulm (Lifetime Qt Champion) wrote (#16):

@agmar said in listing values in QByte:

supposed to be binary "U"

What does this mean?!
As already explained, "U" is NOT binary! It is also NOT a valid hex digit.

1010101 is 0x55 in hex, not U.


mpergand wrote (#17):

@agmar
Look at this topic:
https://forum.qt.io/topic/138672/how-to-convert-qbytearray-to-hex-array


agmar wrote (#18):

@jsulm
[screenshot, fig 1]
This is what I meant.
[screenshot, fig 2]

I did not imply that U was hex or binary; the debug comment clearly says "supposed to be binary", indicating that it is not.

My confusion arises here: how come qDebug(), with the exact same code, gives:

1. in fig 1, a HEX representation, when the value inside tester is 0xAA

2. in fig 2, an ASCII character representation of the hex value, when tester is 0x55

Hopefully that is clearer now. Again, this is the same code in both instances, with only the value of tester changed...


JonB wrote (#19):

@agmar said in listing values in QByte:

My confusion arises here, how come qDebug(), with the exact same code gives :

in fig 1, a HEX representation, when the value inside tester is 0xAA

in fig2, an ASCII character representation of the hex value when tester is 0x55

Did you read my previous answer, where I explained just this?

Have you also appreciated that you are asking about (a) the representation of a quint8 and (b) the representation of a byte in a QByteArray, and assuming qDebug() will output them the same, when it does not?

Finally, you now know what you need to use to get what you desire (always hex representation, right?), don't you? So I am not sure what you are asking now.


jsulm (Lifetime Qt Champion) wrote (#20):

@agmar OK, I misunderstood you, sorry!
I think you see this difference because 0xAA is extended ASCII, and I guess qDebug() prints its hex value, while 0x55 is ASCII and qDebug() prints it as an ASCII character.
What you should try: do not print the byte array as a whole; print each byte:

qDebug() << Qt::bin << message[0];
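One caveat when pulling single bytes out of a byte buffer like this: plain char is signed on most platforms, so a byte such as 0xAA can sign-extend to a negative number when treated as an integer. A minimal plain-C++ sketch of the usual fix (byteValue is a made-up name for illustration):

```cpp
#include <cassert>

// Convert a raw char (possibly holding a high byte like 0xAA) to its
// unsigned 0..255 value. Casting through unsigned char first prevents
// sign extension on platforms where plain char is signed.
int byteValue(char c) {
    return static_cast<int>(static_cast<unsigned char>(c));
}
```

Without the intermediate cast, converting '\xAA' directly to int typically yields -86 rather than 170 on signed-char platforms, which then prints with a minus sign in binary or decimal output.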
                      


agmar wrote (#21):

@JonB
Well... the funny thing is that when I did this before, it did give this weird-looking letter: ÿ.
That implies it can print 0xFF as an ASCII character, because if I use a hex-to-ASCII converter, this is what I actually get:
[screenshot of converter output]
Identical. But maybe I did something differently, who knows... And from my last reply to jsulm, you can see that by casting the QByteArray values I get the correct aa output, so I think there is something wrong with the serial...

Which gives me an answer to the previous question you gave:
"Binary is digits in the range 0-1. Just as hex is digits in the range 0-F. There is no such thing as binary "U". "why is the first 0 of the binary not there?" what "first 0"? Do you write, say, the decimal number 99 as 099? No, why would you show leading zeroes?"

If it doesn't send that first digit, the data is corrupt, even though I did
m_serial->setDataBits(QSerialPort::Data8);
when setting up. It's easy to see I am lost, because I am new to this; I cannot assume that it actually sends EVERYTHING if it decides to show me 1010101 instead of the actual 01010101...

JonB wrote (#22):

@agmar
You are now talking about the behaviour of QSerialPort::setDataBits(QSerialPort::DataBits dataBits).

The default value is Data8, i.e. 8 data bits.

qDebug() does not know you intend to use the value to set 8 data bits later on.

I still don't know what you actually want! If, for example, you want every byte printed as 8 binary digits, with leading zeroes (is this it??), you will have to convert that value into a string of the desired 8 characters and ask qDebug() to print that. For that, look at the various QString QString::arg(uint a, int fieldWidth = 0, int base = 10, QChar fillChar = u' ') const overloads, e.g. I think

QString("%1").arg(static_cast<quint8>(message[0]), 8, 2, QChar('0'));

(the cast through quint8 avoids a negative value when the byte is 0x80 or above, and QChar('0') is the zero fill character)
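The zero-padded binary formatting that QString::arg() is being used for above can be sketched in plain standard C++ like this (no Qt; byteToBinary is a made-up name for illustration):

```cpp
#include <cassert>
#include <string>

// Format one byte as exactly 8 binary digits, most significant bit first,
// keeping the leading zeroes that plain integer printing drops.
std::string byteToBinary(unsigned char b) {
    std::string out(8, '0');
    for (int i = 0; i < 8; ++i) {
        if (b & (1u << (7 - i)))
            out[i] = '1';
    }
    return out;
}
```

This makes the point of the whole discussion concrete: 0x55 formats as "01010101" with its leading zero intact, even though the zero was never "missing" from the stored value, only from the default textual representation.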
                          
                            
agmar wrote (#23):

@JonB
I will take note; that is good to know. But I have now verified that the message is not corrupted by missing front 0's when I append it into the QByteArray, which I had hoped, since that would have been easier to fix...

However, the problem now shifts to the output stage, which shows me that there is something wrong with the serial port, because it sends 0xFF correctly but 0xAA does not get transmitted as it should be. That makes no sense, because I (once again) use the same code for both, so I need to figure out why Qt likes sending 0xFF but hates sending 0xAA... ._.


JonB wrote (#24):

@agmar

• There never was going to be any "corruption".

• so i need to figure out why QT likes sending 0xFF , but hates sending 0xAA... .

This just is not the case.

Will leave you to it now.

agmar has marked this topic as solved.

agmar wrote (#25):

@JonB @jsulm @Christian-Ehrlicher @mpergand Thanks for your help, I appreciate you.
