How to convert a ByteArray.at() to an int



  • @
    void aa(QByteArray data)
    {
        QChar year;
        QString str;
        year = data.at(2);
        str = QString("%1").arg(year.digitValue() + 100, 4, 10, '0');
    }
    @

    I got the warning: "ISO C++ says that these are ambiguous, even though the worst conversion for the first is better than the worst conversion for the second:
    candidate 1: QString QString::arg(double, int, char, int, const QChar &) const
    candidate 2: QString QString::arg(short int, int, int, const QChar &) const"



  • Take this example:

    @QByteArray x;
    x.resize(2);
    x[0] = 0x01;
    x[1] = 0x2A;

    int num = 0;
    num |= x[0];
    num <<= 8;   // note the compound assignment; a bare "num << 8" discards its result
    num |= x[1];
    qDebug() << "num = " << num;@

    Output:
    @num = 298@

    Example grabbed from: "This Forum Post":http://www.qtforum.org/article/15603/converting-qbytearray-elements-into-integer.html#post63766

    I believe you could simply say:

    @void aa(QByteArray data)
    {
        //QChar year;
        int year = 0;
        year |= data[2];
        QString str;
        str = QString("%1").arg(year + 100, 4, 10, '0');
    }@



  • Thanks, but that does not solve the problem. After searching, I found that wrapping the '0' in QChar makes the warning go away:

    @str = QString("%1").arg(year + 100, 4, 10, QChar('0'));@



  • Maybe using QString can help; QString has a toInt() method.
    Something like this (I haven't tried it, but it's just an idea):

    @QString numberAux = QString(QChar(data.at(0)));
    int number;
    bool ok = true;
    number = numberAux.toInt(&ok);
    if (!ok)
        return error; // numberAux wasn't an integer@


  • Moderators

    QByteArray::at() returns a char, which in ISO C++ is defined as a bit shorter than 2 bytes (0-127 instead of 255). Compilers complain that the conversion is ambiguous because of that. In most cases the warning can be ignored, or a simple (int) cast will do.



  • @int n = int(array.at(x));@

    or the type conversion can be omitted:

    @int n = array.at(x);@

    it should return an int in the range -128 ... +127

    :)


  • Moderators

    @
    int n = (int) array.at(x);
    @

    Your solution may work, too.

    Edit: yeah, it should all be automatic. I think the explicit conversion does get rid of the warning, though. I might not remember this well, to be honest ;)



  • [quote author="sierdzio" date="1362643883"]QByteArray::at() returns a char, which in ISO C++ is defined as a bit shorter than 2 bytes (0-127 instead of 255)[/quote]

    Umm... what? A bit shorter than 2 bytes is 15 bits, which can encode 32768 values. In every C++ implementation known to me a char is 1 byte which is 8 bits.

    According to the ISO C++ standard:

    bq. Objects declared as characters (char) shall be large enough to store any member of the implementation’s basic character set. If a character from this set is stored in a character object, the integral value of that character object is equal to the value of the single character literal form of that character. It is implementation-defined whether a char object can hold negative values. Characters can be explicitly declared unsigned or signed. Plain char, signed char, and unsigned char are three distinct types. A char, a signed char, and an unsigned char occupy the same amount of storage and have the same alignment requirements; that is, they have the same object representation. For character types, all bits of the object representation participate in the value representation. For unsigned character types, all possible bit patterns of the value representation represent numbers. These requirements do not hold for other types. In any particular implementation, a plain char object can take on either the same values as a signed char or an unsigned char; which one is implementation-defined.

    @shalongbasi - ignoring your example function that does absolutely nothing to an object outside of its scope, just cast your char to an int, which is completely safe since no information is lost, and this will resolve the ambiguity.


  • Moderators

    Doh, of course, utcenter. Sorry for not applying enough brain power to answering this ;)



  • BTW, the '0' final argument might very well be the source of the ambiguity here. I don't know whether the QChar constructor is explicit or there is just too much stuff that converts from a char, but replacing '0' with QChar('0') should solve it.

