QString and std::string
-
wrote on 27 Nov 2014, 15:01 last edited by
I am sure this has been discussed plenty of times before, however I just switched from Qt 4.7.something to Qt 5.3.2. I have read many articles, so I understand that in Qt 5 QStrings are stored in 16-bit Unicode (4.0). What I am failing to understand is why QString::fromStdString is using the "fromUtf8" conversion instead of the "fromLocal8Bit" conversion. In fact, why is the "fromUtf8" being used for all char* conversions to QString? Essentially I have a large amount of code which was using std::strings as an underlying storage type, and using Qt and QStrings for the top layer UI stuff. When displaying, I would convert to QStrings. This now has to be re-written since Qt 5.
I can't even instantiate a QString with a degree symbol, QString degreeSymbol("°");, without it turning into "�". Must I really write QString degreeSymbol(QString::fromLatin1("°")); or QString degreeSymbol(QString::fromLocal8Bit("°")); ??
In one article someone mentioned setting the QTextCodec, but when I investigated the documentation it did not clearly indicate a way to solve the issues I am having.
Any help would be appreciated as I don't really want to go down the route of replacing every single interaction between QString and std::string or "char*".
Thanks!
-
wrote on 27 Nov 2014, 22:52 last edited by
It sounds like the problem is not in dealing with user-generated input, but in compiling your source code, which contains extended characters such as the degree symbol. In that case, the problem surely arises from the character encoding of your source code files. Perhaps they are actually in Latin1, whereas Qt might assume that they are in UTF-8? If the source code was written on an earlier OS version and saved in the system locale encoding, then transferred to a later OS version which had UTF-8 as the standard encoding, that would explain it.
Can you open the source files in a text editor that will tell you what the actual encoding is (e.g., UltraEdit)?
-
[quote author="bobhairgrove" date="1417128749"]whereas Qt might assume that they are in UTF-8?[/quote]
Not "might". Qt does assume that all source code is encoded in UTF-8.
-
wrote on 28 Nov 2014, 08:36 last edited by
Thanks, I didn't know that. Is this new in Qt 5? Because the OP says that his code used to work using Qt 4.
-
Yes. Qt 4 assumed C locale. Qt 5 assumes UTF-8. More info in Thiago's blog post: "link":http://www.macieira.org/blog/2012/05/source-code-must-be-utf-8-and-qstring-wants-it/.
-
wrote on 28 Nov 2014, 14:28 last edited by
Thanks sierdzio! That link explains it well enough. I can't say I agree with the decision, but I guess I am stuck with someone else's religious crusade.
-
wrote on 28 Nov 2014, 14:51 last edited by
Is your file .c or .cpp?
-
wrote on 28 Nov 2014, 14:53 last edited by
IMHO not a religious issue, but a fact of life. Just save your source files as UTF-8, and you'll be rid of this problem.
-
wrote on 28 Nov 2014, 15:50 last edited by
MSVC 2010, 2012, 2013 on Windows 7. As a note for anyone else encountering this: go to File... Advanced Save Options... and select Encoding: Unicode (UTF-8 without signature) - Codepage 65001.
This needs to be done for EACH file.