Is there a way to programmatically generate a string literal (is that even the right term)?
-
I am trying to understand if there is a simpler way of converting from UTF-32 to UTF-16, as used in QML strings (I think). I am playing with some code that calculates the surrogate pair for an emoji:
import QtQuick 2.12
import QtQuick.Window 2.12

Window {
    id: window
    visible: true
    width: 640
    height: 480

    // from UTF-32 to UTF-16
    // https://unicodebook.readthedocs.io/unicode_encodings.html#surrogates
    function surrogate(input) {
        var output
        if (input < 0x10000 || input > 0x10FFFF) {
            // only supplementary-plane code points need surrogates
            return output
        }
        var code = input - 0x10000
        var high = 0xD800 | (code >> 10)   // high (lead) surrogate
        var low = 0xDC00 | (code & 0x3FF)  // low (trail) surrogate
        output = String.fromCharCode(high) + String.fromCharCode(low)
        return output
    }

    // note: this returns the eight characters \u{1f4a9}, not the emoji itself
    function fromutf32(input) {
        return "\\u{%1}".arg(input.toString(16))
    }

    title: qsTr("Font Testing 😀 %1 %2 %3").arg(surrogate(0x1F4A9)).arg("\u{1F4A9}").arg(fromutf32(0x1F4A9))
}
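As a sanity check of the function above, here is the arithmetic worked through by hand for U+1F4A9 (the values are straightforward to verify, so this is just the same math written out):

// surrogate(0x1F4A9), step by step:
// code = 0x1F4A9 - 0x10000         = 0xF4A9
// high = 0xD800 | (0xF4A9 >> 10)   = 0xD800 | 0x3D = 0xD83D
// low  = 0xDC00 | (0xF4A9 & 0x3FF) = 0xDC00 | 0xA9 = 0xDCA9
// so surrogate(0x1F4A9) === "\uD83D\uDCA9" === "💩"
console.log(surrogate(0x1F4A9) === "\uD83D\uDCA9") // true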
I know I could drop down to C++ for this and just use the functions there. I just find it interesting that all the emoji references give their numbers as UTF-32 code points (I think), and I wanted a direct way to use those integers in my QML code. That is why I found and used that surrogate function. I am also trying to see if I can build the string that lets me use the code point directly in my source:
"\u{1F4A9}"
However, I cannot figure out how to programmatically convert a string like the one my fromutf32 function returns into the actual string I would get by typing it out as a string constant. I tried eval, unescape, and String.raw, and none of them did it (as far as I can tell, the \u{…} escape is resolved when the source is parsed, so there is no obvious runtime function that re-applies it to a plain string). So maybe the surrogate function is the best way to do this. Or, like I said before, I should be dropping to C++ for this.
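For what it's worth, a minimal sketch of what I am after, assuming the QML JavaScript engine supports ES6's String.fromCodePoint (which I believe it should from Qt 5.12 on), would skip the escape-string round trip entirely:

// a sketch, assuming String.fromCodePoint is available in the QML engine
function fromutf32(input) {
    // fromCodePoint takes the raw code point and emits the
    // surrogate pair itself for values above 0xFFFF
    return String.fromCodePoint(input)
}
// fromutf32(0x1F4A9) === "\u{1F4A9}" === surrogate(0x1F4A9)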