I spent an evening creating an emoji setup, but when I finished I noticed that if players used an emoji’s name in my game’s chat messages, my code would grab the correct Unicode value from my dictionary but end up writing \U0001f914 into chat as literal text instead of replacing it with the emoji from the sprite sheet.
If I click pause while the chat message is up and then click anything in the inspector for the TextMeshProUGUI object, it immediately replaces the Unicode value with the emoji. I thought maybe I was bypassing some parsing step by directly modifying the object’s .text member in code, so I used SetText instead, and still no luck.
After that I tried SetAllDirty in hopes it would trigger an update, and even tried calling it a few frames later to see if that would have the same effect as clicking anything in the inspector (even toggling “enable RTL editor” causes the emoji to parse properly).
I’m assuming this is a bug of sorts, or I’m missing some sort of “refresh text object” function that would replace written-out Unicode values with their actual characters.
Did anybody find a solution for this? I am passing strings from a text file with Unicode escapes included, and TextMeshPro always shows the raw Unicode until I interact with it in the editor…
Is there a method to update the TextMeshPro in script so that it refreshes the text content and parses the Unicode?
Please help out!
Same issue here. Looking at the source for TMP_Text, Unicode escape sequences are intentionally handled only when the text is changed via the inspector (i.e., only when the text mesh’s internal m_inputSource enum is set to TextInputBox). Setting the text through code changes m_inputSource to TextString, which bypasses the Unicode handling.
A way around this is to manually unescape the unicode characters before assigning them to the textmesh:
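For example, a minimal sketch (the helper and field names below are mine, not from TMP): `Regex.Unescape` from `System.Text.RegularExpressions` expands the \uXXXX escapes before TMP ever sees the string:

```csharp
using System.Text.RegularExpressions;

public static class UnicodeText
{
    // Expands literal "\uXXXX" escape sequences in a string into the
    // actual UTF-16 characters, e.g. the six characters \u0041 -> "A".
    public static string Unescape(string raw) => Regex.Unescape(raw);
}
```

In a component you would then assign something like `chatText.text = UnicodeText.Unescape(rawMessage);`, where `chatText` is whatever TextMeshProUGUI reference you hold. Note (as a later reply in this thread points out) that Regex.Unescape only understands the 4-digit \u form, not the 8-digit \U form.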
A downside to this approach is that the now-unescaped Unicode characters may be invisible in the inspector text box, but they’ll render fine in-game. A formal fix for this problem would still be very appreciated!
Omg. This is unbelievable. I just spent 2 days troubleshooting why Font Awesome icons were not working through script, while working fine in the editor.
I have the same issue, but only partly… If I apply some FA icons through script they work, but if I set them to \uf06a or \uf05a they come through as literal characters?
Edit: It seems to work if you do tmp.text = “\uxxxx” or assign it from a variable, but only if the variable is hardcoded and assigned on Start; you can’t edit the variable in the editor, and if you do, it displays the literal characters. (Presumably because the C# compiler resolves \u escapes in hardcoded string literals at compile time, so TMP never sees the escape, while a serialized string edited in the inspector still contains the literal backslash when your script later assigns it.)
Just doing Regex.Unescape partially works around the problem for Unicode escapes like “\uE61F”, but not for UTF-32 escapes like “\U000F1901”. I have tried other ways to encode/decode, but with no success.
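One way around that limitation (a sketch; the class name is mine) is to expand the 8-digit \U form first with `char.ConvertFromUtf32`, then let Regex.Unescape handle the remaining \u escapes:

```csharp
using System;
using System.Globalization;
using System.Text.RegularExpressions;

public static class Utf32Unescape
{
    // Regex.Unescape understands "\uXXXX" but not the 8-digit
    // "\UXXXXXXXX" form, so expand those first. ConvertFromUtf32
    // emits the surrogate pair for code points above U+FFFF.
    public static string Unescape(string raw)
    {
        string expanded = Regex.Replace(raw, @"\\U([0-9A-Fa-f]{8})",
            m => char.ConvertFromUtf32(
                int.Parse(m.Groups[1].Value, NumberStyles.HexNumber)));
        return Regex.Unescape(expanded);
    }
}
```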
This forum post from 2017, “unicode value shows as string literal”, has an answer from @Stephan_B at the bottom. I extracted the string conversion portion and added it below. In my case I have a single Unicode character, set via inspector. I didn’t test the UTF-32 portion; hopefully that works just as well.
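The quoted snippet didn’t survive above; as a sketch of that kind of conversion (not @Stephan_B’s exact code), a single pass that expands both the 4-digit \u and the 8-digit \U forms might look like:

```csharp
using System;
using System.Globalization;
using System.Text;

public static class EscapeParser
{
    // Walks the string and expands \uXXXX (4 hex digits) and
    // \UXXXXXXXX (8 hex digits) escapes into real characters;
    // anything that isn't a valid escape is copied through as-is.
    public static string Convert(string source)
    {
        var sb = new StringBuilder(source.Length);
        for (int i = 0; i < source.Length; i++)
        {
            if (source[i] == '\\' && i + 1 < source.Length)
            {
                char kind = source[i + 1];
                int digits = kind == 'u' ? 4 : kind == 'U' ? 8 : 0;
                if (digits > 0 && i + 1 + digits < source.Length &&
                    int.TryParse(source.Substring(i + 2, digits),
                                 NumberStyles.HexNumber,
                                 CultureInfo.InvariantCulture,
                                 out int codePoint))
                {
                    sb.Append(char.ConvertFromUtf32(codePoint));
                    i += 1 + digits; // skip past the whole escape
                    continue;
                }
            }
            sb.Append(source[i]);
        }
        return sb.ToString();
    }
}
```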
I don’t know why nobody has mentioned this yet, but if you just need to change text while the game is running, try referencing emojis by sprite index instead of Unicode escapes. It works for me.
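For instance, TMP’s rich-text sprite tag addresses a glyph in the assigned Sprite Asset by index (or by name), and those tags are parsed at render time regardless of whether the text was set from code or the inspector. A tiny helper (the class and method names are mine) to build such tags:

```csharp
public static class EmojiTags
{
    // Builds TMP rich-text sprite tags, e.g. ByIndex(3) -> "<sprite=3>".
    // The index refers to a glyph in the TMP Sprite Asset assigned
    // to the text component.
    public static string ByIndex(int index) => $"<sprite={index}>";
    public static string ByName(string name) => $"<sprite name=\"{name}\">";
}
```

Usage would be something like `chatText.text = "Well... " + EmojiTags.ByIndex(3);`, which renders the fourth sprite in the asset in place of the tag.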