I’ve been porting a large project to the Xbox One on the UWP platform. Our game uses “D3D”, rather than “XAML”, for performance reasons. I’ve been working with Unity 5.6.3p4 mostly, though everything below seems to be true for 2017.2 as well.
I submitted a bug for an issue I’ve been struggling with: on this platform, selecting an InputField does not launch the On-Screen Keyboard (OSK) for text entry. Unity launches a touch keyboard automatically on mobile platforms (including, I think, UWP on Windows Phone)… but it definitely does not automatically handle text entry on Xbox One.
While TouchScreenKeyboard.isSupported reports false on this platform, I found that TouchScreenKeyboard.Open() does work: it opens the OSK, and the entered text can be retrieved after the user presses the Menu/submit button.
Building a workaround on top of this seems like it would be trivial; and indeed that is what Unity rep Tautvydas-Zilys suggests in this forum thread.
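For context, the core of the workaround looks roughly like this (a sketch, not our exact code; the component and method names are made up for illustration, and the script assumes the InputField is wired up in the Inspector):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hypothetical sketch of the workaround: open the OSK manually when an
// InputField is selected, then copy the result back once the user submits.
public class XboxOskWorkaround : MonoBehaviour
{
    public InputField targetField;          // assigned in the Inspector
    private TouchScreenKeyboard keyboard;

    // Hook this up to the InputField's selection event.
    public void OnFieldSelected()
    {
        // Open() works on Xbox One even though isSupported reports false.
        keyboard = TouchScreenKeyboard.Open(targetField.text,
                                            TouchScreenKeyboardType.Default);
    }

    void Update()
    {
        // done is set after the user presses the Menu/submit button.
        if (keyboard != null && keyboard.done)
        {
            targetField.text = keyboard.text;
            keyboard = null;
        }
    }
}
```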
However, there are two major issues I’ve had trying this, which aren’t discussed elsewhere:
- Cancelling the keyboard with the “B” button is not recognized by Unity at all. The keyboard disappears and no other controller input is accepted, but the TouchScreenKeyboard instance and the static variables are all unchanged when this happens, so there’s no way to detect this state. (I tried the “visible”, “active”, “done”, “status”, and “wasCanceled” flags, with no success.)
- The OSK’s text entry field is not shown (i.e. the white box above the keyboard that shows the text being entered). For our InputFields on the lower half of the screen, this means the player can’t read the text they’re entering.
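To be concrete about the first issue, this is roughly the polling I tried (a sketch; the handler names are hypothetical, and “keyboard” is the instance returned by TouchScreenKeyboard.Open()). After a B-button cancel, none of these conditions ever become true on Xbox One:

```csharp
void Update()
{
    if (keyboard == null)
        return;

    if (keyboard.wasCanceled)
    {
        // Never set after cancelling with B on Xbox One.
        OnKeyboardCancelled();          // hypothetical handler
        keyboard = null;
    }
    else if (keyboard.done)
    {
        // Only set after the Menu/submit button.
        OnKeyboardDone(keyboard.text);  // hypothetical handler
        keyboard = null;
    }
    else if (!keyboard.active || !TouchScreenKeyboard.visible)
    {
        // Also never triggers; these flags stay stale after a B cancel.
        OnKeyboardCancelled();
        keyboard = null;
    }
}
```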
So my question (preferably for a Unity dev): I successfully created a native plugin that lets me fix #1. Do I have to do the same to fix #2?
More importantly, does Unity plan to implement this missing functionality? This is important UI functionality, and I assume it gets relatively few mentions on the forums only because there aren’t very many Xbox One UWP games made in Unity. The keyboard can obviously be launched and used to some extent, since TouchScreenKeyboard (mostly) functions; why isn’t the full functionality there to handle all of this automatically, without the need for a native plugin and a workaround script?
This issue seems to imply that it’s “by design” that D3D apps on UWP are unable to support the on-screen keyboard, and that this is only supported on XAML. However, we need to use D3D, and we need this functionality.
I hope a Unity dev can advise on the right way to go here, or at least someone who’s dealt with this problem and can confirm I’m going down a reasonable path with this workaround.