I’ve been building my project and testing with an Oculus Rift, but I also want to support OpenXR. All of my input thus far has used calls of the general form
InputDevice device = InputDevices.GetDeviceAtXRNode(node);
float value = 0f;
device.TryGetFeatureValue(usage, out value);
// yes, I know, I've left some stuff out (the node/usage variables, checking the bool return, etc.)
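For context, the fuller pattern looks roughly like this (a sketch only; the class name is a placeholder, and XRNode.RightHand / CommonUsages.trigger stand in for the various nodes and usages my real code passes in):

```csharp
using UnityEngine;
using UnityEngine.XR;

public class TriggerReader : MonoBehaviour
{
    void Update()
    {
        // Look up the device fresh each frame; devices can connect/disconnect.
        InputDevice device = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);

        // TryGetFeatureValue returns false if the device or usage is unavailable.
        if (device.isValid &&
            device.TryGetFeatureValue(CommonUsages.trigger, out float value))
        {
            // value is the trigger pull in [0, 1]; my real code does this
            // for a bunch of different usages.
        }
    }
}
```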
As-is, the game works just fine both in the editor and in a build with the Rift. However, when I try to enable support for OpenXR, I get the error “This project is using the new input system package but the native platform backends for the new input system are not enabled in the player settings. This means that no input from native devices will come through.”

The documentation on this talks about creating an input asset, defining actions, and so on. Do I need to go through all that if I’m already using the code above everywhere? In order to test on a Vive, I need to physically travel to a different location, so I want to get it right the first time.

If I enable these “backends”, what am I going to need to do to make sure my game keeps functioning correctly on both platforms? Do I need to restructure my input, or can I just enable the backends and leave everything else as-is, using the function calls above?