Unity XR Input: Possible to simulate input events via a fake InputDevice?

Hi,

I’d like to be able to simulate 6dof-tracked controller input in the Unity Editor when in play mode. I’m not aware of any headset emulation solutions so I’m developing my own (simple mouse-look for head motion plus WASD motion controls). To simulate the tracked controllers, I use the mouse and a script attached to the controller objects.
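
For context, the head-simulation part is nothing fancy, roughly something like this (simplified sketch; names are mine):

using UnityEngine;

// Simplified editor-only head rig: mouse-look on the camera, WASD to move.
public class SimulatedHead : MonoBehaviour
{
  public float lookSpeed = 2f;   // degrees per mouse unit
  public float moveSpeed = 2f;   // meters per second

  private float m_yaw;
  private float m_pitch;

  private void Update()
  {
    // Mouse-look: accumulate yaw/pitch from mouse movement
    m_yaw += Input.GetAxis("Mouse X") * lookSpeed;
    m_pitch = Mathf.Clamp(m_pitch - Input.GetAxis("Mouse Y") * lookSpeed, -89, 89);
    transform.localRotation = Quaternion.Euler(m_pitch, m_yaw, 0);

    // WASD: move in the plane of the current view direction
    Vector3 move = transform.right * Input.GetAxis("Horizontal") +
                   transform.forward * Input.GetAxis("Vertical");
    transform.position += move * moveSpeed * Time.deltaTime;
  }
}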

However, simulating button presses (grip, trigger, etc.) would, I think, require writing an InputDevice that the system could discover. Is there a way to do this? Or is there an alternative way I could go about writing a simulation environment?

Thank you,

Bart

Wrap the InputDevice queries in your own little module. Make the rest of your code go through that module rather than accessing InputDevice directly. Now you can have that module work by other means (standard Input) when running on desktop.
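
Something like this, as a rough sketch (the class and method names are just placeholders):

using UnityEngine;
using UnityEngine.XR;

// Thin input module: the rest of the game asks this class for button state instead of
// querying InputDevice directly, so the desktop/editor path can fall back to the keyboard.
public static class SimulatableInput
{
  // True in the editor or whenever no headset is active: use keyboard/mouse fallbacks.
  public static bool useDesktopFallback = !XRSettings.isDeviceActive;

  public static bool GetTrigger(XRNode hand, KeyCode fallbackKey)
  {
    if (useDesktopFallback)
    {
      return Input.GetKey(fallbackKey);
    }

    InputDevice device = InputDevices.GetDeviceAtXRNode(hand);
    return device.isValid &&
           device.TryGetFeatureValue(CommonUsages.triggerButton, out bool pressed) &&
           pressed;
  }
}

Then gameplay code only ever calls e.g. SimulatableInput.GetTrigger(XRNode.RightHand, KeyCode.Mouse0) and never touches InputDevice itself.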

1 Like

Unfortunately I’m using the Unity XR Interaction Toolkit, which provides various scripts that use InputDevice. These should not be modified. I’m wondering if there is a way to extend InputDevice and then inject that into the input system. If not, it seems like a feature Unity should support.

Oh. Well yeah, can’t help you in that case. :slight_smile:

Seems like a good idea to me.

Unity staff on this forum have said that the interfaces for making custom VR Plugins are open to anyone, for free - XR Input Toolkit 2020 FAQ – Snap and Plug? - so I’d start there: opt-in for the SDK for making a custom plugin, and see if you can make a software-only InputDevice easily.

(and if not … I’d log bugs against the SDK stuff, because IMHO this use-case really should be supported, and it would be a benefit for Unity + everyone to have reference implementations / software mocks … but I suspect that this stuff might already be something they provide if you sign up for that, because it’s so useful)

1 Like

I really hope Unity adds a way to create mock inputs for custom HMD simulators! There should be a public interface for this.

For now, I found a kludge to accomplish what I want: reflection. My controller simulator script lets the controller be moved around the screen with the mouse (left button held), forwards and backwards along the z axis with the scroll wheel, and it feeds Unity XR interaction events into Unity’s XRController script (which must also be present on the object) by poking at its internals.

Specifically, there is an internal method named UpdateInteractionType() that is used for input playback, apparently. Obviously, this implementation detail can change at any time but it will probably be easier to maintain this approach than trying to edit the package scripts.

It’s not a great solution but it suffices for now.

using System;
using System.Collections.Generic;
using System.Reflection;
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

public class EditorControllerSimulator : MonoBehaviour
{
#if UNITY_EDITOR
  public float controllerDefaultDistance = 1;
  public float scrollWheelToDistance = 0.1f;
  public KeyCode selectKey = KeyCode.Mouse1;
  public KeyCode activateKey = KeyCode.KeypadEnter;

  private XRController m_xrController;
  private float m_distance = 0;

  // Finds a nested type on the given object's class by name (used to grab XRController's
  // internal InteractionTypes enum).
  private Type GetNestedType(object obj, string typeName)
  {
    foreach (var type in obj.GetType().GetNestedTypes(BindingFlags.NonPublic | BindingFlags.Public))
    {
      if (type.Name == typeName)
      {
        return type;
      }
    }
    return null;
  }

  // Builds a name -> boxed value map for the given enum type.
  private Dictionary<string, object> GetEnumValues(Type enumType)
  {
    Debug.Assert(enumType.IsEnum);
    Dictionary<string, object> enumValues = new Dictionary<string, object>();
    foreach (object value in Enum.GetValues(enumType))
    {
      enumValues[Enum.GetName(enumType, value)] = value;
    }
    return enumValues;
  }

  private void UpdateXRControllerState(string interaction, KeyCode inputKey)
  {
    // Read the simulated button state from the keyboard/mouse
    bool state = Input.GetKey(inputKey);

    // Resolve XRController's internal InteractionTypes enum and UpdateInteractionType()
    // method via reflection, then feed the simulated state into the controller as if it
    // came from a real device
    Type interactionTypes = GetNestedType(m_xrController, "InteractionTypes");
    Dictionary<string, object> interactionTypesEnum = GetEnumValues(interactionTypes);
    MethodInfo updateInteractionType = m_xrController.GetType().GetMethod("UpdateInteractionType", BindingFlags.NonPublic | BindingFlags.Instance);
    updateInteractionType.Invoke(m_xrController, new object[] { interactionTypesEnum[interaction], (object)state });
  }

  private void LateUpdate()
  {
    float scroll = Input.mouseScrollDelta.y;
    if (Input.GetMouseButton(0) || scroll != 0)
    {
      // Scroll wheel controls depth
      m_distance += scroll * scrollWheelToDistance;
      float depthOffset = controllerDefaultDistance + m_distance;

      // Mouse position sets position in XY plane at current depth
      Vector3 screenPos = Input.mousePosition;
      Ray ray = Camera.main.ScreenPointToRay(screenPos);
      Vector3 position = ray.origin + ray.direction * depthOffset;
      transform.position = position;
    }

    // Interaction states
    UpdateXRControllerState("select", selectKey);
    UpdateXRControllerState("activate", activateKey);
  }

  private void Awake()
  {
    m_xrController = GetComponent<XRController>();
  }
#endif
}
3 Likes

This is a good idea, although sadly beyond the scope of what I have time for. However, filing bugs against the SDK might be more feasible.

Hi guys,

I’m still stuck on the same problem. For UI events, not being able to fake inputs in the Editor is quite a big deal. If anybody manages to get this working (thanks trzy for your code, but I haven’t managed to make it work yet), I’d be happy to hear about a workaround.

Since I haven’t seen this mentioned here: Unity is working on a “simulated HMD” for the Editor on the short-term roadmap. I’m quite sure I saw this somewhere but I can no longer find the link. I will edit this when (if) I find it.

Have you guys looked into the Controller Recorder? It seems like another possible approach.

The simulated HMD: XR Input Toolkit 2020 FAQ – Snap and Plug?

2 Likes

I don’t know about UI events (I’ve actually never used Unity’s UI systems), but I can share my simulation code for the XR Interaction Toolkit. It’s quite unpolished: it works by peering into the guts of XRController and calling internal methods directly. This is obviously brittle, and if Unity changes the implementation of XRController substantially, the code will have to be adjusted accordingly.

The Unity interaction model abstracts away buttons in favor of high-level actions (e.g., ‘select’ and ‘activate’). My script lets you map these to keyboard keys or mouse buttons. But since I also need access to the lower-level “trigger” and “grip” buttons, I have a small abstraction layer around those. It is up to you to ensure that “select”/“activate” are kept consistent with “trigger”/“grip” in EditorControllerSimulator’s properties.

I’ve attached the scripts and below is a screenshot demonstrating how to wire them up. It’s a little funky for now.

Wiring Up the Scripts

  1. Make sure an XRController is present on both controllers.

  2. Include the EditorControllerSimulator script on only one of the controllers, not both. Why? Because I bind a key (the “Switch Controller Key” property) that lets you toggle between controllers. The script automatically finds the next controller and, at start-up, defaults to the controller it is attached to. It’s probably a good idea to modify the script so that it lives on its own object outside of the VR rig and, at start-up, either searches for an initial controller to grab or exposes a property you can set.

  3. Add ControllerInput to both controllers. I use this to get the button state; it’s just a thin wrapper over Unity’s functions (a rough sketch follows this list). EditorControllerSimulator also uses it to inject fake button presses when playing from the editor (there is no way that I’m aware of to route simulated inputs into Unity’s API).
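
ControllerInput itself isn’t pasted here, but conceptually it’s something like this (simplified sketch, not the exact attached script):

using UnityEngine;
using UnityEngine.XR;

// Thin wrapper over the XR device for one controller. The simulator can flip the
// simulated* flags in the editor to inject fake button presses.
public class ControllerInput : MonoBehaviour
{
  public XRNode controllerNode = XRNode.RightHand;

  // Set by EditorControllerSimulator when running in the editor.
  [HideInInspector] public bool simulatedGrip;
  [HideInInspector] public bool simulatedTrigger;

  public bool GetGrip()
  {
    return simulatedGrip || ReadButton(CommonUsages.gripButton);
  }

  public bool GetTrigger()
  {
    return simulatedTrigger || ReadButton(CommonUsages.triggerButton);
  }

  private bool ReadButton(InputFeatureUsage<bool> usage)
  {
    InputDevice device = InputDevices.GetDeviceAtXRNode(controllerNode);
    return device.isValid && device.TryGetFeatureValue(usage, out bool pressed) && pressed;
  }
}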

6003896--646649--Clipboard01.jpg

That’s it!

Usage Instructions

Hit ‘play’ and then hold the left mouse button to move the controller. You should see it respond. Use the scroll wheel to move it along the z axis. Press the right mouse button (Mouse 1) to simulate a “select”. Press Enter to simulate an “activate”. Likewise, these are mapped to simulate grip and trigger, respectively.

Press the back quote (tilde) key to switch control to the other controller.

6003896–646652–ControllerInput.cs (2.9 KB)
6003896–646655–EditorControllerSimulator.cs (3.85 KB)

4 Likes

@trzy Thanks a lot ! That’s a huge help.
I just needed to add the UI press reference in the LateUpdate function (and the corresponding key definition) and it’s working like a charm!

Thanks to you I can avoid going too deep into the preview package code and losing a couple of hours/days to brain overload. Here is a virtual cookie and a lot of sympathy =)

// Class beginning
public KeyCode activateUI = KeyCode.E;

// LateUpdate
UpdateXRControllerState("uiPress", activateUI);

I actually released a VR simulator for the XR Interaction Toolkit on the Asset Store in April, but it works pretty much the same way, by updating the interaction type. So there’s no need to get it if you got this solution to work. :wink: Planning on adding a couple more features though.

@trzy Thanks for your scripts.

I used them to create a VR simulator using a gamepad as input. You can find more information and the scripts in this thread: Building a VR simulator using a game pad as input for Unity XR

Hi!

Does @trzy 's solution work for everyone with canvases? I can see them react to the ray passing over the buttons and activating the hover state, but pressing the grips or triggers won’t actually click them.

I think I might be able to provide a solution for simulating devices soon. I started building an XRInputSubsystem for ARSimulation that allows creating devices with usages on the managed side. If anyone is interested in testing, feel free to send me a DM (as of now I’ve only tested/built the plugin for Windows).

Here is an example of what the API looks like right now if you want to create custom devices (a controller with one trigger in this case).

Devices injected that way are discovered via InputDevices.GetDevices(devices);
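
For anyone who wants to check whether injected devices show up, the consuming side is just the standard UnityEngine.XR query API (nothing plugin-specific), roughly:

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Enumerates every currently connected XR InputDevice (real or injected by a simulation
// provider) and logs whenever a device that exposes a trigger reports it as pressed.
public class ListXRDevices : MonoBehaviour
{
  private void Update()
  {
    var devices = new List<InputDevice>();
    InputDevices.GetDevices(devices);

    foreach (InputDevice device in devices)
    {
      if (device.TryGetFeatureValue(CommonUsages.triggerButton, out bool pressed) && pressed)
      {
        Debug.Log($"{device.name} trigger pressed");
      }
    }
  }
}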

So, for example, this component just works as if you had an HMD connected, but it’s really just another transform in the editor:
6388773--712086--upload_2020-10-6_16-53-21.png

So far I’ve tested creating controllers and headsets that way:
6388773--712098--upload_2020-10-6_17-0-53.png

6388773--712101--upload_2020-10-6_17-1-3.png

I think I managed to get hand simulation working now too!
(The gizmos are drawn by collecting the bone positions from the hand InputFeatureUsage.)
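
For reference, reading those bone positions on the managed side goes through the stock UnityEngine.XR hand API (nothing specific to my plugin); roughly:

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Draws gizmo spheres at the finger bone positions reported by a hand-tracking device.
// Note: bone positions are in tracking space, so offset by your rig transform if needed.
public class HandBoneGizmos : MonoBehaviour
{
  public XRNode handNode = XRNode.LeftHand;

  private void OnDrawGizmos()
  {
    InputDevice device = InputDevices.GetDeviceAtXRNode(handNode);
    if (!device.isValid || !device.TryGetFeatureValue(CommonUsages.handData, out Hand hand))
    {
      return;
    }

    var bones = new List<Bone>();
    foreach (HandFinger finger in System.Enum.GetValues(typeof(HandFinger)))
    {
      if (!hand.TryGetFingerBones(finger, bones))
      {
        continue;
      }

      foreach (Bone bone in bones)
      {
        if (bone.TryGetPosition(out Vector3 position))
        {
          Gizmos.DrawSphere(position, 0.005f);
        }
      }
    }
  }
}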

Edit: here are some videos where I tested it with some VR Rig that uses controller input x.com

Oh, so it is possible to create custom devices on the managed side? Interesting … I tried that a while ago but ran into several issues.

I am currently using Unity’s XR SDK to create a native plugin for a custom XR provider. It’s working quite well; you even get input signals via the Legacy Input Helpers and generic XR input bindings with the new Input System. But in the end it’s basically doing the same thing: routing inputs to a custom XR device and controlling it. I’ll soon transition my VR Simulator asset to this new system, so it’s no longer as bound to the XR Interaction Toolkit. I’ll probably also release a stripped-down free version of my asset in the process.
If anyone is interested in doing something similar, I can give some directions. :slight_smile:

1 Like

It is, or will be, with my plugin (I also built an XRPluginSubsystem). What kind of issues did you have?

I built it in a way that lets me create and control devices on the managed side, so I don’t have to touch the unmanaged side again if I need another type of device (at least that’s the goal/idea).

Ah, I see, so your plugin exposes the methods to create devices with features to the managed side. That’s a great way of doing things … rebuilding those native plugins every time you update is a pain.
As for issues: I managed to create devices purely on the unmanaged side but wasn’t able to manipulate them in any useful way.

Yes exactly!
Let me know if you want to test the plugin :slight_smile: