Support for custom input solutions?

Hi,

Very excited for the new Input System, and I’m hoping to finally be able to use it moving forward.

I’d like to know whether the following is something I can add custom support for: I currently receive touch input in the form of TUIO, and I want to know if it’s possible to use these inputs to trigger the Unity events for things like UI clicks.

Any feedback / help is greatly appreciated.
Thanks,
Rob.

You’re free to make up any input with the API, be that touch or whatever other device input.

// Add a touchscreen somewhere in your init code.
m_Touchscreen = InputSystem.AddDevice<Touchscreen>();

// Feed it touches in your update code. Note that if it's not
// done from InputSystem.onBeforeUpdate, the input will arrive one
// frame late.
// NOTE: Touch IDs *must* be non-zero. A touch must have a unique ID for its duration.
// NOTE: Every touch must go through a Began...Ended/Canceled cycle. Stationary is only
//       used by EnhancedTouch ATM and is set automatically.
// NOTE: Deltas are computed automatically if not set.
InputSystem.QueueStateEvent(m_Touchscreen, new TouchState {
    touchId = 1, phase = TouchPhase.Began, position = new Vector2(123, 234) });
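
To round out the lifecycle notes above, the same touch is later moved and ended by queueing further states with the same ID, for example (the values here are just illustrative):

// Move the touch in a later update...
InputSystem.QueueStateEvent(m_Touchscreen, new TouchState {
    touchId = 1, phase = TouchPhase.Moved, position = new Vector2(130, 240) });

// ...and complete the cycle with Ended (or Canceled).
InputSystem.QueueStateEvent(m_Touchscreen, new TouchState {
    touchId = 1, phase = TouchPhase.Ended, position = new Vector2(130, 240) });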

This is a fantastic response! Thank you so much, this looks a lot easier than I was expecting. Thanks for your help.

Hi again.

I’ve been implementing this. It looks like QueueStateEvent lives on InputSystem, or at least that’s the only place I’ve been able to find it. Is that correct?

Secondly, I can’t seem to get my touches to, say, click a UI button. I’m not sure if I’m missing anything.

I’ve cloned the DefaultInputActions file and I’m instantiating that within my code.

I’ve tried the following setup:

_unityInputActions = new UnityInputActions();
_unityInputActions.UI.Click.performed += context => Debug.Log("UI Click");
_button.onClick.AddListener(() => Debug.Log("Button Click"));

The “UI Click” debug log fires, but the button’s onClick listener below doesn’t, so it doesn’t appear the touches are actually hitting the button.

Note: I am enabling / disabling the input actions in OnEnable and OnDisable.

Once again any support is appreciated.

Ah doh, yes. Corrected it in the original post.

There are two things required to make the connection to uGUI work: you need an InputSystemUIInputModule instead of StandaloneInputModule, and the module needs a connection to your actions.

To replace StandaloneInputModule from the legacy input system with InputSystemUIInputModule, go to the EventSystem object that Unity automatically adds and in the inspector for StandaloneInputModule, click the “Replace with InputSystemUIInputModule” button.

To connect InputSystemUIInputModule to your actions, either simply drag your .inputactions asset onto the “Actions Asset” property in the inspector or assign the actions manually in code.

_unityInputActions = new UnityInputActions();

var inputModule = (InputSystemUIInputModule)EventSystem.current.currentInputModule;
inputModule.leftClick = InputActionReference.Create(_unityInputActions.UI.Click);
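
If you’d rather not wire up each action reference individually, assigning the whole asset should also work (a sketch, assuming the generated wrapper class exposes its InputActionAsset via an “asset” property):

var inputModule = (InputSystemUIInputModule)EventSystem.current.currentInputModule;

// Hand the module the entire actions asset; it resolves its action
// references (point, click, etc.) against the actions in the asset.
inputModule.actionsAsset = _unityInputActions.asset;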

Hi Rene,

I’ve had time to look at this again, so I will lay out my setup to give a complete picture.
In your last post you mentioned linking the actions asset to the EventSystem, and I have that set up as shown below, but I’m still not able to get UI interactions. NOTE: I can get the button to click using the mouse, just not the touch. However, I still get the callback from
_unityInputActions.UI.Click.performed for touch.

TouchSystem.cs (Where I manage TUIO Input)

public class TouchSystem
{
    private Touchscreen _touchscreen;

    public TouchSystem()
    {
        _touchscreen = InputSystem.AddDevice<Touchscreen>();
    }

    private void OnTouchBegin(TouchPoint touchPoint)
    {
        // Called on a new touch from the hardware.
        // localPosition is the TUIO point mapped into screen space (computed elsewhere).
        var touchState = new TouchState {touchId = touchPoint.Id, phase = TouchPhase.Began, position = new Vector2(localPosition.x, localPosition.y)};
        InputSystem.QueueStateEvent(_touchscreen, touchState);
    }

    private void OnTouchUpdated(TouchPoint touchPoint)
    {
        var touchState = new TouchState {touchId = touchPoint.Id, phase = TouchPhase.Moved, position = new Vector2(localPosition.x, localPosition.y)};
        InputSystem.QueueStateEvent(_touchscreen, touchState);
    }

    private void OnTouchEnd(TouchPoint touchPoint)
    {
        var touchState = new TouchState {touchId = touchPoint.Id, phase = TouchPhase.Ended, position = new Vector2(localPosition.x, localPosition.y)};
        InputSystem.QueueStateEvent(_touchscreen, touchState);
    }
}
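
Side note, following the comment in the first reply: if these TUIO callbacks don’t run during InputSystem.onBeforeUpdate, the queued input will arrive a frame late, and if your TUIO client invokes them off the main thread, calling QueueStateEvent directly may not be safe. A minimal sketch of buffering and flushing instead, with TouchEventPump as a hypothetical helper name:

using System.Collections.Concurrent;
using UnityEngine.InputSystem;
using UnityEngine.InputSystem.LowLevel;

public class TouchEventPump
{
    private readonly Touchscreen _touchscreen;
    private readonly ConcurrentQueue<TouchState> _pending = new ConcurrentQueue<TouchState>();

    public TouchEventPump()
    {
        _touchscreen = InputSystem.AddDevice<Touchscreen>();
        // Flush right before the input system updates so events land in the same frame.
        InputSystem.onBeforeUpdate += Flush;
    }

    // Call this from OnTouchBegin/OnTouchUpdated/OnTouchEnd instead of
    // queueing directly.
    public void Enqueue(TouchState state) => _pending.Enqueue(state);

    private void Flush()
    {
        while (_pending.TryDequeue(out var state))
            InputSystem.QueueStateEvent(_touchscreen, state);
    }
}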

TouchTest.cs (Testing Input)

public class TouchTest : MonoBehaviour
{
    // This is a direct clone of the DefaultInputActions provided by Unity.
    // I've cloned it because you can't access DefaultInputActions to create a new instance.
    private UnityInputActions _unityInputActions;

    [SerializeField] private Button _button; // the UI button under test

    private void Awake()
    {
        _unityInputActions = new UnityInputActions();
        _unityInputActions.UI.Click.performed += context => Debug.Log("Click");
        _button.onClick.AddListener(() => Debug.Log("Button Click"));
    }

    private void OnEnable()
    {
        _unityInputActions.Enable();
    }

    private void OnDisable()
    {
        _unityInputActions.Disable();
    }
}

I have an EventSystem set up as shown here.

Back to trying this again.

var touchState = new TouchState {touchId = wallPoint.Id, phase = TouchPhase.Began, position = new Vector2(wallPoint.X, wallPoint.Y)};
InputSystem.QueueStateEvent(_touchscreen, touchState);

Now I know this is being queued because I subscribe to InputSystem.onEvent and can see the response.

InputSystem.onEvent += (ptr, device) =>
{
    if (device == _touchscreen)
        Debug.Log($"OnEvent: {device.name}");
};

However, I get nothing from Unity in terms of the touches actually being sent through to the UI or any action mappings.

Any help is greatly appreciated; I’ve spent a day banging my head against this, and reading the documentation hasn’t helped.

For debugging, I can suggest some steps.

While this is running and your code is pumping input, pop open the Input Debugger (Window > Analysis > Input Debugger). Then:

  • Double-click the Touchscreen device your code added and observe the event trace in the bottom section as well as the control tree in the middle section. The first thing to make sure of is that, at the level of the device, everything works as intended; actions can be debugged in a separate step. You’d want to verify that a) the expected events are coming in as per the trace, and b) the controls in the tree view change state accordingly (a quick code-side check for this is sketched below). If that works, next:
  • Locate the UI actions in the debugger tree view and make sure that the controls from your virtual touchscreen appear underneath the respective actions. If that is not the case, there’s a problem at the binding level.
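
If it helps, the first check can also be done from code by polling the device each frame, for example (a sketch assuming the _touchscreen reference from your TouchSystem is accessible):

// Log the synthesized primary touch while it is pressed; if this never fires,
// the queued events aren't updating the device state.
private void Update()
{
    if (_touchscreen.primaryTouch.press.isPressed)
        Debug.Log($"Primary touch at {_touchscreen.primaryTouch.position.ReadValue()}");
}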

Thanks for the response,

I’ve checked the debugger, and with some fixes I’m now getting the events; begin, update, and end are all working.

I have a question regarding point 2.

As I’ve implemented this as a Touchscreen using InputSystem.AddDevice<Touchscreen>(), would this trigger the Primary Touch / Tap?

My understanding is that we’re creating a Touchscreen and manually providing the events, rather than Unity receiving them through a mobile touchscreen. Below are the bindings I’m using.

Thanks for the help.

[Screenshot: the UI action bindings]

Hi, sorry for “necroposting”, but there is not much information about TUIO on Unity. I was using TouchScript for a while, but I was wondering if @TheRobWatling managed to get a full multitouch TUIO system working with the new input system (including not only touches but also objects). Do you have any tips?

With TouchScript it was as easy as adding the script and setting the port, but then I had to add many tweaks and it was not very efficient (it is also not compatible with the new input system).

Thanks

I’m afraid I never got it working with the new input system and ended up writing my own setup to cover what I needed. It would have been great to get it working.