Which platform? On Windows, OS-level mouse simulation from touch can throw things off (though I vaguely remember some fix in this area to our native code).
Yes on Windows, using a Microsoft Surface Device, not the touch simulator.
I remember that pre-1.0 there was an issue with this and that it was fixed, but it seems the duplication/disambiguation-of-input-devices issue is back.
I assume it will work without issue on Android/iOS, since there won't be a conflict between keyboard/mouse and touch.
Windows 10
Unity 2021.1.16f1
InputSystem 1.1.0-Pre.6
Yeah, surprised about that too. I also remember there was some fix related to this.
When you pop open the Input Debugger, open both the touch and the mouse device, and then touch the screen, do you see parallel input on both the mouse and the touch device?
Yes there is… the IDs are different, but both Touch and Mouse receive input.
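As a workaround while this is being looked at, one sketch (assuming you don't need a real physical mouse while the touchscreen is in use) is to disable the Mouse device whenever a Touchscreen is present, so the OS-simulated mouse events from touch never reach your actions:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

public class DisableSimulatedMouse : MonoBehaviour
{
    void OnEnable()
    {
        // If a touchscreen exists, disable the Mouse device so the
        // mouse events Windows synthesizes from touch don't come through.
        // Caveat: this also blocks input from a real, physical mouse.
        if (Touchscreen.current != null && Mouse.current != null)
            InputSystem.DisableDevice(Mouse.current);
    }

    void OnDisable()
    {
        // Restore the mouse when this component is disabled.
        if (Mouse.current != null)
            InputSystem.EnableDevice(Mouse.current);
    }
}
```

This is a blunt instrument; it only makes sense if touch is the sole pointer you expect on that hardware.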
I've also noted (not sure if it's related) that if I look at OnControlsChanged with auto-switch on, I see Touch being activated and then Mouse right after.
public void OnControlsChanged(PlayerInput playerInput)
{
    if (playerInput.GetDevice<Touchscreen>() != null) // Touchscreen is also a Pointer, so check it first.
        m_ControlStyle = ControlStyle.Touch;
    else if (playerInput.GetDevice<Pointer>() != null)
        m_ControlStyle = ControlStyle.KeyboardMouse;
    else if (playerInput.GetDevice<Gamepad>() != null || playerInput.GetDevice<Joystick>() != null)
        m_ControlStyle = ControlStyle.GamepadJoystick;
    else
        Debug.LogError("Control scheme not recognized: " + playerInput.currentControlScheme);

    Debug.Log("Control scheme: " + playerInput.currentControlScheme);
}
Output after a touch:

Control scheme: Touch
Control scheme: Mouse
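To stop PlayerInput from hopping to Mouse right after the touch, one sketch is to turn off auto-switching and pin the scheme yourself (assuming your actions asset defines a control scheme named "Touch"; adjust the name to match yours):

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

public class LockToTouchScheme : MonoBehaviour
{
    void Start()
    {
        var playerInput = GetComponent<PlayerInput>();

        // Prevent PlayerInput from auto-switching to Mouse when the
        // OS-simulated mouse events arrive right after a touch.
        playerInput.neverAutoSwitchControlSchemes = true;

        // Explicitly bind to the Touch scheme while a touchscreen exists.
        // "Touch" is an assumed scheme name from your .inputactions asset.
        if (Touchscreen.current != null)
            playerInput.SwitchCurrentControlScheme("Touch", Touchscreen.current);
    }
}
```

The trade-off is that you now own scheme switching: if the player plugs in a gamepad, you have to call SwitchCurrentControlScheme yourself.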
This still hasn't been resolved for me; my game works fine on Android but touch behaves badly on Windows.
UI interaction just doesn't work at all on my Surface. Touch on Windows is a very important selling point for me. It's odd that this doesn't work correctly when I'm sure Windows has a well-established API for it.