Create two virtual joysticks (touch) with the new input system

Hi,

I really like this new input system, but I’m having a hard time actually getting to know it.

So, as the title says, I’m trying to create two virtual joysticks (one for aim, one for move).

I have noticed that there’s a script called On-Screen Stick, but looking inside I don’t know how to send an event to a specific Controller/Action. Also, the output seems to be a little laggy.
I’m selecting Touchscreen/Position, not sure if that’s correct.

So, I guess my questions are:

  • Is it possible to configure a specific Controller (like MyLeftVirtualJoystick) inside InputActions (or wherever) and send events to it whenever I want?
  • How do I properly use On-Screen Stick so it’s less laggy and works with two joysticks?

Thanks!

So this is what I did:
First, I created an Input Actions asset where the Player map has a Move action (Value, Vector2) with a Left Stick [Gamepad] binding.

and added this to the player controller script:

private Vector2 prevInput;
private Controls controls;

private void Awake()
{
    controls = new Controls();
    controls.Player.Move.performed += ctx => SetMove(ctx.ReadValue<Vector2>());
    controls.Player.Move.canceled += ctx => ResetMove();
}

private void OnEnable() => controls.Enable();
private void OnDisable() => controls.Disable();

private void Update() => Move();

private void SetMove(Vector2 moveInput) => prevInput = moveInput;
private void ResetMove() => prevInput = new Vector2(0f, 0f);

private void Move()
{
    Vector3 r = transform.right, f = transform.forward;
    r.y = 0;
    f.y = 0;
    transform.position += (r.normalized * prevInput.x + f.normalized * prevInput.y) * 2f * Time.deltaTime;
}

Here in Awake we create a new Controls object and hook SetMove() and ResetMove() up to the performed and canceled events. Those events are raised only when the value changes, so if you relied on them alone and held the stick at a fixed position, you would not keep moving in that direction; that is why we cache the last value in prevInput.

So we simply set and reset prevInput, and update the transform position according to prevInput every frame.

Similarly you can make a right stick and add a mapping to rotate the camera.
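For reference, the camera half could look something like this (a sketch only; the Look action name, the turn speed, and reuse of the controls field from the snippet above are all assumptions, not part of the original asset):

```csharp
// Sketch: assumes a Player/Look action (Value, Vector2) bound to
// Right Stick [Gamepad] exists in the same Controls asset.
private Vector2 lookInput;

// Subscribe alongside the Move callbacks in Awake():
//   controls.Player.Look.performed += ctx => lookInput = ctx.ReadValue<Vector2>();
//   controls.Player.Look.canceled  += _   => lookInput = Vector2.zero;

private void Look()
{
    // Yaw with the stick's x axis; a child camera could use
    // lookInput.y for pitch. The 90 deg/s speed is arbitrary.
    transform.Rotate(0f, lookInput.x * 90f * Time.deltaTime, 0f);
}
```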

And for the On-Screen Stick, set its Control Path to Left Stick [Gamepad].

I learned about the new input system a few days back here.


Setting up joysticks (or any device) with specific roles is possible and can be used inside the bindings for actions as well as with OnScreenControls.

Any device has an arbitrary set of “usages” that can be applied to it. These are just string tags that can be applied dynamically at runtime using InputSystem.SetDeviceUsage or InputSystem.AddDeviceUsage.

One way to exploit these usages is with the path strings. You can drop a binding into text mode with the little T button and then type stuff like “{Left}/trigger” (binds to the “trigger” control of the device of type Joystick which is tagged with the “Left” usage).
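The same usage-tagged paths also work from code, for example when constructing an action directly (a sketch; the action name is made up):

```csharp
using UnityEngine.InputSystem;

// Binds to the "trigger" control of whichever Joystick currently
// carries the "Left" usage tag.
var fire = new InputAction("Fire", binding: "<Joystick>{Left}/trigger");
fire.Enable();
```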

However, a far more elegant way is to tell the system about the usages you intend to use. If you do this, they will show up in the control picker as well. You can apply this to any existing layout in the system through what’s called a “layout override”. Check out the “Custom Device Usages” sample that ships with the package for a working example of this.

// Let's say you want to have two joysticks. One tagged
// with "Left" and one with "Right".
InputSystem.RegisterLayoutOverride(@"
    {
        ""name"" : ""JoystickWithUsageTags"",
        ""extend"" : ""Joystick"",
        ""commonUsages"" : [
            ""Left"", ""Right""
        ]
    }
");

This override writes the settings you supply directly over the built-in Joystick layout. With this done (somewhere in the startup sequence), you will see the following in the control picker.

[Screenshot: control picker listing the Joystick device with Left and Right usages]
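One convenient spot to run the override early enough in the startup sequence (a sketch; BeforeSceneLoad is just one reasonable choice):

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

public static class JoystickUsageSetup
{
    // Runs before the first scene loads, so bindings resolve
    // against the overridden layout from the start.
    [RuntimeInitializeOnLoadMethod(RuntimeInitializeLoadType.BeforeSceneLoad)]
    private static void RegisterOverride()
    {
        InputSystem.RegisterLayoutOverride(@"
            {
                ""name"" : ""JoystickWithUsageTags"",
                ""extend"" : ""Joystick"",
                ""commonUsages"" : [ ""Left"", ""Right"" ]
            }
        ");
    }
}
```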

Now you can bind specifically to the left and the right joystick individually. This works in .inputactions files but also wherever else the control picker is used. I.e. also with OnScreenControls.

Also, this allows setting up control schemes that specifically require a Left and a Right joystick, for example.

There is one more step missing, though. At runtime, you need to actually assign the usages to joysticks for the bindings to become active.

InputSystem.SetDeviceUsage(joystick1, "Left");
InputSystem.SetDeviceUsage(joystick2, "Right");
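To get joystick1 and joystick2 in the first place, one option is to pull them off the device list (a sketch; which physical stick ends up “Left” here depends on enumeration order, so a real game would let the player choose):

```csharp
using System.Linq;
using UnityEngine.InputSystem;

// Tag the first two connected joysticks with the usages the
// layout override declared.
var joysticks = InputSystem.devices.OfType<Joystick>().Take(2).ToArray();
if (joysticks.Length == 2)
{
    InputSystem.SetDeviceUsage(joysticks[0], "Left");
    InputSystem.SetDeviceUsage(joysticks[1], "Right");
}
```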

The way OnScreenControl works ATM, it will have an inherent one-frame lag. Something we should probably fix.

////EDIT: I just realized that OnScreenControls will not create the devices with these usages correctly. Looking at the code, I found an existing FIXME in there about this. ATM the code ignores usages in the paths and does not apply them to the devices. I’ll have a look at that.


Yep, so the problem was that I was trying to drive the virtual joystick with a Touchscreen/Position binding. There’s nothing wrong with the On-Screen Stick script; if I bind it to a gamepad stick, it actually works!

Thank you both!

So how do I actually use OnScreenStick? Should I add a UI Image to a canvas and assign the script to it to actually see and use the joystick?

Is it possible to have my scripts notified when the user releases the stick completely (not just drags it back to (0,0), but actually lifts the thumb)?

And how do the on-screen buttons work? How do I add them, and how do I add visual states for pressed/released?

@datagreed Recommend having a look at the OnScreenControls sample that comes with the package. Some doc improvements for these will land in 1.1.

For OnScreenSticks, add an Image UI object and then add the OnScreenStick component to it. Set “Control Path” to a stick/Vector2 control on a device. Use the Image component to customize the appearance.

For OnScreenButtons, add a Button UI object and then add the OnScreenButton component to it. Set “Control Path” to a button control on a device. Customize the button appearance as normal.
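Both components also expose the path from script via controlPath, so the same setup can be done in code (a sketch; the serialized references are assumed to be wired up in the inspector):

```csharp
using UnityEngine;
using UnityEngine.InputSystem.OnScreen;

public class OnScreenSetup : MonoBehaviour
{
    [SerializeField] private OnScreenStick moveStick;   // on the Image
    [SerializeField] private OnScreenButton jumpButton; // on the Button

    private void Awake()
    {
        // Equivalent to setting "Control Path" in the inspector.
        moveStick.controlPath = "<Gamepad>/leftStick";
        jumpButton.controlPath = "<Gamepad>/buttonSouth";
    }
}
```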

The on-screen controls will create virtual devices. Every distinct type of device (such as “Gamepad”) referenced by the controls will create one instance of that device; if multiple controls reference “Gamepad”, for example, only one Gamepad is created. Each on-screen control component feeds into the input control that it references through its path.

Simply add a component to the stick that implements IPointerUpHandler. You get a call when the user lifts the finger off the screen after a stick interaction.
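That release-detection component could be as small as this (a sketch; exposing a UnityEvent is just one way to forward the notification):

```csharp
using UnityEngine;
using UnityEngine.Events;
using UnityEngine.EventSystems;

// Add next to OnScreenStick on the same GameObject. Fires when the
// pointer is lifted off the stick, regardless of the stick's value.
public class StickReleaseNotifier : MonoBehaviour, IPointerUpHandler
{
    public UnityEvent onReleased;

    public void OnPointerUp(PointerEventData eventData)
    {
        onReleased?.Invoke();
    }
}
```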

@Rene-Damm thank you very much for so detailed answer! Will give it a try.

Recommend having a look at the OnScreenControls sample that comes with the package.

Could you please tell me how I find this sample? Is it some kind of demo scene or something else? I’ve installed the package via the Package Manager; how do I find the sample?

See here.

thanks!

I created my own project that uses an on-screen joystick bound to the gamepad’s left stick. The issue is that the input flickers a lot: if I move the on-screen joystick around, the input I receive (as a Vector2) keeps flickering to (0,0). I even opened the Input Debugger to see what was happening to the gamepad left-stick input, but that received input properly.
I even downloaded the project from here(

) and tried the UI controls. They have the same issue (although once, when I opened the project and tried it, it worked fine, but since then it is back to being flickery). Is there some issue on my end?

Do you have a gamepad connected? If so, a PS4 controller by chance? My guess is there’s a gamepad that’s spamming events and the inputs aren’t suppressed on the actions as they should. Thus causing input to flicker between input from the on-screen controls and input from the gamepad.

Currently, I am using my mouse to move the UI joystick in Unity play mode, with no external controller. Don’t the on-screen controls use gamepad input? At least in the way I set it up, it is using the left stick of the gamepad. So why does it flicker? Also, could you explain the part about the actions being suppressed? I am not sure what that means.

I mean, is there a controller connected to the system? The on-screen controls simply add another gamepad (or whatever device they are set to), but depending on how the actions are set up, they will source input from whatever is available. So an action may connect to both the on-screen gamepad and an actual physical gamepad that is connected.

Easiest to verify in the debugger (“Window >> Analyze >> Input Debugger” in the editor’s main menu).

No, there is no external controller. I am just using the joystick in play mode in Unity.

Here is a video of what is happening:

And this is not just this project. I created a brand-new project and the same thing happens there. I Debug.Log() the Vector2 input in the OnMovement function, and sometimes it flickers to (0,0) while I am still using the UI joystick.

I think I have found why it is behaving like this. My mouse is part of the Keyboard&Mouse scheme, but I am using it to control UI that feeds the Gamepad scheme; because of this, the input fluctuates between the two. I tried running this on my phone using Unity Remote 5 and it seemed to work fine, since there only the Gamepad scheme was in use.
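One possible workaround for that fluctuation (a sketch, not an official fix) is to pin the generated actions to the virtual gamepad the on-screen controls create, so the keyboard/mouse scheme can no longer contribute:

```csharp
using UnityEngine.InputSystem;

var controls = new Controls();

// With no physical pad connected, Gamepad.current is the virtual
// gamepad created by the on-screen controls; restricting the asset
// to it stops other devices from feeding the same actions.
if (Gamepad.current != null)
    controls.devices = new InputDevice[] { Gamepad.current };

controls.Enable();
```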

I had the same issue. I found that the Warriors sample doesn’t work with touch input if I add a canvas and an On-Screen Stick mapped to the gamepad’s left stick; I got flickering input. However, the Rolling example from the Touch Samples works fine.

I found that if I use the mapping approach from the Rolling sample (mapping directly to a generated input controls class), it works great, but if I use a PlayerInput component, it fails to work on a touch screen while working fine with regular devices like keyboard, mouse and gamepad. Is there any workaround to make the PlayerInput component work properly on a touch screen? Direct mapping as in the Rolling sample is not that scalable, and as I understand it, local multiplayer won’t be possible with that scheme.

P.S. I tried Input System package 1.0.1 and 1.1.0-preview2. They behave the same on Unity 2019.4.9 and 2020.2.0b12.

The question I’m facing right now is how to make multi-touch work with the new input system.
I have two On-Screen Sticks: one mapped to movement and another mapped to the camera. How can I use them simultaneously? Where should I dig to find the info?
I can see the touch API, where I can track fingers, but the On-Screen Stick script sits at a much higher level; it just tracks pointer information. Should I write my own script that receives a specific finger’s position?

If i move the on screen joystick around, the input I receive (as a vector2) keeps flickering to 0,0.

I had flickering of the input. However, Rolling example from Touch Samples works fine.

I’m also seeing the flickering of the input. It kinda works sometimes, and it’s just about usable for testing, but definitely not for giving to an end user. Again, the Rolling example works great.

Is there a priority issue or something going on? It feels like I’m just missing a setting somewhere to get it working perfectly.