Old Input System, how can I force touch/clicks via code?

Hello all, hope you’re having a lovely day. :grinning:

Question about the old input system, which we are still using.
What I am attempting to do is create an auto-player for our game. I could go through the game simply invoking the onClick events of buttons, etc., but what I really want is to kick off input at the input-system level. That way the auto-player plays the game exactly as a player would, so if a UI element is blocking whatever I want to touch, I am made aware of it.

Now, I am aware I can use the event system to achieve most of this. However, we have popups in the game that just require you to tap anywhere on the screen to acknowledge them so they disappear. During these popups there is no button to press, so to speak, so the EventSystem can’t be used to get past them. There are other places where Input.GetMouseButtonDown is used directly as well.
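To make that concrete, the popup handling does something along these lines (a simplified sketch; the class name here is made up, but the pattern is the same per-frame polling of the old Input API):

```csharp
using UnityEngine;

// Illustrative only: a "tap anywhere to dismiss" popup driven by polling
// the old Input API directly, with no button and no EventSystem involved.
public class TapAnywherePopup : MonoBehaviour
{
    void Update()
    {
        bool tapped = Input.GetMouseButtonDown(0)
                      || (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began);

        if (tapped)
            gameObject.SetActive(false); // acknowledge and dismiss the popup
    }
}
```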

With this auto-player, I essentially want both it and the game itself to have no knowledge of one another, so ideally I’d prefer not to have to explicitly tell these systems how to advance.

Thus, what I’m wondering is: is there a way to override UnityEngine.Input.GetMouseButtonDown(0) and UnityEngine.Input.touches to return the behaviour I want, i.e. “yes, pressed this frame” or “released this frame”?
I can see that both GetMouseButtonDown and touches are external calls, so there doesn’t appear to be anywhere I can inject overrides.

I did, however, notice that the StandaloneInputModule supplied with the EventSystem in the hierarchy has virtual methods for touches and PointerEventData. But even after replacing it with my own version and overriding the following methods, I have not been able to achieve my goal. This appears to be because Input.GetMouseButtonDown has no interaction with them:

GetTouchPointerEventData
GetMousePointerEventData
GetPointerData
ProcessTouchPress
ProcessMousePress

In short:

  • I require the ability to override Input.GetMouseButtonDown in order to progress through areas of the game that are waiting for that input from the player.
  • If this happens to work with the event system out of the box, great. If not, I can set up the event system to tap on buttons wherever else I need it (rough sketch below).
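For that second point, what I have in mind for the event-system side is simulating a tap at a screen position and clicking whatever the UI raycast hits first. This is an untested sketch using UGUI’s EventSystem and ExecuteEvents; the helper name is made up:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.EventSystems;

public static class UiTapSimulator
{
    // Simulates a tap at a screen position by raycasting through the
    // EventSystem and clicking the topmost result. If something is
    // blocking the intended target, that blocker is what gets hit.
    public static GameObject TapAt(Vector2 screenPosition)
    {
        var pointer = new PointerEventData(EventSystem.current) { position = screenPosition };

        var results = new List<RaycastResult>();
        EventSystem.current.RaycastAll(pointer, results); // sorted topmost-first
        if (results.Count == 0)
            return null;

        var target = results[0].gameObject;
        ExecuteEvents.ExecuteHierarchy(target, pointer, ExecuteEvents.pointerClickHandler);
        return target;
    }
}
```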

Thanks in advance. :slightly_smiling_face:

The old input system is a closed box, so no, you can’t change how it works.

Nonetheless, doing this through inputs isn’t how you should be doing it. You simply need to put a layer between the inputs and what the inputs actually do, then have your auto-player talk to that middle layer instead.

So normally it’d be Inputs -> Interface -> Actions.

Then the auto-player is simply Auto-player -> Interface -> Actions.
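As a rough sketch of what that middle layer could look like (all names here are illustrative, not a prescribed API):

```csharp
using UnityEngine;

// The interface the game's code reads input through, instead of calling
// UnityEngine.Input directly.
public interface IGameInput
{
    bool TapDownThisFrame { get; }   // stands in for Input.GetMouseButtonDown(0)
    Vector2 TapPosition { get; }
}

// Normal play: simply forwards the old Input API.
public class PlayerGameInput : IGameInput
{
    public bool TapDownThisFrame => Input.GetMouseButtonDown(0);
    public Vector2 TapPosition => Input.mousePosition;
}

// Auto-play: the auto-player queues taps; the game code never knows the difference.
public class AutoPlayerGameInput : IGameInput
{
    bool _tapQueued;

    public void QueueTap(Vector2 screenPosition)
    {
        _tapQueued = true;
        TapPosition = screenPosition;
    }

    public bool TapDownThisFrame
    {
        get
        {
            if (!_tapQueued) return false;
            _tapQueued = false; // consume so it reads as a single press
            return true;
        }
    }

    public Vector2 TapPosition { get; private set; }
}
```

Game code then only ever asks the IGameInput it was given, e.g. `if (gameInput.TapDownThisFrame) Dismiss();`, and which implementation sits behind it is decided at startup.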

I figured this would be the case. I was hoping to avoid having a layer in between because we were hoping to use the auto-player across projects, and it feels a little invasive to ask projects to swap out code for an auto-player variant.

Nonetheless, that is the confirmation I needed that trying to override Input in the old system will not work. Thank you.

The interface layer can definitely be made generic and reusable across projects.

That said, both the inputs and the auto-player are going to need context about the game: the former to decide whether certain inputs are actionable, and the latter to know what decisions to make.

You can probably still make a reusable framework for an auto-player, but it would be something individual projects hook into/use to handle their specifics.
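Purely as an illustration of that split, the reusable core could just sequence steps while each project supplies steps that know its own screens, popups, and input layer (hypothetical names):

```csharp
using System.Collections.Generic;

// Reusable part: knows nothing about any particular game, it only runs steps.
public interface IAutoPlayStep
{
    bool CanRun(); // project-specific readiness check, e.g. "the intro popup is open"
    void Run();    // project-specific action, e.g. queue a tap on the shared input layer
}

public class AutoPlayerRunner
{
    readonly Queue<IAutoPlayStep> _steps = new Queue<IAutoPlayStep>();

    public void Enqueue(IAutoPlayStep step) => _steps.Enqueue(step);

    // Call once per frame, e.g. from a MonoBehaviour Update in the host project.
    public void Tick()
    {
        if (_steps.Count == 0)
            return;

        var next = _steps.Peek();
        if (!next.CanRun())
            return;

        next.Run();
        _steps.Dequeue();
    }
}
```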
