Virtual mouse not acting as pointer

I implemented a Virtual Mouse using THIS tutorial on YouTube.

It is a URP project using Unity 2021.3.5f1.

It is a terrain editing application with an interactive UI menu for selecting options.

While using the mouse, all is well. But when I switch to the controller, that is where it goes wrong.

In the editor game window, it will switch and work as intended.
But when I make a build to test it, the mouse and controller pointers do not move together: the controller pointer is reset to the bottom-left of the screen at (0, 0), while the mouse retains its original position.

Secondly, and more importantly, the controller pointer is not registering as a pointer.

(Tested with “EventSystem.current.IsPointerOverGameObject()” )

The raycast to terrain for editing works fine still though.

Any insight or help in solving this would be very helpful.

Sounds like you have a bug.

“goes wrong” won’t be a useful observation on its own.

Think more in terms of this series of questions:

  • what you want
  • what you tried
  • what you expected to happen
  • what actually happened, especially any errors you see
  • links to documentation you used to cross-check your work (CRITICAL!!!)

You must find a way to get the information you need in order to reason about what the problem is.

What is often happening in these cases is one of the following:

  • the code you think is executing is not actually executing at all
  • the code is executing far EARLIER or LATER than you think
  • the code is executing far LESS OFTEN than you think
  • the code is executing far MORE OFTEN than you think
  • the code is executing on another GameObject than you think it is
  • you’re getting an error or warning and you haven’t noticed it in the console window

To help gain more insight into your problem, I recommend liberally sprinkling Debug.Log() statements through your code to display information in realtime.

Doing this should help you answer these types of questions:

  • is this code even running? which parts are running? how often does it run? what order does it run in?
  • what are the values of the variables involved? Are they initialized? Are the values reasonable?
  • are you meeting ALL the requirements to receive callbacks such as triggers / colliders (review the documentation)
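As a minimal sketch of that kind of instrumentation (the class name here is a placeholder, not anything from the original project):

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Hypothetical example: instrumenting a pointer-handling script with Debug.Log
public class PointerDebug : MonoBehaviour
{
    void OnEnable()
    {
        // Confirms the script is actually active, and when
        Debug.Log($"{name}: OnEnable at frame {Time.frameCount}", this);
    }

    void Update()
    {
        // Confirms this code path runs every frame and shows the live value
        bool overUI = EventSystem.current != null &&
                      EventSystem.current.IsPointerOverGameObject();
        Debug.Log($"{name}: pointer over UI = {overUI}", this);
    }
}
```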

Knowing this information will help you reason about the behavior you are seeing.

You can also supply a second argument to Debug.Log() and when you click the message, it will highlight the object in scene, such as Debug.Log("Problem!",this);

If your problem would benefit from in-scene or in-game visualization, Debug.DrawRay() or Debug.DrawLine() can help you visualize things like rays (used in raycasting) or distances.
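For example, a terrain-editing raycast like the one described above could be visualized with something like this sketch (camera and distance values are assumptions):

```csharp
using UnityEngine;

// Hypothetical example: visualizing a screen-to-terrain raycast in the Scene view
public class RayVisualizer : MonoBehaviour
{
    void Update()
    {
        Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);

        if (Physics.Raycast(ray, out RaycastHit hit, 1000f))
        {
            // Green line up to the hit point, visible for one frame
            Debug.DrawLine(ray.origin, hit.point, Color.green);
        }
        else
        {
            // Red ray when nothing was hit
            Debug.DrawRay(ray.origin, ray.direction * 1000f, Color.red);
        }
    }
}
```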

You can also call Debug.Break() to pause the Editor when certain interesting pieces of code run, and then study the scene manually, looking for all the parts, where they are, what scripts are on them, etc.

You can also call GameObject.CreatePrimitive() to emplace debug-marker-ish objects in the scene at runtime.
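A small helper along those lines might look like this (the class and method names are placeholders):

```csharp
using UnityEngine;

// Hypothetical helper: drop a small sphere wherever something interesting happened
public static class DebugMarker
{
    public static void Place(Vector3 position, float scale = 0.25f)
    {
        var marker = GameObject.CreatePrimitive(PrimitiveType.Sphere);
        marker.transform.position = position;
        marker.transform.localScale = Vector3.one * scale;

        // Remove the collider so the marker doesn't interfere with raycasts
        Object.Destroy(marker.GetComponent<Collider>());
    }
}
```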

You could also just display various important quantities in UI Text elements to watch them change as you play the game.

If you are running on a mobile device you can also view the console output. Google for how on your particular mobile target, such as this answer for iOS: https://discussions.unity.com/t/700551 or this answer for Android: https://discussions.unity.com/t/699654

Another useful approach is to temporarily strip out everything besides what is necessary to prove your issue. This can simplify and isolate compounding effects of other items in your scene or prefab.

Here’s an example of putting in a laser-focused Debug.Log() and how that can save you a TON of time wallowing around speculating what might be going wrong:

https://discussions.unity.com/t/839300/3

When in doubt, print it out!™

Note: the print() function is an alias for Debug.Log() provided by the MonoBehaviour class.

Yes, I understand that “goes wrong” is not the best way to describe a fault or bug. But I do not know what the actual cause or bug is. Only what is visible on screen.

As such, I made a short video to demonstrate the matter (with temporary on-screen icons).

I have TOO many Debug.Log() calls all over for testing and verification while developing (they will be removed when no longer needed), as I know it is the easiest way to find an issue. I am only a hobbyist, so no doubt I could do it better or am missing obvious stuff on occasion.

Currently no errors are coming up in the console.

An added bit of info.
Sometimes in the build, when I load a level (which is also a scene reload) while the controller has focus, it works… right up until I re-centre the mouse and controller, or touch the mouse. Then it is broken again.

When you move with the controller, are you properly telling the StandaloneInputModule about the new mouse position as a result of the controller move? You have to simulate it all the way up and down the stack; otherwise it still thinks the mouse is where it was before.

This is an example of overriding the StandaloneInputModule and jamming in your own inputs.

using UnityEngine;
using UnityEngine.EventSystems;

// @kurtdekker - example of enhancing the input module with alternate mouse input

public class TestInputModule : StandaloneInputModule
{
    // this bangs on the input module
    public void ClickAt(float x, float y)
    {
        bool b, bb;

        Input.simulateMouseWithTouches = true;
        var pointerData = GetTouchPointerEventData(new Touch()
            {
                position = new Vector2(x, y),
            }, out b, out bb);

        ProcessTouchPress(pointerData, true, true);
    }

    // hook this into the button so you can confirm it gets clicked
    public void ButtonCallback()
    {
        Debug.Log( "Clicked!");
    }

    void Update()
    {
        // press C to click center of screen; if a button is there it will fire.
        if(Input.GetKeyDown(KeyCode.C))
        {
            ClickAt(Screen.width / 2, Screen.height / 2);
        }
    }
}

I haven’t done a lot with it beyond verifying the above worked as expected.

@Kurt-Dekker Thank you for the info.
I am using the new Input System. As such it has touch as part of its inputs.

My PC has a touch screen, so I tested it out (it has not been turned on in a while, as I had no need for it).
The touch worked like the mouse and registered as a mouse. The controller is still not working as intended.
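For what it's worth, with the new Input System the usual pattern (used, for example, by the VirtualMouseInput component that the common virtual-mouse tutorials are built around) is to push the virtual cursor's position into the device state and warp the hardware cursor so the two never diverge. A hedged sketch, assuming a `virtualMouse` device has already been created elsewhere:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.InputSystem.LowLevel;

// Hypothetical sketch: keep a virtual (gamepad-driven) mouse and the
// hardware mouse in sync. Assumes `virtualMouse` is the Mouse device
// created elsewhere (e.g. by a VirtualMouseInput component).
public class CursorSync : MonoBehaviour
{
    Mouse virtualMouse; // assigned elsewhere; placeholder here

    // Call this when the controller moves the cursor to newPos (screen pixels)
    void OnVirtualCursorMoved(Vector2 newPos)
    {
        // Push the new position into the virtual device's input state...
        InputState.Change(virtualMouse.position, newPos);

        // ...and move the OS cursor so the hardware mouse agrees,
        // instead of snapping back when the real mouse is touched
        Mouse.current?.WarpCursorPosition(newPos);
    }
}
```

If the two positions are never reconciled like this, touching the real mouse will snap the pointer back to wherever the OS last saw it, which matches the symptom described in the build.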

In my project, I have a scene that was for testing my base setup. It is still there and uses the same scripts. If I export that scene to a build, it works?!?

So I then tried changing script execution orders, turning off scripts, and even laying out the elements in the scene hierarchy to match the test scene as closely as possible (yes, probably pointless).

So I am scratching my head as to why one works and the other does not… even though all the scripts are the same.