I recently started building a game that includes a main menu with several buttons. The buttons are TextMeshPro buttons.
I used the old input system and everything worked fine, but now that I am adding a scene with moving characters, I decided to switch to the new Input System.
After trying everything and following several YouTube videos step by step, the new Input System seemingly didn't want to work in my project. After testing a bit more, I figured out:
- it will NOT work when I run the project in the "Game" tab;
- it will work when "Maximize On Play" is selected, in the "Simulator" tab, and when I "Build And Run" the entire project.
I have attached a video for demonstration.
I have an EventSystem in my scene, with an Input System UI Input Module (script) that has the DefaultInputActions action asset assigned. This was added automatically when I upgraded from the old input system.
This issue makes it very inconvenient to test things, since I always need to run in maximized mode (which takes longer) or build and run the entire scene.
The only workaround I have found so far is reverting the UI scenes to the old system and using the new system in the other scenes, i.e. setting Active Input Handling to both input systems.
When I installed the new Input System package I also restarted my PC, just in case devices weren't being recognized properly.
Input system version: 1.0.2
Unity version: 2019.4.18f1 Personal
Windows 10 64bit
I believe there may be a bug in the Input System's screen-coordinate calculation, caused by changes in Unity's own API. I ran into similar trouble when trying to get the correct screen size.
My Fix:
public static class ScreenHelper
{
    public static Vector2 GetScreenSize()
    {
#if !UNITY_EDITOR
        // In a build, Screen.width/height report the real resolution.
        return new Vector2(Screen.width, Screen.height);
#else
        // In the Editor, ask the internal GameView for its target resolution
        // via reflection, since Screen.width/height can be unreliable here.
        var asm = typeof(UnityEditor.Editor).Assembly;
        var gameViewType = asm.GetType("UnityEditor.GameView");
        var getSizeMethod = gameViewType.GetMethod(
            "GetSizeOfMainGameView",
            System.Reflection.BindingFlags.Static | System.Reflection.BindingFlags.NonPublic);
        return (Vector2)getSizeMethod.Invoke(null, null);
#endif
    }
}
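For context, here is a minimal usage sketch (my own illustration; the `ScreenSizeProbe` component name is hypothetical) that logs the helper's value next to the raw Screen API, so you can see the discrepancy in the Editor:

```csharp
using UnityEngine;

public class ScreenSizeProbe : MonoBehaviour
{
    void Start()
    {
        // Raw values, which can be wrong in the Editor on multi-display setups.
        Debug.Log($"Screen API: {Screen.width} x {Screen.height}");

        // Value resolved through the GameView reflection helper above.
        Vector2 size = ScreenHelper.GetScreenSize();
        Debug.Log($"ScreenHelper: {size.x} x {size.y}");
    }
}
```

Attach it to any GameObject in the scene and compare the two log lines while the Game view is on a different display than the Editor.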
I meant the screen-coordinate calculation used by the input system. I use this code, when "Maximize On Play" is off and on multi-display setups, to resize the UI canvas and calculate the camera viewport at runtime. It might have nothing to do with the Input System, but blindly using Screen.width and Screen.height returns incorrect screen sizes in the Unity Editor.
The Screen.width and Screen.height wrapper returns the correct value if you only have a single display, which is the case in an actual build. But if you are developing in a multi-display environment, it returns odd values depending on which display the Editor is on and whether the GameView is on a separate display.
You omitted this from your initial post; I had no way of knowing that you have multiple displays.
Unity is very bad with multiple displays; in general it is advised not to use it across multiple displays (but you should submit a bug report if you think there is one!).
Since I have a single display, I will leave it to someone else to validate your claim about strange numbers.
It has been an annoying issue for developers for a long time, and the introduction of Screen.resolution confused even more of them. It is still unclear to many which value is the actual screen size to use in calculations, especially if you are doing your own raycasting for UI in 3D space.
First of all, I would like to thank everyone who decided to help.
After a few days of trying to solve this issue, I created a new project and went step by step, replacing files one by one. I wanted to see what the difference was between a working project (from the Unity tutorials) and mine.
In the end, it turned out to be what I think is a BUG in the Device Simulator!
What I have discovered so far:
When we run the project in the Game tab WITHOUT Maximize On Play:
[Device Simulator floating] - UI NOT clickable
[Device Simulator docked/tabbed] - UI NOT clickable
[Device Simulator closed] - UI is clickable.
Demonstration video 1 attached.
When we run the project in the Game tab WITH Maximize On Play:
[Device Simulator floating] - UI NOT clickable
[Device Simulator docked/tabbed] - UI is clickable (when not maximized it was not!)
[Device Simulator closed] - UI is clickable.
Demonstration video 2 attached.
Best Solution:
Don't have the Device Simulator window open at all.
Switch between Game and Simulator modes with the dropdown menu on the side instead.
As some of you noted above, yes, I do work on a dual-screen setup! I guess most people do nowadays!
I still think this shouldn't be so buggy, but then again, I don't have that much experience either.
Do you think I should submit a bug report from within the Unity Editor?
I have this bug too.
Unfortunately your solution doesn't work for me.
I use the new Input System with touch input in my game, and the mouse does not work in the Game tab, even though my Simulator tab is closed.
All touches work on smartphones, but I need the mouse to work in the Game tab because Unity Recorder only works in the Game tab and not in the Simulator tab.
It's horrible. How can I solve this problem?
I suspect the bug is related to Window > Analysis > Input Debugger > Options > Simulate Touch Input From Mouse Or Pen.
The input issue started when I tried this option. The weird thing is that touch input gets registered on non-Game/Simulator tabs, returning NaN values for the touch position.
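While it doesn't fix the underlying bug, a defensive sketch like the following (my own assumption, not from this thread; the `SafeTouchReader` name is hypothetical) can skip the NaN positions when reading touches through the Input System's EnhancedTouch API:

```csharp
using UnityEngine;
using UnityEngine.InputSystem.EnhancedTouch;
using Touch = UnityEngine.InputSystem.EnhancedTouch.Touch;

public class SafeTouchReader : MonoBehaviour
{
    // EnhancedTouch must be enabled explicitly before Touch.activeTouches works.
    void OnEnable() => EnhancedTouchSupport.Enable();
    void OnDisable() => EnhancedTouchSupport.Disable();

    void Update()
    {
        foreach (Touch touch in Touch.activeTouches)
        {
            Vector2 pos = touch.screenPosition;

            // Ignore the NaN positions reported when simulated touch input
            // arrives from non-Game/Simulator tabs.
            if (float.IsNaN(pos.x) || float.IsNaN(pos.y))
                continue;

            Debug.Log($"Touch {touch.touchId} at {pos}");
        }
    }
}
```

This only filters out the bad samples on the consuming side; real and simulated touches with valid coordinates still come through normally.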
Bumping this thread because it is 2025 and there still appears to be a bug with the Simulator window when using the new Input System.
I am on a MacBook Pro, and trackpad scrolling will not fire when the Simulator window is in focus. It works perfectly fine in the Game view, but the Simulator won't pick up the scroll input.