UI input only works on Display 1 in the editor.

I'm working on a project that uses multiple displays.
In the editor, UI input only works on Display 1.
In a built player, input works on the other displays as well.
Building the project after every change just to test multi-display input is impractical during development. How can I handle this?

I found a solution: toggle the GraphicRaycaster on the Display 1 UI canvas on and off depending on the mouse position, so it is only active while the cursor is actually over Display 1. That way it stops swallowing clicks meant for the other displays in the editor.

Below is the function that retrieves the mouse position (Windows only, via user32.dll):

using System.Runtime.InteropServices;

/// <summary>
/// Win32 call that returns the cursor position in screen coordinates
/// spanning all monitors (the virtual desktop).
/// </summary>
[DllImport("user32.dll")]
[return: MarshalAs(UnmanagedType.Bool)]
private static extern bool GetCursorPos(out MousePosition lpMousePosition);

[StructLayout(LayoutKind.Sequential)]
public struct MousePosition
{
    public int x;
    public int y;

    public override string ToString()
    {
        return "[" + x + ", " + y + "]";
    }
}

private MousePosition mPos;
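For reference, here is a minimal sketch of how the toggle itself might look. The component name Display1RaycasterToggle and its fields are my own invention, not from the original post: assign the GraphicRaycaster of the Display 1 canvas in the Inspector, and set the rectangle to wherever the Display 1 Game view sits on your virtual desktop. GetCursorPos reports coordinates across all monitors, so the right values depend entirely on your monitor layout and editor window placement.

using UnityEngine;
using UnityEngine.UI;
using System.Runtime.InteropServices;

public class Display1RaycasterToggle : MonoBehaviour
{
    [DllImport("user32.dll")]
    [return: MarshalAs(UnmanagedType.Bool)]
    private static extern bool GetCursorPos(out MousePosition lpMousePosition);

    [StructLayout(LayoutKind.Sequential)]
    public struct MousePosition
    {
        public int x;
        public int y;
    }

    // GraphicRaycaster on the Display 1 canvas (assign in the Inspector).
    [SerializeField] private GraphicRaycaster display1Raycaster;

    // Virtual-desktop rectangle covered by the Display 1 Game view.
    // Hand-tuned assumption: adjust these to match your own setup.
    [SerializeField] private int left;
    [SerializeField] private int top;
    [SerializeField] private int right = 1920;
    [SerializeField] private int bottom = 1080;

    private MousePosition mPos;

    private void Update()
    {
#if UNITY_EDITOR_WIN
        // The workaround is only needed in the editor; builds already
        // deliver input to the other displays correctly.
        if (!GetCursorPos(out mPos))
            return;

        // Enable the raycaster only while the OS cursor is inside the
        // Display 1 region, so canvases on other displays can receive
        // clicks while the cursor is over them.
        display1Raycaster.enabled =
            mPos.x >= left && mPos.x < right &&
            mPos.y >= top && mPos.y < bottom;
#endif
    }
}

Disabling the component is enough: the EventSystem skips disabled raycasters, so the Display 1 canvas no longer intercepts input while the cursor is over another display.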