I’m using an orthographic camera to determine which x and y world coordinates (in units) my mouse is over. My pixels-per-unit setting is 100, and the camera’s orthographic size is set to Screen.height / (2 * pixelsPerUnit).
To track mouse movement accurately, I need Camera.ScreenToWorldPoint(Input.mousePosition) to return values with precision on the order of 10^-2 (such as 1.23); instead, the function gives me values with only one decimal place (such as 1.2).
Is there a way to increase the accuracy of the function?
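For context, this is roughly what the setup described above looks like (the class and field names here are my own placeholders, not taken from the actual project):

using UnityEngine;

public class OrthoMouseTracker : MonoBehaviour
{
    // 100 pixels per world unit, as stated above.
    public float pixelsPerUnit = 100f;

    private Camera cam;

    void Start()
    {
        cam = GetComponent<Camera>();
        cam.orthographic = true;

        // Orthographic size as described: half the screen height, in world units.
        cam.orthographicSize = Screen.height / (2f * pixelsPerUnit);
    }

    void Update()
    {
        // World-space position under the mouse cursor.
        Vector3 mouseWorld = cam.ScreenToWorldPoint(Input.mousePosition);
        Debug.Log(mouseWorld); // appears to show only one decimal place
    }
}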
How are you reading the mouse position data? For instance, if you’re simply using something like Debug.Log(myVector), the default formatting of a vector only shows a single decimal place. However, passing a format string to ToString() (whose output is then what Debug.Log() or print() displays) can increase the precision of the values shown.
If this is indeed just an issue with how the data is presented to you, then
Debug.Log(myVector.ToString("F5"));
may be an adequate solution.
To be clear, “F0”, “F1”, “F2”, “F3”, “F4”, etc. specify how many decimal places are displayed when passed to ToString().
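As a quick sanity check, something along these lines (the component name is just a placeholder) should show that the underlying floats already carry the precision you need and that only the default display truncates them:

using UnityEngine;

public class MousePrecisionCheck : MonoBehaviour
{
    void Update()
    {
        Vector3 worldPos = Camera.main.ScreenToWorldPoint(Input.mousePosition);

        // Default vector formatting truncates the display, not the data.
        Debug.Log(worldPos);

        // Same vector, shown with five decimal places.
        Debug.Log(worldPos.ToString("F5"));

        // Individual components can also be formatted directly.
        Debug.Log(worldPos.x.ToString("F5") + ", " + worldPos.y.ToString("F5"));
    }
}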