Reproduce legacy Input.GetAxisRaw("Mouse X") and Y

As the title suggests, I'm trying to reproduce the legacy Input.GetAxisRaw("Mouse X") and "Mouse Y" behaviour. The documentation suggests the old mouse axes were calculated by taking the mouse delta and multiplying it by a Sensitivity value (which I can see in the old Input Manager).
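
For reference, here's roughly what I'm doing. It's only a sketch: the 0.1 sensitivity matches the default Mouse X / Mouse Y Sensitivity in my old Input Manager, and the class/field names are just illustrative.

using UnityEngine;
using UnityEngine.InputSystem;

public class MouseAxisEmulation : MonoBehaviour
{
    // Same value as the Sensitivity field on Mouse X / Mouse Y in the old Input Manager.
    [SerializeField] private float sensitivity = 0.1f;

    private Vector2 look;

    private void Update()
    {
        // Per-frame mouse movement in pixels, as reported by the new Input System.
        Vector2 delta = Mouse.current.delta.ReadValue();

        // My attempt at reproducing Input.GetAxisRaw("Mouse X") / ("Mouse Y").
        look = delta * sensitivity;
    }
}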

When I try this with the new Input System, it generally works and gives me an equivalent of the old Mouse X and Mouse Y axes, but it seems significantly less precise. To double-check, I compared the actual values it produces against the old method, and it does generate significantly different values, even when I use the same Sensitivity (from the old Input Manager).
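
The comparison was roughly along these lines (this assumes Active Input Handling is set to "Both" in Player Settings so the legacy API still works, and the sensitivity field mirrors the old Input Manager value):

using UnityEngine;
using UnityEngine.InputSystem;

public class MouseAxisComparison : MonoBehaviour
{
    // Same value as the Mouse X Sensitivity in the old Input Manager.
    [SerializeField] private float sensitivity = 0.1f;

    private void Update()
    {
        // Legacy value (only available with Active Input Handling = "Both").
        float oldX = Input.GetAxisRaw("Mouse X");

        // New Input System equivalent: delta * sensitivity.
        float newX = Mouse.current.delta.ReadValue().x * sensitivity;

        Debug.Log($"old: {oldX:F4}  new: {newX:F4}");
    }
}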

The net result is that mouse input, in this case for an FPS, is pretty subpar compared to the old input system. Because it's less precise, each small movement of the mouse turns the camera further than it should. You can lower the mouse sensitivity to compensate, but at the same effective sensitivity (say, enough to do a 180 in 3 inches of horizontal mouse movement) the new system is still far less precise.

Any idea what I’m doing wrong or why this would be?

Alternatively: how ARE we supposed to implement mouse look under the new Input System? I assumed reading the mouse delta was the intended approach, but if that's not the way it's meant to be done, please let me know what the new method should be!
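
For context, my current mouse-look setup looks roughly like this (the action, field names, and hierarchy are simplified; lookAction is bound to <Mouse>/delta):

using UnityEngine;
using UnityEngine.InputSystem;

public class SimpleMouseLook : MonoBehaviour
{
    // Action bound to <Mouse>/delta in an .inputactions asset.
    [SerializeField] private InputActionReference lookAction;
    // Same Sensitivity value as the old Input Manager's mouse axes.
    [SerializeField] private float sensitivity = 0.1f;

    private float pitch;

    private void OnEnable()  { lookAction.action.Enable(); }
    private void OnDisable() { lookAction.action.Disable(); }

    private void Update()
    {
        Vector2 delta = lookAction.action.ReadValue<Vector2>() * sensitivity;

        // Yaw on this transform, pitch on a child camera (first child here).
        transform.Rotate(0f, delta.x, 0f);
        pitch = Mathf.Clamp(pitch - delta.y, -89f, 89f);
        transform.GetChild(0).localRotation = Quaternion.Euler(pitch, 0f, 0f);
    }
}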

I notice an FPS drop when I do the following

myControls.gameplay.look.performed +=
        context => look = context.ReadValue<Vector2>();

when reading data out of a gamepad. It seems the performed event is called irregularly while the stick is in use, causing jitter in the final control and an FPS drop. I'll post a thread shortly. I think the two cases are related.