Retrieving the most accurate frame-rate-independent event time for mouse and keyboard input

As the title says, I’m trying to work out how to get a frame-rate-independent time for mouse/keyboard input. I realise there’s no such thing as an ‘exact’ time, but it would be great to get times that aren’t bound to the frame rate.

I have a test running at the moment which handles InputSystem.onEvent and looks at these two fields (rough sketch of the test below):
eventPtr.time (i.e. the time property on the InputEventPtr passed to the callback)
InputState.currentTime
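Roughly, the test looks like this — a minimal sketch for reference (the class name and the extra Time.realtimeSinceStartup column are just illustrative):

```csharp
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.InputSystem.LowLevel;

// Logs both timestamps for every state event so they can be compared
// against each other and against the frame time.
public class InputTimestampLogger : MonoBehaviour
{
    void OnEnable()  => InputSystem.onEvent += OnInputEvent;
    void OnDisable() => InputSystem.onEvent -= OnInputEvent;

    static void OnInputEvent(InputEventPtr eventPtr, InputDevice device)
    {
        // Only state/delta-state events carry new input data.
        if (!eventPtr.IsA<StateEvent>() && !eventPtr.IsA<DeltaStateEvent>())
            return;

        Debug.Log($"device={device.name} " +
                  $"eventPtr.time={eventPtr.time:F6} " +
                  $"InputState.currentTime={InputState.currentTime:F6} " +
                  $"realtimeSinceStartup={Time.realtimeSinceStartup:F6}");
    }
}
```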

These are slightly different. eventPtr.time seems to be locked to the frame rate (presumably because events are processed during the update). InputState.currentTime gives a slightly different value and seems to have higher resolution? However, I’m not entirely sure when that time is captured, or whether it’s actually a more accurate representation of when the input was triggered.

I haven’t come across anything else that might provide an accurate event time, but I might be missing something?

Use case: I’m developing cognitive psychology tasks that measure response time, so accurate timing data is really important. Ideally, I’m hoping for a solution that will work across desktops, WebGL, and Android, though WebGL is currently most important. Performance also isn’t guaranteed: some machines will likely run at significantly less than 60fps.

Thanks in advance for any assistance or clarification!

Hi @MGretton. I don’t know that we have a good solution for this kind of task, unfortunately. For example, the timestamp on each input event (eventPtr.time) doesn’t represent the time the event happened on the physical device, nor the time the device driver processed it, nor the time the app’s message pump received it. On Windows, it’s taken from the performance counter at the point where low-level Unity code gets around to processing the event, and other OSes have their own quirks in how they timestamp events. Even if we were to change our implementation, though, most OSes (certainly Windows) don’t expose physical-device or driver timestamps, so we’d always be slightly removed from ‘exact’ time, as you say.

I know that lots of developers have success measuring this kind of latency using high-speed cameras and counting frames between something appearing on screen and the user hitting an input, but I’m afraid that’s all I have to offer in this case. Sorry.

Thanks, @andrew_oc, for the advice. Yep, there are always going to be timing issues just from monitor and input/driver latencies (I have been measuring latency with a photodiode/high-speed-camera setup, as you suggested, and fortunately LCD monitors seem to have become significantly lower-latency over the last few years). In the past we’ve used dedicated input hardware and CRT displays to minimise timing issues, but COVID has increased the need for online tasks that we have a lot less control over.

The kinds of ‘gamified’ tasks we use Unity for don’t need millisecond-level accuracy, but finding ways to minimise latency and variability between receiving input and recording it is still important. I’ll keep looking around for solutions, but it might just be something we have to take into account when analysing the results.
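One thing that might help on the recording side is to take the response timestamp from the input event itself rather than from code running in Update, so the measurement isn’t quantised to the polling frame. A rough sketch (assuming event times are on the same timeline as Time.realtimeSinceStartup, which the Input System documentation states for InputEvent.time, and that Time.realtimeSinceStartupAsDouble is available; the class and method names are just illustrative):

```csharp
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.InputSystem.LowLevel;

// Illustrative response-time recorder: stimulus onset is timestamped with
// Time.realtimeSinceStartupAsDouble and subtracted from the raw event
// timestamp, so the result isn't rounded to the frame in which the input
// happens to be polled.
public class ResponseTimeRecorder : MonoBehaviour
{
    double stimulusOnsetTime = -1;

    // Call this at the moment the stimulus is presented.
    public void MarkStimulusOnset()
    {
        stimulusOnsetTime = Time.realtimeSinceStartupAsDouble;
    }

    void OnEnable()  => InputSystem.onEvent += OnInputEvent;
    void OnDisable() => InputSystem.onEvent -= OnInputEvent;

    void OnInputEvent(InputEventPtr eventPtr, InputDevice device)
    {
        if (stimulusOnsetTime < 0)
            return;
        if (!eventPtr.IsA<StateEvent>() && !eventPtr.IsA<DeltaStateEvent>())
            return;

        // eventPtr.time is documented to share the Time.realtimeSinceStartup
        // timeline, so the subtraction gives a response time in seconds.
        double responseTime = eventPtr.time - stimulusOnsetTime;
        Debug.Log($"Response time: {responseTime * 1000.0:F1} ms");
        stimulusOnsetTime = -1;
    }
}
```

This still inherits all of the upstream latency discussed above, of course; it only removes the extra frame-level quantisation at the point of recording.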