I need to have a function execute at regular intervals, in this case every 50 msec. In my update loop I check whether 50 msec have elapsed since the last function call ended, and if so, call the function again. Since the function is pretty simple (a few lines of code) and it’s running at over 600 fps, I would think this should work. But it doesn’t.
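Roughly, the check looks like this (a simplified sketch, not my exact code; the names are placeholders):

```csharp
using UnityEngine;

public class DataRecorder : MonoBehaviour
{
    const float interval = 0.05f;   // 50 msec
    float lastCallEnded;

    void Update()
    {
        // Have 50 msec elapsed since the last call ended?
        if (Time.time - lastCallEnded >= interval)
        {
            RecordDataPoint();           // a few lines of work
            lastCallEnded = Time.time;   // timed from the *end* of the call
        }
    }

    void RecordDataPoint() { /* sample the simulation state */ }
}
```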
What happens is my function executes at uneven intervals, between about 52 and 56 msec. I could live with it not being exactly 50 msec, but the big problem is that it’s so irregular. Variations of 4-5 msec each time the function executes accumulate over time, so after a while the timing gets pretty far off. Which means I can’t just round the numbers and “fake” it, since the data I’m generating need to match what’s going on in the scene.
Is there a better way to force a function to execute at regular intervals? Ideally, I’d like to do an interrupt-driven function, but there is no such thing in Unity. Is there? Any suggestions?
FWIW, this is a simulation of a physics experiment. Since students will be taking measurements with the data it generates, the data need to be accurate or the students will get bad results. It’s also important that the data generated match what students see happening with the simulated apparatus on the screen. I’m NOT using Unity’s physics, so there’s none of that overhead to slow things down or interfere with what my code is doing.
There’s no way to get anything to run at a precise time; the best you can get is the resolution of the frame rate, which will of course vary. So you need to test whether elapsedTime > interval (we’ve passed the point of data capture), or elapsedTime + Time.deltaTime > interval (we’ll be past the point of data capture by the next frame); then calculate the value to record as it should be at interval (i.e. work out what the value was / will be at that exact time) and record that. Then update interval to the next recording time, being careful not to carry any delta past / ahead of interval into the updated value.
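In code, the idea is something like this (a sketch; ValueAt and Record stand in for however you calculate and store a sample):

```csharp
using UnityEngine;

public class IntervalSampler : MonoBehaviour
{
    const float interval = 0.05f;  // 50 ms between data points
    float nextCapture = 0.05f;     // absolute time of the next recording

    void Update()
    {
        // We've reached or passed the capture point (a while loop, in case
        // one long frame skips over more than one interval).
        while (Time.time >= nextCapture)
        {
            // Calculate the value as it was exactly at nextCapture,
            // rather than sampling "now", which is up to a frame late.
            Record(nextCapture, ValueAt(nextCapture));

            // Step by a fixed amount from the previous capture point, so
            // this frame's overshoot never leaks into the schedule.
            nextCapture += interval;
        }
    }

    float ValueAt(float t) { return 0f; /* closed-form state at time t */ }
    void Record(float t, float v) { /* store the data point */ }
}
```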
But InvokeRepeating looks like what I want: a function that’s set up to be called on a timed-interrupt-style cycle. We’ll see.
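Basic usage, for reference (RecordDataPoint is a placeholder for my actual routine):

```csharp
using UnityEngine;

public class RepeatingRecorder : MonoBehaviour
{
    void Start()
    {
        // Call RecordDataPoint every 0.05 s, starting 0.05 s from now.
        InvokeRepeating("RecordDataPoint", 0.05f, 0.05f);
    }

    void RecordDataPoint()
    {
        // sample and store the simulation state
    }
}
```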
I still don’t understand why what I was doing wasn’t working. My frame time (< 2 ms/frame, i.e. well over 500 fps) is FAR finer than it needs to be for the resolution I want (50 ms). I can see no reason why it wouldn’t execute on time to within ±1 ms.
How are you measuring the frame rate? If it’s with the stats panel, I think that may be somewhat misleading (I can’t provide a link, but I seem to remember someone mentioning that the ms-per-frame and FPS values displayed in the stats panel are related only to rendering, and not to how long it takes to complete a single update cycle).
Also, note that the precision of Time.time will decrease the longer the simulation runs. However, you’d most likely have to let the simulation run for some time before you started to see inaccuracies related to precision.
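For a sense of scale: Time.time is a float with a 24-bit mantissa, so after about nine hours (32768 s) adjacent representable values are roughly 4 ms apart. A quick sketch of what that means in practice:

```csharp
using UnityEngine;

public class PrecisionDemo : MonoBehaviour
{
    void Start()
    {
        float t = 32768f;               // ~9.1 hours of Time.time
        float u = (float)(t + 0.001f);  // try to advance by 1 ms
        Debug.Log(u == t);              // True: 1 ms is below float resolution here
    }
}
```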
As laurie said though, there’s most likely going to be a limit to how precise your results will be. First, I’d recommend taking AkilaeTribe’s suggestion and structuring your code so that even if each update isn’t exactly in sync, the overall simulation stays more or less in sync over time (that is, be sure to cancel out any accumulated error each update). I’d also recommend using Time.deltaTime and keeping track of the elapsed milliseconds using an integer, as that will give you consistent precision regardless of how long the simulation has been running (assuming that Unity computes the value of Time.deltaTime in a sensible way, that is).
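Something along these lines (a sketch; Record is a placeholder for whatever stores your sample):

```csharp
using UnityEngine;

public class MillisecondClock : MonoBehaviour
{
    const int intervalMs = 50;
    int elapsedMs;          // integer clock: precision never degrades over time
    int nextCaptureMs = 50;
    float carryMs;          // fractional milliseconds carried between frames

    void Update()
    {
        // Convert the frame delta to whole milliseconds, carrying the
        // fraction forward so nothing is lost to truncation.
        float ms = Time.deltaTime * 1000f + carryMs;
        int whole = (int)ms;
        carryMs = ms - whole;
        elapsedMs += whole;

        while (elapsedMs >= nextCaptureMs)
        {
            Record(nextCaptureMs);         // value computed for exactly this time
            nextCaptureMs += intervalMs;   // fixed step: error can't accumulate
        }
    }

    void Record(int tMs) { /* compute and store the value at tMs */ }
}
```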
I guess maybe that frame rate shown in the Stats panel is not so accurate after all. Even with InvokeRepeating, the best I can get is 56 ms resolution, no matter how small I set the repeat rate. It’s much more reliable this way, but still not 100%. Most of the time it’s 56 ms between data points, but not always.
It’s not that the simulation stays in sync, per se. If it’s slightly off, no one will ever know. It’s the data generated and displayed that needs to be exact. I was generating the data based on what was happening in the scene in real time.
Looks like it’s time for plan B. That is, since they’re time-based calculations, I can generate the data ahead of time. I only need 5 sec worth at a time, and generating a whole 5 sec worth on the fly produces no noticeable delay. Then I just plot one data point at a time while “recording” data, and the graph will always be accurate. I’ll put off dealing with the timing issue till I need to interact with the scene during data collection. This particular one is just set it up, let it run, and see what happens.
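Roughly (a sketch; ValueAt and Plot stand in for my actual calculation and graphing code):

```csharp
using System.Collections.Generic;
using UnityEngine;

public class PrecomputedPlot : MonoBehaviour
{
    const float step = 0.05f;   // 50 ms between data points
    const float window = 5f;    // generate 5 s worth at a time
    readonly List<Vector2> points = new List<Vector2>();
    int nextIndex;

    void Start()
    {
        // Generate the whole window up front. The values are pure
        // time-based calculations, so this is effectively instant.
        int count = Mathf.RoundToInt(window / step);
        for (int i = 1; i <= count; i++)
        {
            float t = i * step;
            points.Add(new Vector2(t, ValueAt(t)));
        }
    }

    void Update()
    {
        // While "recording", reveal each precomputed point as scene time
        // passes it; the plotted data stay exact no matter when frames land.
        while (nextIndex < points.Count && Time.time >= points[nextIndex].x)
            Plot(points[nextIndex++]);
    }

    float ValueAt(float t) { return 0f; /* exact physics value at time t */ }
    void Plot(Vector2 p)   { /* add p to the on-screen graph */ }
}
```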