If the input is not in the command buffer, it may have been discarded.
The indicator that is worrying me is the command age == 0. It should be close to -2560 (-10 * 256, since it is in fixed point), but instead it is 0. It may just be a bug in the visualization, though.
I didn’t know about the ticket, so we will raise the priority for this.
It seems like it’s actually the client that hasn’t produced a command for the tick. Testing with the tick rate dropped to 28 and the frame rate dropped to 25 fps for less than half a second causes the issues as well, which makes it look like the setup is very sensitive to frame rate drops.
If the client is running slow, it will definitely not produce the input for tick X. The delta time increment can cause it, for example, to jump from tick 100 straight to tick 102, meaning that the currently processed input is for tick 102, not tick 101. For tick 101 (on the client as well), the input is still the old value it was able to produce for tick 100.
So the client’s input buffer would look like … [100][102] … The same goes for the server (if it didn’t discard anything): it will also see [100][102].
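To illustrate how a single slow frame produces that [100][102] gap, here is a minimal sketch (not the actual codebase; the fixed-timestep accumulator, 30 Hz tick rate, and all names are assumptions) of a client that samples input once per rendered frame and stamps it with the current tick:

```python
from fractions import Fraction

TICK_DT = Fraction(1, 30)  # assumed 30 Hz simulation tick rate

def produce_inputs(frame_dts, start_tick=99):
    """Simulate a client that advances the tick with a fixed-timestep
    accumulator, sampling one input per rendered frame, stamped with
    whatever tick it is currently at."""
    buffer, tick, acc = {}, start_tick, Fraction(0)
    for dt in frame_dts:
        acc += dt
        while acc >= TICK_DT:
            acc -= TICK_DT
            tick += 1
        # One input per frame: a frame long enough to cover two ticks
        # only ever writes an input for the *last* tick it reached.
        buffer[tick] = f"input@{tick}"
    return buffer

# A normal frame (1/30 s) steps one tick; a slow frame (2/30 s) steps
# two ticks at once, so no input is ever written for the skipped tick.
buf = produce_inputs([Fraction(1, 30), Fraction(2, 30)])
```

Here `buf` ends up with entries for ticks 100 and 102 but nothing for 101, matching the buffer shape above.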
When we invoke GetInputAtTick(X), either the input for the matching tick is returned, or the latest earlier input (in this case the one for tick 100) is used to simulate tick 101. Both server and client do the same.
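A minimal sketch of that lookup rule (the function and buffer shape are assumptions for illustration, not the real GetInputAtTick implementation): on a miss, fall back to the newest input at or before the requested tick.

```python
def get_input_at_tick(buffer, tick):
    """Return the input stored for `tick`; if that tick is missing,
    fall back to the latest input from an earlier tick (or None if
    there is no earlier input at all)."""
    if tick in buffer:
        return buffer[tick]
    earlier = [t for t in buffer if t < tick]
    return buffer[max(earlier)] if earlier else None

buffer = {100: "input@100", 102: "input@102"}
# Tick 101 is missing, so both client and server reuse the tick-100
# input to simulate tick 101 -- deterministic as long as both sides
# apply the same fallback.
fallback = get_input_at_tick(buffer, 101)
```

Since both sides run the same rule over the same [100][102] buffer, the reused input is identical and no misprediction should result from the gap itself.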
Yeah, you’re on point. It should not really be an issue, since it’s handled the same on both sides. This was a red herring for the “misprediction” issue I was looking for here.