Prevent Client from running ahead of server

Is there a way to prevent the client from running ticks ahead of the server? I’ve set MaxPredictAheadMS to 33, but at 200 ms of simulated ping the client still runs ~12 server ticks ahead of the server.

Overall, I am trying to make a MOBA-style game. I want to allow two frames of predicted movement; anything more than that I cannot hide, and I want to place the burden of latency back on the client.

This is how I imagine the scenario playing out:

  • Client has 200 RTT
  • Client moves at Server tick 200
  • Client determines that 200 RTT puts it 12 ticks behind the server
  • Client does not react to the input until tick 210 (giving 33.2 ms of latency forgiveness that can be hidden later)
  • Server receives the client input at server tick 212
  • The server starts executing it immediately, leaving the client desync’d by 2 server ticks
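The tick arithmetic in the scenario above can be sketched as follows. This is a minimal illustration, assuming a 60 Hz simulation tick rate; the function names are hypothetical and are not part of the Netcode API:

```python
import math

TICK_RATE_HZ = 60
TICK_MS = 1000 / TICK_RATE_HZ  # ~16.67 ms per simulation tick

def ticks_behind(rtt_ms: float) -> int:
    """How many full server ticks fit into the round-trip time."""
    return math.ceil(rtt_ms / TICK_MS)

def input_delay_ticks(rtt_ms: float, max_predicted_ticks: int = 2) -> int:
    """Delay local input so that only `max_predicted_ticks` of
    prediction remain to be hidden on the client."""
    return max(0, ticks_behind(rtt_ms) - max_predicted_ticks)

# 200 ms RTT -> 12 ticks behind; delay input by 10 ticks and keep
# 2 predicted ticks (~33 ms) of latency forgiveness to hide.
print(ticks_behind(200))       # 12
print(input_delay_ticks(200))  # 10
```

With these numbers, input issued at tick 200 is not reacted to until tick 210, matching the scenario above.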

However, I find myself struggling tremendously with all the bells and whistles that Netcode adds, such as:

  • Client is actually running ahead of the server
  • Predictive movement recalculates the position of each tick, so the client prediction becomes based on when the server received the command, not when the client was allowed to move
  • Predictive movement (based on slightly desync’d server snapshots) can reach the movement destination a couple of ticks before the client does, which disables the movement and leaves the actual client position short of the destination.

I find myself wanting to disable everything Netcode related except connection logic and ServerTicks structure.

We don’t have anything ready to allow this behaviour out of the box.
The only way would be to make the NetworkTimeSystem think the current RTT is 0 (or 33 ms); in that case only a couple of prediction frames are performed.
But then you need to queue the input sent to the server with this “extra” input delay, so that the embedded tick is in the future, and modify the way the inputs are returned from the input buffer (or you can handle that yourself) by skipping them until the tick matches.
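One way to picture that “extra” input delay is a buffer that stamps each input with a future tick and only releases it when the simulation reaches that tick. This is a hedged sketch, not Netcode code; the class and its methods are hypothetical:

```python
from collections import deque

class DelayedInputBuffer:
    """Queue inputs stamped with a future tick; release each one only
    when the simulation reaches that tick (skip until the tick matches)."""

    def __init__(self, delay_ticks: int):
        self.delay_ticks = delay_ticks
        self.queue = deque()  # (target_tick, input) pairs, in order

    def push(self, current_tick: int, user_input: str) -> int:
        """Stamp the input with a tick in the future and queue it."""
        target_tick = current_tick + self.delay_ticks
        self.queue.append((target_tick, user_input))
        return target_tick

    def pop_for_tick(self, tick: int):
        """Return inputs whose embedded tick matches `tick`;
        anything stamped for an earlier tick is skipped."""
        ready = []
        while self.queue and self.queue[0][0] <= tick:
            target_tick, user_input = self.queue.popleft()
            if target_tick == tick:
                ready.append(user_input)
        return ready

buf = DelayedInputBuffer(delay_ticks=10)
buf.push(200, "move")         # stamped for tick 210
print(buf.pop_for_tick(205))  # [] -- not released yet
print(buf.pop_for_tick(210))  # ['move']
```

The same idea applies whether Netcode’s input buffer is modified to do this, or the queue is handled in game code before the input is written.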

Another alternative is to have all entities interpolated (and not use AutoCommandTarget). In that case there is no prediction locally; the input will still be sent to be executed at the right time on the server. However, the client will experience all the lag (plus the additional interpolation delay of two ticks).
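For the interpolated-only alternative, the latency the player actually perceives can be estimated with some back-of-the-envelope arithmetic, assuming a 60 Hz tick and the two-tick interpolation delay mentioned above:

```python
TICK_MS = 1000 / 60  # ~16.67 ms per simulation tick

def perceived_lag_ms(rtt_ms: float, interp_delay_ticks: int = 2) -> float:
    """Without any local prediction, the player sees roughly the full
    round trip plus the interpolation buffer."""
    return rtt_ms + interp_delay_ticks * TICK_MS

# 200 ms RTT plus two interpolated ticks -> ~233 ms of visible delay.
print(round(perceived_lag_ms(200), 1))  # 233.3
```

That trade-off is the crux: the interpolated approach is simple and always consistent with the server, but the full RTT shows up in input response instead of the ~33 ms the two-tick prediction budget was meant to allow.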