SimulationTickRate and NetworkTickRate in ClientServerTickRate?

My understanding is that:

on the server:
NetworkTickRate: the rate at which the server sends snapshots to clients
example: NetworkTickRate = 60 → one snapshot every 1/60 ≈ 0.016 s
SimulationTickRate: (is this value not used by the server?)

on the client:
NetworkTickRate: (is this value not used by the client?)
SimulationTickRate: ?

Can anyone explain the differences between these?

The NetworkTickRate dictates the frequency of the snapshot updates. It should be lower than or equal to the SimulationTickRate.
The SimulationTickRate dictates the server's fixed update rate, i.e. how frequently the server updates the simulation.

i.e: The server runs the game sim at 40 Hz (SimulationTickRate = 40) and sends data at 10 Hz (NetworkTickRate = 10).

On the client, both NetworkTickRate and SimulationTickRate can be set as well, but it is not necessary. The server communicates the settings when the client joins the session, and they are (MUST BE) identical to the server settings.
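To make the relationship between the two rates concrete, here is a small sketch in plain Python using the 40 Hz / 10 Hz example above. The variable names are illustrative only, not the Netcode API:

```python
# Illustrative sketch only (not Netcode API): how the two rates relate,
# using the 40 Hz / 10 Hz example above.
simulation_tick_rate = 40  # server updates the simulation at 40 Hz
network_tick_rate = 10     # server sends snapshots at 10 Hz

sim_dt = 1.0 / simulation_tick_rate          # fixed simulation step: 0.025 s
snapshot_interval = 1.0 / network_tick_rate  # one snapshot every 0.1 s

# With these settings a snapshot goes out once every 4 simulation ticks.
ticks_per_snapshot = simulation_tick_rate // network_tick_rate
print(sim_dt, snapshot_interval, ticks_per_snapshot)
```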


Thanks for your answer. Could you explain a bit more about this:
“PredictedSimulationSystemGroup runs for both the client and server worlds at a fixed time step, as specified by the SimulationTickRate setting”

I tested this with some code,
but the OnUpdate method runs more times in the client-side world.
[screenshots: test code and console output]

The Unity docs say: "Importantly: Because the client is predicting ahead of the server, all systems in this group will be updated multiple times per simulation frame, every single time the client receives a new snapshot (see NetworkTickRate and SimulationTickRate). This is called “rollback and re-simulation”."

Can you explain the statement “all systems in this group will be updated multiple times per simulation frame, every single time the client receives a new snapshot”?

Sure

Both the client and server simulations are updated using a fixed-time (or fixed rate) loop. The difference is in how this loop runs on the client with respect to the server.

Let’s start from the server. The server updates the SimulationSystemGroup (which contains all the simulation logic) every 1/SimulationTickRate seconds. Also, in builds, it forces the engine refresh rate (and so the process goes to sleep) to match that same rate.
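As a rough illustration, the server's fixed update can be pictured as a classic fixed-timestep accumulator. This is a plain-Python sketch with made-up names, not the actual Netcode loop:

```python
# Hedged sketch of a fixed-rate server update (illustrative, not Netcode code).
FIXED_DT = 1.0 / 60  # 1/SimulationTickRate, here 60 ticks per second

def server_frame(accumulator, frame_dt):
    """Advance one engine frame; return (new accumulator, ticks simulated)."""
    accumulator += frame_dt
    ticks = 0
    while accumulator >= FIXED_DT:
        accumulator -= FIXED_DT
        ticks += 1  # one SimulationSystemGroup update with dt == FIXED_DT
    return accumulator, ticks

# A short frame only accumulates time; a long enough frame runs a full tick.
print(server_frame(0.0, 0.004))  # not enough time accumulated: 0 ticks
print(server_frame(0.0, 0.020))  # past 1/60 s: 1 tick
```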

On the client, the SimulationSystemGroup (and so the PredictedSimulationSystemGroup) runs every frame, because the client updates at a variable frame rate (which in general != SimulationTickRate, either faster or slower).

The PredictedSimulationSystemGroup still needs to run at a fixed time rate somehow, in order to perform the same simulation the server does. But at the same time, we also want the user to have immediate feedback (in the case of a higher frame rate) when applying inputs to the controlled entity.

In order to achieve both, on the client the PredictedSimulationSystemGroup runs “Full” and “Partial” simulations for a given server tick (i.e. Tick 100).

A Full Tick is essentially what the server does: the PredictedSimulationSystemGroup updates all the underlying systems using a delta time equal to 1/SimulationTickRate.

A Partial Tick, as the name implies, means running the underlying systems that are part of the PredictedSimulationSystemGroup with a delta time that is less than 1/SimulationTickRate.
Partial ticks require the predicted ghosts to “restart” simulating from the “beginning” of a given tick (i.e. tick 100), and for each successive partial update, an increasing delta time is used.
Multiple partial ticks can occur until enough real time has passed that the accumulated elapsed time is > the fixed time step.

At that point, one Full Tick is simulated (dt == 1/SimulationTickRate) and the remaining part of the elapsed time is again simulated using a partial tick.

So as an example:

  • Tick 100 (full), dt == 1/SimulationTickRate
  • Tick 101.2 (partial tick, we are 20% into tick 101), dt == 1/SimulationTickRate * 0.2
  • Tick 101.6 (partial tick, we are 60% into tick 101), dt == 1/SimulationTickRate * 0.6
  • Tick 102.1, which runs two updates:
    • Full Tick 101, dt == 1/SimulationTickRate
    • Partial Tick 102.1, dt == 1/SimulationTickRate * 0.1
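That sequence can be reproduced with a small accumulator sketch (plain Python, illustrative names, not Netcode internals). Each partial tick reports the next server tick plus the fraction of it covered so far, and its delta time is that fraction of the fixed step:

```python
# Hedged sketch: how variable client frame times produce full and partial
# prediction ticks. Names are illustrative, not the Netcode API.
def predict(frame_delta_times, simulation_tick_rate, last_full_tick):
    fixed_dt = 1.0 / simulation_tick_rate
    tick = last_full_tick   # last fully simulated tick (e.g. 100)
    elapsed = 0.0           # time accumulated toward the next tick
    events = []
    for dt in frame_delta_times:
        elapsed += dt
        # Enough accumulated time: run one (or more) Full Ticks.
        while elapsed >= fixed_dt:
            elapsed -= fixed_dt
            tick += 1
            events.append(("full", tick, fixed_dt))
        # Leftover time becomes a Partial Tick: the ghost is rolled back to
        # the start of tick + 1 and re-simulated with dt = fraction * fixed_dt.
        if elapsed > 0.0:
            fraction = elapsed / fixed_dt
            events.append(("partial", tick + 1, fraction * fixed_dt))
    return events

# With the fixed step normalized to 1.0 and frame times 0.2, 0.4, 0.5 after a
# full tick 100, this reproduces the example above: partial 101.2, partial
# 101.6, then full tick 101 followed by partial tick 102.1.
for event in predict([0.2, 0.4, 0.5], 1, 100):
    print(event)
```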

I’m sorry to bother you.
I’m trying to understand how OnUpdate runs in my TestClientSystem under PredictedSimulationSystemGroup.

My code for debugging full vs partial ticks:
[screenshot: debug code]

My understanding of full vs partial ticks:
[screenshot: diagram]
I don't understand why the re-simulation process makes OnUpdate run multiple times.

When the client receives a new snapshot and applies it,
OnUpdate in my TestClientSystem under PredictedSimulationSystemGroup will re-simulate within one frame.
Re-simulation runs from (new snapshot tick + 1) → (target tick - 1, aka the last tick saved).
During re-simulation, Netcode under the hood calls my OnUpdate multiple times in one frame,
and in the same frame it also calls OnUpdate one more time to process the current (partial or full) tick.

I debugged this using UnscaledClientTime.UnscaleElapsedTime:
[screenshot: log output]
frame 1
Last snapshot: 164
Partial tick: 168.6 (target tick = 168)

frame 2
Last snapshot: 165 (new snapshot starts re-simulation)
My OnUpdate is called 2 times to process re-simulation (166 → 167),
and 1 more time to process (save) full tick 168.

[screenshot: diagram]
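The frame-2 behaviour described above can be sketched as a simple tick range (plain Python, hypothetical helper name, not the Netcode API): after restoring the state from the snapshot, every tick from (snapshot tick + 1) up to the predicted target tick runs again within the same frame.

```python
# Hedged sketch: which ticks OnUpdate runs for in one client frame after a
# new snapshot arrives. The helper name is ours, not a Netcode API.
def resimulated_ticks(snapshot_tick, target_tick):
    """Ticks re-run after rolling back to snapshot_tick's state."""
    return list(range(snapshot_tick + 1, target_tick + 1))

# Frame 2 from the log above: the snapshot for tick 165 arrives while the
# client is predicting tick 168, so OnUpdate runs three times in that frame.
print(resimulated_ticks(165, 168))  # [166, 167, 168]
```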