We created a sample scene with 4 balls, each of which has an initial velocity.
Shortly after hitting play they collide, and the physics breaks as illustrated in the video below:
(video attachment: 5uzq3f)
We also noticed that since the latest releases, physics-based objects do not move fully smoothly: in some frames they jump. We tried the following:
setting the simulation rate to the same framerate as the headset (72 fps)
leaving the simulation rate at its default (60 fps) and interpolating predicted physics objects
in the Editor with network latency disabled (or enabled), with the same result
in an Android build with a Linux server, with the same result
We did not create any variants for higher precision (default Transform3d serialization, default PhysicsVelocity serialization).
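For reference, a higher-precision variant could look roughly like the sketch below. This assumes Unity Netcode for Entities 1.x and its `GhostComponentVariation`/`GhostField` attributes; the variant name and quantization values are illustrative, not something from our project.

```csharp
// Hypothetical sketch: a ghost serialization variant for LocalTransform with
// finer quantization than the default. Assumes Netcode for Entities 1.x;
// "HighPrecisionTransform" and the quantization factors are made up for illustration.
using Unity.Mathematics;
using Unity.NetCode;
using Unity.Transforms;

[GhostComponentVariation(typeof(LocalTransform), "HighPrecisionTransform")]
public struct HighPrecisionTransform
{
    // A larger quantization factor spends more bits per value, i.e. higher precision.
    [GhostField(Quantization = 10000, Smoothing = SmoothingAction.InterpolateAndExtrapolate)]
    public float3 Position;

    [GhostField(Quantization = 10000, Smoothing = SmoothingAction.InterpolateAndExtrapolate)]
    public quaternion Rotation;

    [GhostField(Quantization = 10000)]
    public float Scale;
}
```

The variant would then have to be assigned to the ghosts in question (e.g. via a `DefaultVariantSystemBase` subclass) for it to take effect.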
This even applies to a predicted sphere moving through a room with a simple velocity (similar to the case illustrated above).
If I remember correctly, this was not the case in previous releases with netcode + physics.
We don’t allow that. On the client we force physics to run at the same rate as the rendering rate. The server, however, doesn’t do that (and that is a bug), which also generates other problems (like smoothness and jerkiness).
I will check the project though and see if the problem is the same.
We don’t allow the PredictedFixedStepSimulationSystemGroup to have a tick rate lower than the SimulationTickRate. It can be higher, but only as an integer multiple of it.
Why do you need the physics system to run at 60 Hz (i.e. slower) when the simulation actually runs faster than that?
We are perfectly fine with running at the same rate as the presentation framerate (either 72fps or 90fps).
But even at these rates, physics-moved objects aren’t smooth. Even when we simply set the position of an object directly from a float3 input it sometimes jerks (even when we “fake” the input to just follow a cosine function).
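To make the repro concrete, the “fake” cosine input we mean is roughly the sketch below: the entity’s position is driven by a perfectly continuous function, so any per-frame jerkiness cannot come from the input itself. This assumes Netcode for Entities 1.x; `FakeHandTag` and the system name are made-up placeholders.

```csharp
// Hypothetical repro sketch: drive a predicted entity's position from a cosine
// instead of real controller input. Assumes Netcode for Entities 1.x;
// FakeHandTag and FakeCosineInputSystem are illustrative names.
using Unity.Entities;
using Unity.Mathematics;
using Unity.NetCode;
using Unity.Transforms;

public struct FakeHandTag : IComponentData { }

[UpdateInGroup(typeof(PredictedSimulationSystemGroup))]
public partial struct FakeCosineInputSystem : ISystem
{
    public void OnUpdate(ref SystemState state)
    {
        // A perfectly continuous trajectory: x(t) = cos(t), fixed height.
        var t = (float)SystemAPI.Time.ElapsedTime;
        foreach (var transform in SystemAPI.Query<RefRW<LocalTransform>>()
                                           .WithAll<FakeHandTag>())
        {
            transform.ValueRW.Position = new float3(math.cos(t), 1f, 0f);
        }
    }
}
```

With this in place, any visible stutter has to come from ticking/interpolation rather than from noisy input.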
On the server side we explicitly set the simulation + physics rate in the following way:
public void OnCreate(ref SystemState state)
{
    Debug.Log("Setting ClientServerTick");
    var rate = 72;
    if (state.WorldUnmanaged.IsServer())
    {
        // Ensure the ClientServerTickRate singleton exists before configuring it.
        if (!SystemAPI.HasSingleton<ClientServerTickRate>())
        {
            var e = state.EntityManager.CreateEntity();
            state.EntityManager.AddComponent<ClientServerTickRate>(e);
        }
        var tickRate = SystemAPI.GetSingleton<ClientServerTickRate>();
        tickRate.SimulationTickRate = rate;
        SystemAPI.SetSingleton(tickRate);
    }
    // Executed on both client and server (a warning is raised otherwise).
    state.World.GetExistingSystemManaged<PredictedFixedStepSimulationSystemGroup>()
        .RateManager.Timestep = 1f / rate;
}
(We also set the Timestep in the last line on the client, as otherwise we get a warning.)
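As a side note, if I recall correctly, recent Netcode for Entities versions also expose the physics step as a ratio on ClientServerTickRate, which avoids touching the RateManager directly. A hedged sketch, assuming a Netcode 1.x version where `PredictedFixedStepSimulationTickRatio` is available:

```csharp
// Alternative sketch (assumes a Netcode for Entities 1.x version exposing
// ClientServerTickRate.PredictedFixedStepSimulationTickRatio): configure the
// predicted physics step as an integer multiple of the simulation tick
// instead of setting Timestep by hand.
using Unity.Entities;
using Unity.NetCode;

public partial struct ConfigureTickRateSystem : ISystem
{
    public void OnCreate(ref SystemState state)
    {
        var rate = new ClientServerTickRate();
        rate.ResolveDefaults();
        rate.SimulationTickRate = 72;
        // 1 = physics steps once per simulation tick (72 Hz); 2 would mean 144 Hz.
        rate.PredictedFixedStepSimulationTickRatio = 1;
        var e = state.EntityManager.CreateEntity();
        state.EntityManager.AddComponentData(e, rate);
    }
}
```

This keeps the physics step and the simulation tick in lockstep without client/server code paths diverging.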
Is the jerkiness we currently observe still present when the RateManager is set as in the example above?
Sorry, will be back to you guys asap about this.
The setting here is correct. As for the smoothness, I will check the case you reported, but I presume it is all due to partial ticks and how some things are handled.
But let me check.
Thank you, it may well be that this is due to a lack of understanding on our end. I can also create a new case where we “fake” the input and provide continuous movement of the player’s hand.
We can see that in some frames the player’s hand entity moves jerkily even though the provided input is continuous.
If that would help, let me know and I can prepare a shareable test case with a description.
That is indeed strange. I was pretty sure I had double-checked that, but I can’t reproduce it anymore; the behaviour did not look like a simple collider displacement issue. Sorry for taking up your time with such a simple mistake.