I am trying to build deterministic mechanics for a large number of active entities, and then be able to fast-forward the simulation by reducing the time step.
My question is: how stable is determinism at the current stage of DOTS development? I am using the Entities package 0.1.1 (preview).
If you know the answer, you can skip the rest of this post. My current experiments give me unstable results, which I try to explain below.
In my project I use 0.1 s as the base time step for most calculations across systems, other than physics, rendering and I/O.
On top of that I use a period of 10 steps (base time step * 10 intervals = period duration; 0.1 s * 10 = 1 s), where each 1/10 interval has designated systems to update.
Some systems execute on every step of the period; others execute only on interval 2, 4, 7 etc. of the period.
Basically it looks like this:
// stepCount is an integer counter, incremented once per base time step;
// taking the modulo of the float time value instead would drift and truncate.
int periodStep0to9 = stepCount % 10 ;
switch ( periodStep0to9 )
{
    case 0:
        systemA.Update () ;
        break ;
    // ...
    case 9:
        systemZ.Update () ;
        break ;
}
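For context, here is a minimal sketch of how such a step counter could be driven from accumulated frame time; fixedStep, Advance and speedMultiplier are names I made up for the example, not package API. The point is that fast-forwarding can run more fixed steps per frame instead of shrinking the step itself:

```csharp
// Sketch only: convert variable frame time into whole fixed simulation steps.
const float fixedStep = 0.1f ;     // base time step
float accumulator = 0f ;
int stepCount = 0 ;                // integer step counter, source of periodStep0to9

void Advance ( float frameDeltaTime, float speedMultiplier )
{
    accumulator += frameDeltaTime * speedMultiplier ;  // 5x, 10x fast-forward
    while ( accumulator >= fixedStep )
    {
        accumulator -= fixedStep ;
        int periodStep0to9 = stepCount % 10 ;
        // update the systems designated for this interval here
        stepCount ++ ;
    }
}
```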
Now I record inputs with their designated time steps and replay them back. I need to make sure I get the same expected behaviour on replay.
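My replay log entries look roughly like the struct below; RecordedInput and its field names are simplified placeholders, shown only to illustrate the idea:

```csharp
// Sketch only: replayed inputs are keyed by simulation step, not by wall time,
// so replay does not depend on frame rate or fast-forward speed.
public struct RecordedInput
{
    public int stepIndex ;    // fixed simulation step on which the input applies
    public int inputCode ;    // encoded player action
}
// On replay, before updating systems for step N, apply every RecordedInput
// with stepIndex == N.
```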
However, I have real trouble making the whole deterministic mechanics stable. I have tried multiple approaches, but I think I am either missing something, or determinism is still a work in progress.
In my case the application appears to work when systems run at the base time step of 0.1 s, which is fine. But when fast-forwarding 5 to 10x, determinism seems to fall apart: systems start lagging behind and desyncing, at least that is my impression. In other words, the results differ between a 0.1 s step and a 0.01 s step.
I expect systems to execute in strict order, even if the frame rate drops or a job's duration inside a system exceeds one frame, which can be expected when accelerating the simulation. But I observe that some systems skip their group update and execute much later, rather than in the next expected frame.
Note that I use an EntityCommandBuffer to create entities, some of the jobs involved are multi-threaded, and the RenderMesh is also changed, which I think may be relevant.
I also tried calling
job.Complete () ;
but I see no significant change, since I expect another related job to run in the next frame anyway.
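In case it matters, my EntityCommandBuffer path looks roughly like the sketch below (Entities 0.1-era API as I understand it; SpawnSystem and the job body are placeholders). Registering the producing job's handle with the barrier system should make playback happen at a fixed point in the frame:

```csharp
using Unity.Entities ;
using Unity.Jobs ;

// Sketch only: entities are created through a command buffer and played back
// by the end-of-simulation barrier, after the producing job completes.
public class SpawnSystem : JobComponentSystem
{
    EndSimulationEntityCommandBufferSystem barrier ;

    protected override void OnCreate ()
    {
        barrier = World.GetOrCreateSystem<EndSimulationEntityCommandBufferSystem> () ;
    }

    protected override JobHandle OnUpdate ( JobHandle inputDeps )
    {
        EntityCommandBuffer.Concurrent ecb = barrier.CreateCommandBuffer ().ToConcurrent () ;
        // ... schedule the multi-threaded job that records entity creation via ecb ...
        JobHandle handle = inputDeps ; // handle returned by the scheduled job
        barrier.AddJobHandleForProducer ( handle ) ; // playback waits for the job
        return handle ;
    }
}
```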
I have read multiple topics, which led me to using a fixed time step and to manual creation and execution of systems within a group, and of course to applying [DisableAutoCreation] to the systems.
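My manual setup looks roughly like this (SystemA and the bootstrap comments are placeholders; Entities 0.1-era API):

```csharp
using Unity.Entities ;

// Sketch only: the system is excluded from automatic player-loop creation
// and ticked by hand, so the update order is fixed entirely by my own code.
[DisableAutoCreation]
public class SystemA : ComponentSystem
{
    protected override void OnUpdate () { /* ... */ }
}

// In a bootstrap script:
// World world = World.Active ;
// SystemA systemA = world.GetOrCreateSystem<SystemA> () ;
// then, once per base time step, in strict order:
// systemA.Update () ;
```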
In some trials I have also seen systems executed twice within a frame, which is briefly mentioned in the links below.
Other reading:
https://gametorrahod.com/world-system-groups-update-order-and-the-player-loop/
https://docs.unity3d.com/Packages/com.unity.entities@0.0/manual/system_update_order.html
There is also another discussion from March 2019, mentioning changes regarding updates.
So, repeating my initial question: how stable is the current determinism? Can I rely on it?
Or am I still making a mistake somewhere?