I am working with a small design firm in Los Angeles on a public art installation built with the Unity engine. We plan to use several displays and several computers synced over a gigabit network to create a stitched orthographic display. It is absolutely essential that all instances of the application stay in sync so that there are no discrepancies between displays.
I have now moved on to testing particle systems across network instances, and I'm finding slightly different results. I have been creating a NetworkView for each component (Transform, Particle Emitter, Particle Renderer, Particle Animator), but I still notice slight variation in the placement and movement of individual particles.
Has anyone had previous experience with this level of accuracy in particle rendering?
Did you make sure to use the 100 ms delay technique from the networking example?
Otherwise the variation might just be trivial network latency.
Generally, though, physics simulations aren't stable across varying CPUs / CPU manufacturers; they are only stable on the GPU.
Well, I am running the send rate at 60 updates per second and testing on a single machine only thus far, so there is no variation in hardware. I create a server instance which Network.Instantiates the objects, and open two client instances to receive the current state of everything. I'm also using an unreliable NetworkView to continually send the current state of everything.
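For reference, the setup described above can be sketched like this against Unity's legacy networking API. `ServerSpawner` and `particlePrefab` are placeholder names (the prefab would be assigned in the Inspector, with its NetworkView set to Unreliable state synchronization):

```csharp
using UnityEngine;

// Sketch of the server-side setup: raise the send rate to 60 Hz and
// spawn the observed object on the server and all connected clients.
public class ServerSpawner : MonoBehaviour
{
    public GameObject particlePrefab; // assumption: assigned in the Inspector

    void Start()
    {
        Network.sendRate = 60f; // 60 state updates per second, as mentioned above

        if (Network.isServer)
        {
            // Network.Instantiate creates the object on every connected peer.
            Network.Instantiate(particlePrefab, Vector3.zero, Quaternion.identity, 0);
        }
    }
}
```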
The variation I see is not caused by latency; the particles are simply not in the same place or moving the same way. The particle emitter is in the same place and moves the same way, but where and when the individual particles themselves appear does not exactly sync.
Is this level of control even possible?
The issue you are seeing is normal.
Particles are basically "visual effects"; they are not fully deterministic even if the setup, for example, is all the same.
The particles themselves are not synced through the network at all (at 70+ bytes per particle per update that would be deadly: 1,000 particles at 60 updates per second is already 70 × 1,000 × 60 = 4.2 MB/s).
What you will likely end up implementing is something like a "simple particle" system that does not use the full physics simulation, but a simpler one along the lines of Newton's laws or similar.
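A minimal sketch of that idea in plain C#: every machine advances the same seeded simulation with basic Newtonian integration, so (assuming identical binaries and floating-point behavior on each machine) all instances compute identical trajectories without shipping per-particle state over the network. `SimpleParticle` and `Demo` are hypothetical names for illustration:

```csharp
using System;
using System.Numerics;

public struct SimpleParticle
{
    public Vector3 Position;
    public Vector3 Velocity;

    // Advance one fixed timestep under constant gravity (Newton's second
    // law with a constant force). A fixed dt keeps the integration
    // identical on every instance, step for step.
    public void Step(float dt)
    {
        Vector3 gravity = new Vector3(0f, -9.81f, 0f);
        Velocity += gravity * dt;
        Position += Velocity * dt;
    }
}

public static class Demo
{
    public static Vector3 Simulate(int seed, int steps)
    {
        // Same seed => same initial velocity => same trajectory everywhere,
        // so only the seed and emit time need to cross the network.
        var rng = new Random(seed);
        var p = new SimpleParticle
        {
            Position = Vector3.Zero,
            Velocity = new Vector3((float)rng.NextDouble(), 5f, 0f),
        };
        for (int i = 0; i < steps; i++) p.Step(1f / 60f);
        return p.Position;
    }
}
```

Two runs with the same seed and step count produce bit-identical positions, which is the whole point: you synchronize the inputs, not the particles.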
The question is why you require them to be synced; there may be a better way.
As I explained initially, we are stitching many displays together and will be using many computers to drive them. We are using an orthographic projection so that there is no distortion when the monitors are stitched together.
70 bytes per particle per update is nothing when we have a gigabit-per-second internal network to transfer data. Do not worry about the hardware or network constraints. What I need to know is: is it technically possible to synchronize individual particles within Unity? Otherwise I will need to write a method for doing so in C#.
You can synchronize them.
But you have to write your own MonoBehaviour that serializes into the NetworkView's BitStream itself and reads back the information it needs to forward.
You would then drop this script on the game object with the particle system and drag the script into the NetworkView as its observed component, set to delta-compressed (or unreliable) state synchronization.
The question, though, is how many particles you have, because you still have to serialize every particle's transform, color, and potentially a few other attributes (depending on what you use), and you might run into the situation where you exceed the allowed packet size.
Also, I'm not fully sure about the impact on the physical simulation of the particles in this case, but it shouldn't take you much time to write a basic transform syncer to see the impact.
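Something along these lines, as a rough sketch against the legacy ParticleEmitter and NetworkView APIs (color is omitted for brevity, and nothing here caps the particle count, so mind the packet size limit mentioned above):

```csharp
using UnityEngine;

// Attach to the object that has the ParticleEmitter, and assign this
// component as the NetworkView's observed target.
public class ParticleSync : MonoBehaviour
{
    void OnSerializeNetworkView(BitStream stream, NetworkMessageInfo info)
    {
        if (stream.isWriting)
        {
            // Server: write out every particle's state.
            Particle[] particles = GetComponent<ParticleEmitter>().particles;
            int count = particles.Length;
            stream.Serialize(ref count);
            for (int i = 0; i < count; i++)
            {
                Vector3 pos = particles[i].position;
                Vector3 vel = particles[i].velocity;
                float energy = particles[i].energy;
                stream.Serialize(ref pos);
                stream.Serialize(ref vel);
                stream.Serialize(ref energy);
            }
        }
        else
        {
            // Client: rebuild the particle array from the stream.
            int count = 0;
            stream.Serialize(ref count);
            Particle[] particles = new Particle[count];
            for (int i = 0; i < count; i++)
            {
                Vector3 pos = Vector3.zero;
                Vector3 vel = Vector3.zero;
                float energy = 0f;
                stream.Serialize(ref pos);
                stream.Serialize(ref vel);
                stream.Serialize(ref energy);
                particles[i].position = pos;
                particles[i].velocity = vel;
                particles[i].energy = energy;
            }
            GetComponent<ParticleEmitter>().particles = particles;
        }
    }
}
```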
But with your setup I understand why it's naturally so important that you have them in sync.
It's my job at this point to push the system to its brink in order to evaluate what we will be able to do for the final art installation. I will pursue the serialization script tomorrow.