I’ve done many RealityKit traces in Xcode Instruments, and to my surprise, using Time in our default shader doubles our CPU usage. This is a very critical bug for us, as the implication is that most VFX and shaders are unusable.
My theory is that it’s because Unity sends in time as a material parameter instead of using a normal time node. So I suggest that perhaps PolySpatial should include a different Time node for Shader Graph that maps to RealityKit’s Time node.
I haven’t double-checked whether this problem occurs only when using time in the vertex shader.
That’s certainly troubling. Part of the problem is undoubtedly that RealityKit’s ShaderGraphMaterial has no way to set global parameters, so we have to set parameters like Time on each material instance individually. If you have a lot of object instances, that could easily be a drain on performance. If you have time, it might be worth submitting feedback to Apple requesting support for globals in ShaderGraphMaterial (as we have done).
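To illustrate why per-instance updates get expensive: without global parameters, a per-frame time update has to walk every material on every entity. This is a hypothetical sketch (the function and entity list are illustrative, not PolySpatial’s actual internals), using RealityKit’s real `ShaderGraphMaterial.setParameter(name:value:)` API:

```swift
import RealityKit

// Sketch only: with no global-parameter support in ShaderGraphMaterial,
// updating "Time" means touching every material instance, every frame.
func updateTime(_ seconds: Float, on entities: [Entity]) {
    for entity in entities {
        guard var model = entity.components[ModelComponent.self] else { continue }
        model.materials = model.materials.map { material in
            guard var shaderGraph = material as? ShaderGraphMaterial else { return material }
            // O(material instances) of work per frame — the suspected CPU cost.
            try? shaderGraph.setParameter(name: "Time", value: .float(seconds))
            return shaderGraph
        }
        entity.components.set(model)
    }
}
```

With a true global (as requested in the feedback to Apple), this loop would collapse to a single assignment regardless of how many instances exist.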
This is correct. We want to ensure that the value is the same as Time.time (whether scaled, paused, etc.).
That’s certainly something we can add easily. I’ll add it to the list for a future version.