Texture.Apply and material property settings timing

Hi all, I have a use case where I have a mesh with changing vertex properties (e.g. UV layout, num vertices, triangle layout) and a changing texture for each frame of the “animation”.

When I update the texture, it's properly synchronized with the Unity "ghost renderer", i.e. the renderer used for RenderTextures. However, it appears to update between simulation steps for the RealityKit renderer.

My question is: is there a callback we can use to synchronize these material updates with RealityKit's updates from the mesh component? The only other workaround I see is double buffering the dynamic mesh and having two separate materials. Thanks!
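For concreteness, here's a minimal sketch of the double-buffering workaround I mean. This is illustrative Unity code, not PolySpatial API; the class and method names are made up, and it assumes a `MeshFilter`/`MeshRenderer` pair already connected to RealityKit:

```csharp
using UnityEngine;

// Hypothetical sketch: alternate between two Mesh/Material pairs so the
// pair RealityKit may still be uploading is never mutated mid-flight.
public class DoubleBufferedMesh : MonoBehaviour
{
    public MeshFilter meshFilter;
    public MeshRenderer meshRenderer;

    Mesh[] meshes;
    Material[] materials;
    int current;

    void Start()
    {
        meshes = new[] { new Mesh(), new Mesh() };
        materials = new[]
        {
            new Material(meshRenderer.sharedMaterial),
            new Material(meshRenderer.sharedMaterial)
        };
    }

    // Call once per animation frame with the new geometry and texture.
    public void PushFrame(Vector3[] verts, int[] indices, Texture2D tex)
    {
        current = 1 - current;            // flip buffers
        var mesh = meshes[current];
        mesh.Clear();
        mesh.vertices = verts;            // vertices must be set before triangles
        mesh.triangles = indices;
        materials[current].mainTexture = tex;

        meshFilter.sharedMesh = mesh;     // swap mesh and material together
        meshRenderer.sharedMaterial = materials[current];
    }
}
```

The idea is that the mesh and material being written are never the ones currently displayed, so a late mesh upload can't pair with an early texture update.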

Hmm, can you give some more info about the sequence of events/in what frame you’re doing what, and what you’re observing? If I’m understanding right, you’re doing this:

setup:
- create a Mesh, put it on a MeshRenderer that's connected to RealityKit
- create a RenderTexture and attach it to something that's rendered by RealityKit

---- Unity frame start
... modify that same Mesh
... use that mesh in rendering something to the RenderTexture
---- frame end

and at that point you expect the mesh that’s rendered in your RenderTexture and the mesh that’s rendered by RealityKit to be identical?

If so, that should work, but the problem is that RealityKit has no concept of frames. We batch everything per Unity frame, but we have no visibility into RealityKit’s own buffering or ordering. So what could be happening is that while the mesh update is being sent at the correct time, it takes RealityKit an extra frame (though not 100% defined to be an extra frame!) to get that mesh all the way up to its own renderer.

Unfortunately there are no callbacks or any way to get information about this process. I’ll raise it as an issue, though.

That’s not quite right on the use cases:

Imagine we have a flipbook of textures, which match to particular frames of a dynamic mesh (the textures are dynamically loaded from a video file or procedurally generated or whatever).

We want the dynamic mesh to display known content with a given texture we SetPixels on and then Apply. Ideally, we'd reuse the same mesh and same texture for each frame. In practice, it seems that the texture Apply (and perhaps other material property changes) takes effect right away, while the mesh update happens at some point in the future.
setup:
- create a Mesh, put it on a MeshRenderer that's connected to RealityKit

---- Unity frame start
… load data from disk
… modify that same Mesh (verts, uvs%, colors%, index buffer%) (% = only on some frames)
… modify a texture that is on the mesh
… update material properties
---- frame end

What we're seeing is that the updated texture is sometimes (often) displayed on the previous frame's mesh on the RealityKit side, while a RenderTexture pointed at the scene and displayed on a quad (to see how Metal renders the data) stays in sync.
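The per-frame sequence above might look roughly like this in code. Everything here is illustrative, not our actual code, and the class and method names are hypothetical:

```csharp
using UnityEngine;

// Illustrative version of the per-frame sequence above: geometry, texture,
// and material properties are all written in the same Unity frame, but the
// mesh change seems to reach RealityKit later than the texture change.
public class FlipbookFrame : MonoBehaviour
{
    public MeshFilter meshFilter;
    public MeshRenderer meshRenderer;
    public Texture2D texture;   // reused every frame

    public void ApplyFrame(Vector3[] verts, Vector2[] uvs, int[] indices,
                           Color32[] pixels)
    {
        var mesh = meshFilter.sharedMesh;   // same Mesh, updated in place
        mesh.Clear();
        mesh.vertices = verts;
        mesh.uv = uvs;
        mesh.triangles = indices;           // picked up asynchronously by RealityKit

        texture.SetPixels32(pixels);
        texture.Apply();                    // appears to take effect sooner
        meshRenderer.sharedMaterial.mainTexture = texture;
    }
}
```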

PolySpatial updates mesh assets asynchronously if you update them in-place. To force a synchronous update, you must create a new Mesh asset (but you take a huge performance hit, because that sync blocks the ECS update in RealityKit). See Mesh updates not synchronized
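A sketch of that "new Mesh asset each frame" workaround, assuming a `MeshFilter` connected to RealityKit (the class and method names are made up for illustration):

```csharp
using UnityEngine;

// Hypothetical sketch of the synchronous-update workaround: replacing the
// Mesh asset itself (rather than mutating it in place) forces a synchronous
// upload, at the cost of blocking RealityKit's ECS update.
public class NewMeshPerFrame : MonoBehaviour
{
    public MeshFilter meshFilter;
    Mesh previous;

    public void PushFrame(Vector3[] verts, int[] indices)
    {
        var mesh = new Mesh();      // brand-new asset every frame
        mesh.vertices = verts;
        mesh.triangles = indices;

        meshFilter.sharedMesh = mesh;
        if (previous != null)
            Destroy(previous);      // avoid leaking the old Mesh asset
        previous = mesh;
    }
}
```

Destroying the previous Mesh matters here: allocating a new asset every frame without releasing the old one would leak native memory.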

Alex: you mention seeing how Unity is internally applying changes in the linked thread. Where are you seeing that source code? It would be really useful for figuring out workarounds by understanding what's going on under the hood.

Unfortunately I don't have source code access. The C# side can be easily decompiled with ILSpy/AvaloniaILSpy, and I use Instruments with high-frequency sampling to see call stacks on the native side.