Wait for Target FPS increases while changing color value of a UI Image

I’ve encountered some strange behaviour with WaitForTargetFPS in Unity projects built for visionOS. Building an empty scene and attaching the Unity Profiler shows the application running at a stable 90 FPS (with WaitForTargetFPS taking about 11 ms).

After adding a single Canvas with an Image component, the behaviour differs a little. In the macOS simulator, WaitForTargetFPS instantly jumps up to 16 ms (60 FPS), but on device it stays at 11 ms. I’ve ended up with a test case showing a Canvas and an Image with a script attached that changes the colour every frame, and this causes WaitForTargetFPS to jump up to 33 ms (30 FPS) in both the simulator and on device.
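For reference, the colour-changing script in the test case is along these lines (a minimal sketch; the component name and the specific hue animation are my assumptions, not taken from the attached project):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Minimal repro sketch: animates the colour of a UI Image every frame.
// Attaching this to an Image in an otherwise empty scene should be enough
// to reproduce the WaitForTargetFPS increase described above.
[RequireComponent(typeof(Image))]
public class ImageColorAnimator : MonoBehaviour
{
    Image _image;

    void Awake()
    {
        _image = GetComponent<Image>();
    }

    void Update()
    {
        // Cycle the hue over time so the colour changes on every frame.
        float hue = Mathf.Repeat(Time.time * 0.25f, 1f);
        _image.color = Color.HSVToRGB(hue, 1f, 1f);
    }
}
```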

This was all done with Unity 2022.3.24f1, Apple visionOS XR Plugin 1.1.4, PolySpatial 1.1.4, PolySpatial visionOS 1.1.4, and PolySpatial XR 1.1.4.

Here is a WeTransfer link to a zip file of the Unity project that shows what we tested on device, as well as a recording of the application running and the accompanying profiling data: WeTransfer - Send Large Files & Share Photos Online - Up to 2GB Free


We’ve also experienced performance issues with UI (roughly halving FPS), though tracking down exactly what’s causing it isn’t something we can get to quickly.

Would be keen to know whether any issues with the current UI implementation have been acknowledged, and whether a fix is in the works.

While I’m not sure what the issue here is, note that the profiler is only profiling the simulation side of things and not the rendering side.

That said, looking at profiling locally, I can tell you that PolySpatial is not doing anything within the context of that wait, and I’m not sure why it’s doing that. I’m going to ask around to folks that might know more to see what may be up.

WaitForTargetFPS is a bit misnamed here – that function is waiting for the OS to tell us when to render the next frame; it’s not Unity deciding what the target FPS is. We’ve seen situations where the OS delays significantly in asking us for the next frame because the system renderer is busy/overloaded. Material property changes seem to be a culprit; those are extremely expensive with RealityKit.

I’m not so sure it’s material property changes that are causing the issue. I made a custom shader via Shader Graph with just a Color property, and swapping between setting that via Material.SetColor() and setting Image.color every frame shows a significant difference:


The left half is setting Image.color and the right half is using Material.SetColor()
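The two halves of the comparison boil down to something like this (a sketch; the shader property name `_Color` and the driver component are my assumptions):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch of the A/B test: toggle between driving the colour through
// Image.color (which regenerates the UI vertex data) and Material.SetColor()
// (which only updates a material property on the custom Shader Graph material).
public class ColorTestDriver : MonoBehaviour
{
    public Image image;              // Image using the custom Shader Graph material
    public bool useMaterialSetColor; // true for the right half of the comparison

    static readonly int ColorId = Shader.PropertyToID("_Color");

    void Update()
    {
        Color c = Color.HSVToRGB(Mathf.Repeat(Time.time * 0.25f, 1f), 1f, 1f);
        if (useMaterialSetColor)
            image.material.SetColor(ColorId, c); // material property path
        else
            image.color = c;                     // vertex-colour path, rebuilds mesh data
    }
}
```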

Yeah, this particular case is due to mesh issues. First, because of the way that Image/RawImage work, we were creating new RealityKit MeshResource objects for each color change, rather than updating the contents of existing MeshResources. Second, we were creating new ShapeResources (used for raycasting) even if only the color changed. We have a fix in the pipeline that addresses both those issues and thus fixes the FPS for color animations (at least). I’m not sure when it will be released, but hopefully soon.

However, there is another issue that we’ve seen related to material properties (specifically shader graph properties). For instance, in a scene with thousands of meshes using a shader graph material, setting even a single property on those materials will cause a big slowdown. That’s something we’ve reported to Apple, and we’re investigating workarounds such as support for static batching, in order to keep mesh counts down. However, that FPS issue is specific to rendering; it wouldn’t be visible in the Unity profiler, which only shows the update FPS. At least at the moment, the only way we know to get the render FPS (in the simulator, e.g.) is to use Xcode Instruments (specifically RealityKit Trace).
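Until a fix lands, one general mitigation (not from this thread, just a sketch) is to avoid redundant per-frame property sets, so the bridge only sees a material update when the value actually changes:

```csharp
using UnityEngine;

// Sketch: only push a material property when the value actually changes,
// so RealityKit isn't asked to update the material on every frame.
public class CachedColorSetter : MonoBehaviour
{
    public Material material; // the shared Shader Graph material

    static readonly int ColorId = Shader.PropertyToID("_Color");
    Color _lastColor = new Color(float.NaN, 0f, 0f, 0f); // sentinel: never matches

    public void SetColor(Color c)
    {
        if (c == _lastColor)
            return;            // skip redundant material updates
        _lastColor = c;
        material.SetColor(ColorId, c);
    }
}
```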
