Render Queue Control

Hi,

I am looking for a way to control the render queue for certain materials, so that I can ensure that Material A is rendered before Material B. What I tried:

  • Changing the sorting priority / overriding the render queue value for the material => no effect in the simulator (in the Unity editor it worked fine)
  • Applying multiple materials to one mesh renderer => only the first material is rendered (in the editor the materials are rendered on top of each other)
  • Adding Render Objects in the URP settings => no effect in the simulator (but it works in the editor)
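For context, the render-queue override in the first bullet can be set in the material inspector or from script. A minimal sketch of the scripted variant (this is the standard Unity `Material.renderQueue` API; as noted above, the override shows up in the editor but not in the simulator):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Minimal sketch: force a renderer's material into a specific queue slot.
// Works in the Unity editor, but (as described above) the override is
// not reflected in the visionOS simulator under PolySpatial.
public class RenderQueueOverride : MonoBehaviour
{
    // Render after the default opaque geometry queue (2000).
    [SerializeField] int queue = (int)RenderQueue.Geometry + 10;

    void Start()
    {
        var rend = GetComponent<Renderer>();
        if (rend != null)
            rend.material.renderQueue = queue; // note: instantiates the material
    }
}
```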

The only control I found is assigning a shader to either the Opaque pass or the Transparent pass in a Shader Graph shader. Unfortunately, I need finer control than that.

All in all, it seems that I can only control rendering aspects inside Shader Graph. General URP settings seem to have no effect on rendering in the simulator. For instance, I set the render scale in the URP pipeline settings to 0.1: it looked terrible in the editor, while in the simulator everything looked fine.

Is this intended or a temporary state?

The only way to control render order at present (in MR mode) is to use the PolySpatial Sorting Group component, which allows you to create groups of renderers that render in a fixed order with respect to each other.
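A minimal sketch of setting up such a group from script (the component can also be configured in the inspector; the member names below are assumptions based on the PolySpatial package and may differ between versions, so check the API of your installed release):

```csharp
using UnityEngine;
using Unity.PolySpatial;

// Sketch: render 'first' before 'second' via a PolySpatial Sorting Group.
// Member names (renderItems, RenderItem.renderer, RenderItem.order) are
// assumptions based on the PolySpatial package; verify against your version.
public class SortingGroupSetup : MonoBehaviour
{
    [SerializeField] Renderer first;
    [SerializeField] Renderer second;

    void Start()
    {
        var group = gameObject.AddComponent<PolySpatialSortingGroup>();
        group.renderItems.Add(new PolySpatialSortingGroup.RenderItem
            { renderer = first, order = 0 });
        group.renderItems.Add(new PolySpatialSortingGroup.RenderItem
            { renderer = second, order = 1 });
    }
}
```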

General URP settings aren’t transferred in PolySpatial, and typically can’t be unless there’s a specific equivalent in RealityKit. That’s how PolySpatial works: it maps materials/components/entities in Unity to their equivalents in RealityKit, such as mapping PolySpatial Sorting Group to ModelSortGroupComponent. It doesn’t actually use URP to render on visionOS (and, in MR mode, cannot: low-level Metal access is unavailable); it simply uses the URP materials as source information to create equivalents. That mapping isn’t complete (and probably never will be, given that we’re limited by the constraints of RealityKit), but we are continuing to improve it over time.


Thank you for the fast response.
