ShaderGraph working in Unity PolySpatial but not in Xcode build

Hi,

I have made this very simple ShaderGraph patch. It works in PolySpatial with the AVP, but when I build the Xcode project, the app crashes with a ShaderGraph MaterialX error.

  • Is this because of an incompatibility between this ShaderGraph and the Vision Pro?
  • Do incompatible shaders work at all with PolySpatial playback?
  • Or is there an issue with Xcode and its project settings in this case?

I don’t understand what you mean by “works on PolySpatial with AVP/crashes on built Xcode project.” Do you mean that it works with Play to Device? At any rate, what message is given when it crashes?

I don’t see anything wrong with the shader graph. Generally, incompatible shaders (such as those that use unsupported nodes) will work partially; they shouldn’t cause crashes. If you like, you can submit it in a bug report (and let us know the incident number: IN-#####) and we can take a look.

Thanks for the reply. Yes, it runs as expected with Play to Device on the AVP. It crashes when running the Xcode build. Please see the attached screenshots.
Thank you.


Thanks; that gives us more information, and I can see some places where this could conceivably be happening, but I still can’t tell where exactly without a repro case. The first thing I would suggest is to try reimporting (right click → Reimport) the shader graph and then rebuilding, but if that doesn’t fix the issue, it would be great if you could submit the project as a bug report.

Thank you again.

  • Reimporting the shader did not help.
  • This is a remote-control app project for a desktop app, so it would be a hassle for you to set up both and check.
  • I imported the same shader into a very basic XR project template, and it worked. So there must be something about the conditions in my project, perhaps an object initialization process that Xcode is sensitive to. So there is hope.
  • I’m still curious about these cases where things run well with PolySpatial Play to Device but not in an Xcode build. It is a great loss of time: development goes quickly with Play to Device, but then debugging again in Xcode is a huge hassle.

Best regards.

The way that Play to Device works is to transfer over the network the internal commands that we generate/process to synchronize the Unity scene graph with the RealityKit one. Because we use the exact same interface, it should generally work exactly the same in a build as through Play to Device. The main difference is that the simulation for Play to Device runs in the Unity editor’s play mode, and the Unity editor has access to more features/resources than the Unity runtime (such as the original asset files before processing). So, my best guess in this case is that there’s some file related to the shaders that isn’t being included in the Xcode build, but is available in the editor.

Crucially, the MaterialX files that we include are based on the shaders referenced in the built scenes. If you’re using something along the lines of Shader.Find (that is, getting the shader programmatically), then perhaps we’re not discovering the shader in order to add it to the build. You might try adding an invisible GameObject that references the shader graph in a Material.
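To illustrate the difference, here is a minimal sketch (the class name, serialized field, and the shader path "Shader Graphs/MyGraph" are all illustrative, not from your project). The idea is that a material assigned in the Inspector gives the build pipeline a hard asset reference to follow, whereas a string-based lookup does not:

```csharp
using UnityEngine;

// Attach this to an invisible GameObject (e.g. with its renderer disabled)
// in a built scene so the shader graph is discovered at build time.
public class ShaderGraphKeepAlive : MonoBehaviour
{
    // Assigned in the Inspector: this serialized reference is what lets
    // Unity trace the shader graph and include its MaterialX file in the build.
    [SerializeField] private Material materialUsingShaderGraph;

    void Start()
    {
        // By contrast, a purely programmatic lookup like the line below
        // leaves no asset reference for the build process to discover,
        // so the shader may be missing from the Xcode build:
        // Shader shader = Shader.Find("Shader Graphs/MyGraph");
    }
}
```

Any material that visibly references the shader graph in a built scene should have the same effect; the script above just makes the reference explicit.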

Thank you. I will report when I find the reason.
