We have found in our project that materials with transparency do not work when we build and run on a Vision Pro device. For now they simply don't render, so, for example, it is currently impossible to use particles with transparency; instead we have to use custom meshes on the particles to get the proper shape. Any idea when this will be fixed?
For me, when I tried to use a shader with alpha clipping on a mesh (to make some of the objects disappear), it clipped correctly in the simulator, but on device the same shader rendered as if alpha clipping weren't enabled at all.
+1 to this.
Our transparent shaders render in the Simulator without issue, but don't render at all on the device (which is on visionOS beta 5).
If it helps, we are using Built-in Pipeline Shader Graph materials. Transparency not working is a showstopper for us, and it used to work when the device was on beta 3.
I have yet to test this using 0.6.3 and Unity 2022.3.13f1 on device; it still works in the Simulator. I'll edit this post once I have results from the device.
Thanks for the heads-up. I will do some device testing and see if I can find out what’s going wrong.
So far, I am unable to replicate this on device using PolySpatial 0.6.3 and visionOS beta 6. I tried a built-in Unlit Opaque shader graph with alpha clipping and a Transparent shader graph without alpha clipping; both gave the expected results. A Transparent shader graph with alpha clipping gives the same result as an Opaque shader graph with alpha clipping, but that's expected (RealityKit supports alpha blending or clipping, but not both; clipping overrides blending).
If you can submit a repro case and let me know the incident number (IN-#####), that would help me debug by showing me the exact setup you’re using.
I managed to test on device, and we get the same behaviour with 0.6.3. It appears to be related to our shaders, but I'm not discounting that it might be something else in our general setup.
I have submitted a repro project showing our exact setup attached to IN-62881. Hopefully this will give a clue as to where things might be going wrong!
I could reproduce it, and I think I found the issue. There seems to be some difference between how the simulator and the device clip content; perhaps a difference in the size of the application volume, which is something we've run into before. At any rate, if I move the ScreenSetup transform to the middle of the volume (Z = 0), the rendered view shows up as expected. Pretty cool effect!
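For anyone hitting the same symptom, the workaround above amounts to recentring the content root on the volume's depth axis. A minimal sketch of that adjustment (in Unity you would assign the result to the ScreenSetup object's `transform.localPosition`; the plain tuple here just stands in for a `Vector3` so the sketch runs standalone):

```python
def center_depth(local_position):
    """Return the position moved to the volume's depth centre (z = 0),
    keeping the x/y placement. On device, content offset toward the
    volume's near/far bounds appeared to get clipped."""
    x, y, _z = local_position
    return (x, y, 0.0)

# Content sitting 0.4 m toward the near plane gets pulled back to z = 0.
print(center_depth((0.1, 0.25, -0.4)))  # (0.1, 0.25, 0.0)
```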
Thanks so much for looking into it!
For clarity, was the UI layer also rendering correctly when you moved the ScreenSetup transform back to the centre? It uses alpha blending, so I'm concerned it might not be working correctly on device compared to the simulator.
Thanks again for your help
I wasn’t looking for it, so I didn’t notice. I can try again, however.
If it helps, you can play the VPRender scene in the Editor and enable the DebugCamera transform. This should show what we expect to see on the device in the Game view.
It looks like some of the content is being cut off, but it’s not an alpha issue, anyway. If I set the transform on ScreenSetup to Y=0 (so that it doesn’t get cut off at the bottom), I see the three buttons at the bottom and four dots at the top. If I double the size of the volume camera in all dimensions, I see everything (including the rounded corners).
Glad to know this isn’t the shader!
Thanks again for having a look; I don't have a device at the moment, so I'm having to rely on the Simulator output or remote debugging like this!
Am I right in thinking that this isn't the intended behaviour for the Volume camera? Is this being worked on under the internal bug that I got a notification of?
That’s correct; we would expect the volume sizes/clipping to be the same on simulator and device, and I’m not sure if this is an issue on our end or Apple’s. I’ll investigate further and report the issue to Apple if I can duplicate it with a non-Unity project.
After investigating further and consulting with Apple, we determined that the volume size issue comes from the Display → Appearance → Window Zoom setting. If you change it from the default “Large,” you get a volume that doesn’t match the size requested in PolySpatial (e.g., on “Small,” you get a volume of about 75% the size). We’re still trying to determine whether this is expected behavior on visionOS, and thus whether we should be scaling our content to match the volume size.
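To make the mismatch concrete, here is a small sketch of the observed scaling. The 75% figure is the measured value at the "Small" setting reported above, not a documented constant, and the function name is illustrative, not a PolySpatial API:

```python
# Observed (not documented) volume scale at the "Small" Window Zoom
# setting; the default "Large" matches the requested size.
SMALL_ZOOM_SCALE = 0.75

def rendered_volume_size(requested_meters, zoom_scale):
    """Approximate size of the volume visionOS actually grants for a
    requested PolySpatial volume size, under the observed Window Zoom
    scaling."""
    return requested_meters * zoom_scale

# Requesting a 1.0 m volume with Window Zoom set to "Small"
# yields a volume roughly 0.75 m across.
print(rendered_volume_size(1.0, SMALL_ZOOM_SCALE))  # 0.75
```

Until it's clear whether this is intended visionOS behaviour, this is only a rule of thumb for judging how much of your content may fall outside the granted volume.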