Blur + Distortion effect in Apple Vision MR

Hello, I’d like to create this effect in Apple Vision.
It’s a project to demonstrate visual impairments in MR.
The idea is to have a spot that follows the user, through which the environment appears blurred and distorted. My problem is that since the scene is in MR and I want to retrieve data from the real environment, I can’t use a Scene Color node in Shader Graph. Do you have any ideas? Do you think I can use the PolySpatial Environment Radiance node with a distortion node like Twirl to achieve this effect?

You could certainly try. There’s some discussion in this thread about using that approach. Because it’s using the environment map (from the device’s cameras), it’s directional only. The other option is the blurred background node available in visionOS 2.0/PolySpatial 2.X (which replicates the “frosted glass” effect used in Apple’s UI).

Hello @kapolka, and thank you for your reply. I’ve already read the thread you mentioned; the PolySpatial blur node does seem interesting, but the problem is that I can’t reduce the effect, and it’s way too blurry.
If I use the shader shared in that thread, could it do the trick? I’d have to multiply it by a sphere mask or a texture to restrict the effect to a specific shape, then possibly by a Twirl node or something else.
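For reference, my understanding is that the Sphere Mask node boils down to HLSL like this (a sketch matching the generated code Unity documents for the node, so it could also live in a Custom Function node):

```hlsl
// Sketch of the Sphere Mask falloff (per Shader Graph's documented
// generated code). Coords is the position being masked, and Hardness
// (kept below 1 to avoid dividing by zero) sharpens the edge.
void SphereMask_float(float3 Coords, float3 Center, float Radius, float Hardness, out float Out)
{
    // 1 inside the radius, falling off to 0 outside it.
    Out = 1 - saturate((distance(Coords, Center) - Radius) / (1 - Hardness));
}
```

Multiplying the blur shader’s output by this mask should confine the effect to the spot.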

It’s worth a try! One other possibility to control the degree of blurriness with the blurred background node is to mix it with the unblurred background by making the shader transparent and using a low alpha value.
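To be clear about what that mix does: conceptually, the result is a simple lerp, so the alpha value acts as a blur-intensity slider. A sketch only (the actual blending happens in the platform’s compositor, not in your shader):

```hlsl
// Conceptual sketch: with a transparent material whose color is the
// blurred background, the on-screen result is roughly this blend.
float3 MixBlur(float3 unblurredBackground, float3 blurredBackground, float alpha)
{
    // alpha = 0 -> fully sharp, alpha = 1 -> fully blurred.
    return lerp(unblurredBackground, blurredBackground, alpha);
}
```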

I tried a shader like this one; the problem is that when I lower the alpha value, the blur seems to disappear completely.


You need to disable alpha clipping. On visionOS, enabling alpha clipping means that everything below the threshold is entirely transparent and everything above is entirely opaque.
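In shader terms, alpha clipping amounts to a hard cutoff rather than a blend; a sketch of the effective behavior:

```hlsl
// Alpha clipping is a binary test, not a fade: fragments below the
// threshold are discarded entirely, and everything above it draws
// opaque, which is why a low alpha made the blur vanish outright.
void ApplyAlphaClip(float alpha, float threshold)
{
    clip(alpha - threshold); // discards the fragment when alpha < threshold
}
```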


I’ve tried a shader like this one, but it seems that as soon as the alpha input has a value or texture other than 1, the shader stops compiling.
I tried textures both with and without an alpha channel.

[Screenshots: in the editor / in the simulator]


I was able to get it to work. Note that connecting the output of Sample Texture 2D to Alpha is going to use the red channel; that may or may not be what you want.
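In HLSL terms, the implicit float4-to-float conversion truncates to the first component; a sketch of the difference:

```hlsl
// texSample is the RGBA float4 coming out of Sample Texture 2D.
float AlphaFromSample(float4 texSample)
{
    // Wiring the float4 straight into Alpha implicitly truncates to
    // texSample.r (red). Use a Split node, or .a as here, if you want
    // the texture's actual alpha channel.
    return texSample.a;
}
```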



That was it indeed, thank you! My shader had the same properties but was set to Lit, so I guess that’s why it didn’t work… To compensate for the overly strong blur, playing with the alpha works pretty well indeed.
Thanks again for the time spent testing and for the advice; it’s very, very useful!
Now there’s one last step I’d like to take: distorting the view through the spot, twisting the render to simulate distortion (as in the reference image above).
I’m trying to use a warp, but since it operates on UVs, I’m not sure how to combine it with PolySpatial Blur, given that I don’t have access to the camera data.
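For reference, the twirl distortion itself is just a UV remap; my understanding is that the Twirl node’s generated code is equivalent to this (per the Shader Graph documentation):

```hlsl
// Twirl rotates each UV around Center by an angle proportional to its
// distance from Center (matches the generated code Unity documents for
// the Twirl node).
void Twirl_float(float2 UV, float2 Center, float Strength, float2 Offset, out float2 Out)
{
    float2 delta = UV - Center;
    float angle = Strength * length(delta);
    float x = cos(angle) * delta.x - sin(angle) * delta.y;
    float y = sin(angle) * delta.x + cos(angle) * delta.y;
    Out = float2(x + Center.x + Offset.x, y + Center.y + Offset.y);
}
```

The catch is that this produces UVs meant to feed a texture sample, and the blurred background node doesn’t take a UV input.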


The closest I’ve managed to get is this, but it’s not the desired effect. I’m having a bit of trouble figuring out where to put the PolySpatial blur so that what it reflects back from the environment gets distorted.

[screen capture: parsecd_RolJVdor9P]

Unfortunately, there’s no way to control what location you sample in the background blur node (that’s just not something Apple has provided in the node we use). It always returns the color at the location being drawn. I’d suggest requesting the ability to sample blurred values at arbitrary UVs via Apple’s Feedback Assistant.

Thanks for the replies. It’s a shame indeed, but I don’t think I can go any further with this solution, and I suppose the same goes for the PolySpatial Environment Radiance node.

I’m trying to do the same thing in a “VR” scene, in a closed environment… without much success so far (the thread is here).

Thanks again for your help and your time, @kapolka!
