I have added support for environment probes to my AR application, but objects start to look “wet” with this type of probe. After some research, I found out that this could be related to the absence of specular convolution for those probes. What should be done to fix this problem? Is specular convolution supported on mobile devices at all?
Images with an example of oversaturated reflections are in the attachment.
The difference in how the materials look is immense between a regular reflection probe and the environment reflections from ARFoundation on iOS. We ran into this when trying to run the same material library on both Android and iOS for our products.
Here is a comparison between the same materials with and without environment texturing:
Specular convolution is a process applied at texture import time by the editor to generate the mipmaps for a cubemap texture. This process is only available in the editor, not at runtime.
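For reference, this import-time convolution is the “Convolution Type: Specular” setting on cubemap textures in the editor. Here is a minimal sketch of enabling it from an AssetPostprocessor, assuming the standard UnityEditor TextureImporter API (the "EnvProbe" naming filter is a hypothetical convention; check the enum and property names against your Unity version):

```csharp
// Editor-only: place this script in an Editor/ folder.
using UnityEditor;

// Forces specular convolution on imported cubemap textures whose
// asset paths contain "EnvProbe" (hypothetical naming convention).
public class CubemapConvolutionPostprocessor : AssetPostprocessor
{
    void OnPreprocessTexture()
    {
        if (!assetPath.Contains("EnvProbe"))
            return;

        var importer = (TextureImporter)assetImporter;
        var settings = new TextureImporterSettings();
        importer.ReadTextureSettings(settings);

        // Import the texture as a cubemap and pre-filter its mip chain
        // for specular reflections (the "specular convolution" step).
        settings.textureShape = TextureImporterShape.TextureCube;
        settings.cubemapConvolution = TextureImporterCubemapConvolution.Specular;
        settings.mipmapEnabled = true;

        importer.SetTextureSettings(settings);
    }
}
```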
In the case of environment probes, ARFoundation gets cubemap textures with pre-computed mipmaps from the ARKit or ARCore implementation. The Unity renderer simply uses these cubemap textures in the reflection probe at runtime.
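To make the pass-through concrete, here is a rough sketch of how those platform-generated cubemaps surface on the Unity side, assuming the ARFoundation 4.x AREnvironmentProbeManager API (environmentProbesChanged, AREnvironmentProbe.environmentTexture); verify the member names against the package version you have installed:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Logs the cubemap that ARFoundation hands over for each environment
// probe, e.g. to inspect the mip chain that ARKit/ARCore pre-computed.
public class EnvironmentProbeInspector : MonoBehaviour
{
    [SerializeField] AREnvironmentProbeManager probeManager;

    void OnEnable() => probeManager.environmentProbesChanged += OnProbesChanged;
    void OnDisable() => probeManager.environmentProbesChanged -= OnProbesChanged;

    void OnProbesChanged(AREnvironmentProbesChangedEvent evt)
    {
        foreach (var probe in evt.updated)
        {
            var cubemap = probe.environmentTexture;
            if (cubemap == null)
                continue;

            // The texture arrives with whatever mips the platform generated;
            // Unity does not run specular convolution on it at runtime.
            Debug.Log($"Probe {probe.trackableId}: {cubemap.width}px, " +
                      $"{cubemap.mipmapCount} mips");
        }
    }
}
```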
So basically there is nothing to fix on the ARFoundation side, and all of the problems come from ARKit itself? That seems a little strange, because reflections in native ARKit apps look fine, without the oversaturated glossiness…
Yet in Apple’s documentation the examples look like rough surfaces work as expected. Why the difference? @todds_unity, could you shed some light on that?
How do you get around these limitations and still have a manageable way for artists to work with roughness in AR apps?
Hi everyone, as @todds_unity mentioned, AR Foundation only delivers the environment cubemap textures from ARKit to the Unity app. How your content looks at runtime then depends on how you use these textures.
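One pragmatic content-side workaround is to cap material smoothness on your AR objects so rough surfaces stop sampling the sharp low mips and looking “wet”. A minimal sketch against the built-in Standard shader (the _Glossiness property and the 0.5 cap are assumptions to tune for your own material library):

```csharp
using UnityEngine;

// Caps the Standard shader's smoothness on this object and its children
// so un-convolved environment cubemaps don't make rough materials look wet.
public class SmoothnessCap : MonoBehaviour
{
    static readonly int Glossiness = Shader.PropertyToID("_Glossiness");

    [SerializeField, Range(0f, 1f)] float maxSmoothness = 0.5f;

    void Start()
    {
        foreach (var renderer in GetComponentsInChildren<Renderer>())
        {
            // .materials instantiates per-renderer copies, so shared
            // assets in the material library are left untouched.
            foreach (var material in renderer.materials)
            {
                if (!material.HasProperty(Glossiness))
                    continue;

                float current = material.GetFloat(Glossiness);
                material.SetFloat(Glossiness, Mathf.Min(current, maxSmoothness));
            }
        }
    }
}
```

Note that this is only a mitigation: it does not add the missing specular convolution, it just keeps artist-authored roughness values from amplifying the problem.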