How does reflection probe blending work?

Does it happen in a shader? Or does it happen in a script / native code?
Is the blending just a simple interpolation between the two textures… or something more advanced?

My goal is to be able to create a custom shader that uses reflection probes for the whole map baked into one big atlas, so that reflection probes don’t break batching like they do by default. It’s a bit of a wonder to me that there isn’t anything like this out there right now, although it’s very possible I just haven’t found it yet.


Yes-ish. It happens both in shader and in native code, not in script, though the functions to do so are exposed to C#.

For forward rendering, Unity’s Standard Shader has the option to take two reflection probes as inputs and blend between them. If one or both reflection probes are using box projection, this is the only way it’s possible to blend between them.

If neither is using box projection, Unity can linearly blend two cubemap textures into a cubemap render texture, which is the function that’s exposed to C# for people to use:
https://docs.unity3d.com/ScriptReference/ReflectionProbe.BlendCubemap.html
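Conceptually, that blend is just a per-texel linear interpolation between the two source cubemaps. A minimal Python sketch of the idea (illustrative only; the function and data layout here are not Unity’s internals):

```python
def lerp(a, b, t):
    """Linear interpolation: t=0 gives a, t=1 gives b."""
    return a + (b - a) * t

def blend_cubemap_face(face_a, face_b, blend):
    """Blend two same-sized cubemap faces texel by texel.
    Faces are rows of RGB tuples; 'blend' weights face_b."""
    return [[tuple(lerp(ca, cb, blend) for ca, cb in zip(ta, tb))
             for ta, tb in zip(row_a, row_b)]
            for row_a, row_b in zip(face_a, face_b)]
```

The same operation would be repeated for all six faces and all mip levels.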

However, for something like what you’re doing, you’ll have to do all of the blends in-shader. Honestly, I’m not sure what an atlas of cubemaps looks like in a form that can be efficiently sampled. And I guess you’d have to manually figure out what the two closest cubemaps are and somehow store the cubemap indices & blend factor in the vertex data, or otherwise allow the shaders to calculate that on the fly from some kind of structural data.
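Picking the two closest probes and a blend factor could be as simple as a distance test. A hedged Python sketch (purely illustrative; Unity’s own blending is driven by probe volumes and importance, not plain distance):

```python
import math

def two_closest_probes(position, probe_positions):
    """Return indices of the two nearest probes plus a blend factor.
    blend = 0 means fully the nearest probe, 1 fully the second."""
    dists = sorted((math.dist(position, p), i)
                   for i, p in enumerate(probe_positions))
    (d0, i0), (d1, i1) = dists[0], dists[1]
    total = d0 + d1
    blend = d0 / total if total > 0 else 0.0
    return i0, i1, blend
```

In practice you’d bake the result per vertex or per chunk, since doing this per pixel on the fly is exactly the cost you’re trying to avoid.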


You can’t really make a cubemap “atlas”, but you can create cubemap arrays, which work just like texture arrays: you pass the index of the layer you want during sampling. That’s how Unreal does it, BTW. Be aware that, just like texture arrays, all cubemaps in the array need to have the same resolution and texture format.

You totally can, and I totally did. While you can just make an atlas of six-face chunks and decode the index in the shader (don’t forget to pad the edges), the better option is to turn them into octahedron maps, which are neat, square and cheap. Other options are lat-long, dual paraboloid, or even the plain old double sphere map, and they all remain compatible with texture arrays anyway.
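For reference, octahedral mapping folds a direction vector onto a single square, which is what makes it so atlas-friendly. A minimal Python sketch of the standard encode/decode (a shader would do the same math in HLSL):

```python
import math

def _sign(x):
    return 1.0 if x >= 0 else -1.0

def oct_encode(d):
    """Map a normalized direction (x, y, z) to a UV in [0, 1]^2."""
    x, y, z = d
    s = abs(x) + abs(y) + abs(z)
    x, y, z = x / s, y / s, z / s
    if z < 0:  # fold the lower hemisphere over the diagonals
        x, y = (1 - abs(y)) * _sign(x), (1 - abs(x)) * _sign(y)
    return (x * 0.5 + 0.5, y * 0.5 + 0.5)

def oct_decode(u, v):
    """Inverse mapping: UV in [0, 1]^2 back to a normalized direction."""
    x, y = u * 2 - 1, v * 2 - 1
    z = 1 - abs(x) - abs(y)
    if z < 0:  # unfold the lower hemisphere
        x, y = (1 - abs(y)) * _sign(x), (1 - abs(x)) * _sign(y)
    n = math.sqrt(x * x + y * y + z * z)
    return (x / n, y / n, z / n)
```

Note that sampling near the square’s edges still needs either a border of duplicated texels or wrap-aware filtering, since the octahedron’s seams land there.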


Well, that’s technically no longer a cubemap, heh.

Functionally it’s the same, and many games do it that way; it’s still called a cubemap because we sample based on a direction rather than UVs. The storage format shouldn’t matter.

Also, you can still do an atlas of six-face chunks; that’s still a cubemap atlas :stuck_out_tongue:
First reference the chunk, then select the cube face based on the major axis; that’s how it’s done in hardware.

Cubemap lookup functions are simple enough. They can get you a face-relative normalized UV & face ID, which you can then use to scale and offset the normalized UVs to some position on an atlas. But then you have to contend with edge bleeding unless you put in some padding, or manually calculate the filtering at the edges … which is a lot harder. You could maybe do something a little more sane with a Texture2DArray. It doesn’t solve the edge filtering entirely, but it does remove the need for padding, and you can prefilter the edges (“Fixup Edge Seams”).
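The major-axis face selection and atlas scale/offset described above can be sketched like this in Python (illustration only; `atlas_uv` assumes a hypothetical layout of one row of six faces per probe, with no edge padding):

```python
def cubemap_face_uv(d):
    """Map a direction to (face_index, u, v) the way hardware does:
    pick the major axis, then project the two minor components.
    Face order follows the common +X,-X,+Y,-Y,+Z,-Z convention."""
    x, y, z = d
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:
        face, ma, sc, tc = (0, ax, -z, -y) if x > 0 else (1, ax, z, -y)
    elif ay >= az:
        face, ma, sc, tc = (2, ay, x, z) if y > 0 else (3, ay, x, -z)
    else:
        face, ma, sc, tc = (4, az, x, -y) if z > 0 else (5, az, -x, -y)
    u = (sc / ma + 1) * 0.5
    v = (tc / ma + 1) * 0.5
    return face, u, v

def atlas_uv(probe_index, face, u, v, num_probes):
    """Scale/offset a face-relative UV into the assumed 6-wide atlas."""
    return (face + u) / 6.0, (probe_index + v) / num_probes
```

The edge-bleeding caveat above applies directly here: with no padding, bilinear filtering near u or v of 0 or 1 will pull in texels from the neighboring face or probe.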

But yeah, octahedron or dual paraboloid setups are generally better for atlased setups. Octahedron maps + texture arrays make things super easy.

Is blending available on Android? I’m losing blending when switching to Android with either graphics API.
EDIT for googlers: it was just good old Graphics Tiers playing pranks…