Hey there.
I am trying to use the mipmap bias as a smoothness value to blur my reflection render texture.
The problem is that when the camera moves, the mip levels show some kind of moving aliasing/shimmering:
I have also tested this with a realtime reflection probe on the right cube; changing smoothness on the Standard material does not have this issue.
My approach to rendering the reflection is to have a second camera render from the correct position relative to the active camera, render that to a texture, assign the result to the shader, and use the smoothness value in the shader as the mipmap bias.
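Roughly, the sampling side of the shader looks like this; a simplified sketch, where _ReflectionTex, _Smoothness and the 0..7 bias range are just placeholders, not the exact names in my project:

// Sample the reflection render texture with a mip bias driven by smoothness.
sampler2D _ReflectionTex;
float _Smoothness; // 0 = sharp mirror, 1 = fully blurred

float4 SampleReflection(float2 uv)
{
    float bias = _Smoothness * 7.0; // push the sample towards blurrier mips
    return tex2Dbias(_ReflectionTex, float4(uv, 0.0, bias));
}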
Any advice you could give would be much appreciated.
The only way I have managed to make something like this stable enough is to generate the mips manually, using some sort of higher-quality downscale-and-blur shader (rough sketch after this post); that ends up considerably more stable than the automatic mips.
If the reflection camera is orthographic, you could implement some kind of stable fit, as long as the angle doesn’t change. But assuming you want to change the rotation often, it will need a better blur, possibly even a temporal one, though that would introduce the kind of Unreal-style smearing a lot of people don’t like (me included, hehe).
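Something along these lines for the downscale pass; just a rough sketch of the idea (names like _SourceTex and _SourceTexelSize are placeholders), run once per mip level, reading from the previous, larger level:

sampler2D _SourceTex;
float4 _SourceTexelSize; // (1/width, 1/height, width, height) of the source mip

float4 DownsampleBlur(float2 uv)
{
    float2 t = _SourceTexelSize.xy;
    // 3x3 Gaussian-weighted kernel (weights 4-2-1, summing to 16)
    float4 c  = tex2D(_SourceTex, uv) * 4.0;
    c += tex2D(_SourceTex, uv + float2( t.x, 0.0)) * 2.0;
    c += tex2D(_SourceTex, uv + float2(-t.x, 0.0)) * 2.0;
    c += tex2D(_SourceTex, uv + float2(0.0,  t.y)) * 2.0;
    c += tex2D(_SourceTex, uv + float2(0.0, -t.y)) * 2.0;
    c += tex2D(_SourceTex, uv + float2( t.x,  t.y));
    c += tex2D(_SourceTex, uv + float2(-t.x,  t.y));
    c += tex2D(_SourceTex, uv + float2( t.x, -t.y));
    c += tex2D(_SourceTex, uv + float2(-t.x, -t.y));
    return c / 16.0;
}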
Yes, it is much better with a Gaussian blur.
Now I can use a render texture at ~10% of the screen size instead of relying on the mipmaps.
But I still need to check performance in WebGL.
Thanks guys.
In case anyone needs it, here is the GaussianBlur custom function I’ve used in Amplify Shader Editor:
// Gaussian (disc) blur: Directions rotations around the pixel,
// Quality samples along each direction, plus the centre sample.
float4 Color = tex2D( _Texture, uv );
float Pi = 6.28318530718; // two pi

for (float d = 0.0; d < Pi; d += Pi / Directions)
{
    for (float i = 1.0 / Quality; i <= 1.0; i += 1.0 / Quality)
    {
        Color += tex2D( _Texture, uv + float2(cos(d), sin(d)) * radius * i );
    }
}

// Average over all taps (Directions * Quality samples + the centre one)
Color /= Quality * Directions + 1.0;
return Color;
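If you want to drive it from smoothness like I originally intended, you can feed the radius input something like the line below (_Smoothness is just a placeholder material property here, and the scale is whatever looks right for your texture size):

float radius = _Smoothness * 0.05; // blur radius in UV space, tune to taste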