Hello everyone. I have a script that fades out objects which are too far from the camera. It uses Vector3.Distance to get an amplitude that modifies the alpha of the renderer's material. Nothing complicated:
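The script itself didn't make it into the post; based on the ratio quoted in the answer below (500f / (dist * 50)), it presumably looked something like this (class and field names are my guess):

```csharp
using UnityEngine;

public class FadeByDistance : MonoBehaviour
{
    public Transform cam;       // the camera to measure the distance from
    private Renderer rend;

    void Start()
    {
        rend = GetComponent<Renderer>();
    }

    void Update()
    {
        float dist = Vector3.Distance(cam.position, transform.position);
        Color c = rend.material.color;
        // the ratio discussed in the answer: 500f / (dist * 50f) == 10f / dist
        c.a = Mathf.Clamp01(500f / (dist * 50f));
        rend.material.color = c;    // requires a material with a transparent shader
    }
}
```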
However I have some trouble applying the same effect when the object is too near to the camera. I am trying to get a script which can fade objects that are too far away - and objects that are too near. So you can only see the objects when they are between two distances. I could add a condition based on the distance, but I guess there is a better way, no?
Thanks for your help ++
Well first of all you should simplify your current fading as it’s not clear what’s actually happening here. The ratio 500f / (dist * 50) can be reduced to 10f / dist, which is much easier to follow. Clamped to [0, 1], this gives you 100% alpha in the distance range 0-10. Beyond 10 your object fades out non-linearly, which is kind of strange: you fade out fast at first and then ever more slowly, approaching (but never reaching) alpha 0 as the distance grows. Maybe you want to use a different approach altogether here. You use the inverse distance to fade out, which is a 1/x curve.
Your approach makes it very difficult to specify the actual distance at which the fade out should start and how long the fade out should take. Usually it makes more sense to just use an offset. So if you want to start fading out at distance 5, you compute 5 - dist + 1. This gives a value larger than 1 while dist is smaller than 5, equals 1 at exactly 5, and fades to 0 over the course of 1 unit. Once you have that you can introduce the fade-out length, which would simply be (maxDist + range - dist) / range, or simplified: 1f + (maxDist - dist) / range
This would start fading out at maxDist and last for range units.
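As a quick sanity check of that formula, here it is with the example numbers from above (fade starting at maxDist = 5, lasting range = 1 unit), using Mathf.Clamp01 for the clamping mentioned further down:

```csharp
float fade = Mathf.Clamp01(1f + (maxDist - dist) / range);
// dist = 4.0  ->  1 + (5 - 4.0) / 1 = 2.0  -> clamped to 1 (fully visible)
// dist = 5.0  ->  1 + (5 - 5.0) / 1 = 1.0  (fade starts here)
// dist = 5.5  ->  1 + (5 - 5.5) / 1 = 0.5  (half faded)
// dist = 6.0  ->  1 + (5 - 6.0) / 1 = 0.0  (invisible)
```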
To combine multiple ranges you can simply calculate them separately and multiply them together. This works as long as they don’t overlap. So if you want to fade out below a certain closest distance over the course of x units, you can use
1 + (dist - minDist)/range
Of course all those values need to be clamped between 0 and 1. So you can do
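The code block seems to have been lost from the post; judging by the concrete numbers used in the explanation that follows (minDist = 1 with a 0.2 unit fade, maxDist = 10 with a 2 unit fade), it was presumably along these lines:

```csharp
float dist = Vector3.Distance(cam.position, transform.position);
float fMin = Mathf.Clamp01(1f + (dist - 1f) / 0.2f);  // near fade: starts at minDist = 1, lasts 0.2 units
float fMax = Mathf.Clamp01(1f + (10f - dist) / 2f);   // far fade: starts at maxDist = 10, lasts 2 units
float fade = fMin * fMax;                             // 1 only while both ranges allow it
```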
So here, as long as dist is greater than minDist and smaller than maxDist, both factors fMin and fMax are 1f, so you get a fade value of 1. However, if dist drops below minDist, fMin linearly fades to 0 over the course of 0.2 units. Likewise, when dist is greater than maxDist, it fades out over the course of 2 world-space units.
So at a distance of 0.8 or closer and a distance of 12 or greater, the object would be completely invisible.
If you want more control over the fade out, you may want to use an AnimationCurve where you actually use the raw distance on the time axis. That way you can do whatever you want. You can simply create a public field of type AnimationCurve and edit the curve in the editor. At runtime you just call Evaluate with your distance to get your fade value.
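A minimal sketch of that AnimationCurve approach (field names are just examples):

```csharp
using UnityEngine;

public class FadeByCurve : MonoBehaviour
{
    // Edit this curve in the inspector: the time axis is the raw distance
    // in world units, the value axis is the alpha (0..1).
    public AnimationCurve fadeCurve = AnimationCurve.Linear(0f, 1f, 10f, 0f);
    public Transform cam;
    private Renderer rend;

    void Start()
    {
        rend = GetComponent<Renderer>();
    }

    void Update()
    {
        float dist = Vector3.Distance(cam.position, transform.position);
        Color c = rend.material.color;
        c.a = fadeCurve.Evaluate(dist);   // the curve does all the shaping and clamping
        rend.material.color = c;
    }
}
```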
@Bunny83 Thank you for your help. You gave me a clear explanation and I was able to modify my previous script. The wiki link is really hard to understand because of the heavy math - I am not very skilled. I will try to add an AnimationCurve to this fading effect. I have understood what you meant. Thank you for your help, I really appreciated it. I wish you the best ++