I have a screen fader that fades the screen in and out linearly. I wanted to apply a curve to it and decided on the graph of y = x^(1/2).
If you remember, that graph is supposed to rise steeply at first and then flatten out as x increases.
So I wrote a method, applied it to the fade, and it didn't work the way I expected. To figure out why, I took a look at what data was being output:
void Fade()
{
    float count = 0;      // current x value along the curve
    float change = 0;     // y = x^(1/2) for the current count
    float fadeAlpha = 0;  // the alpha being faded towards 1

    while (fadeAlpha < 1)
    {
        change = Mathf.Pow(count, 1f / 2f);   // evaluate x^(1/2)
        fadeAlpha += change;                  // add it to the alpha
        Debug.Log("fadeAlpha is " + fadeAlpha);
        Debug.Log("the change is " + change);
        count += 0.001f;                      // step along the x axis
    }
}
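For reference, here is roughly what the first few passes through the loop produce (worked out by hand from the same formula, so the exact console output will differ in formatting):

count = 0.000: the change is 0.0000, fadeAlpha is 0.0000
count = 0.001: the change is 0.0316, fadeAlpha is 0.0316
count = 0.002: the change is 0.0447, fadeAlpha is 0.0763
count = 0.003: the change is 0.0548, fadeAlpha is 0.1311
count = 0.004: the change is 0.0632, fadeAlpha is 0.1944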
Of course 1/2 could have been written as 0.5f, or I could have used Mathf.Sqrt(count) instead, but they all give the same results.
The numbers I was seeing didn't make sense. The graph starts out steep and flattens the further we go along the x axis, so the value of the variable change should start out as larger numbers and become smaller numbers near the end. But in reality it was the opposite. I even took the data and plotted it, and the resulting curve is the inverse of what I'm supposed to get: it starts out shallow and gets steeper and steeper.
I have spent a lot of time trying to work out where I screwed up, but I haven't been able to. I am using the equation
y = x^(1/2)
but am instead getting a graph that looks more like
y = x^2
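To put numbers on that: at x = 0.25, x^(1/2) = 0.5 while x^2 = 0.0625, and at x = 0.81, x^(1/2) = 0.9 while x^2 = 0.6561. One curve sits above the line y = x and the other sits below it; they are mirror images of each other across that line, which is exactly the "inverse" shape I'm seeing.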
I would appreciate it if anyone could help me figure out what I'm doing wrong.



