I've always thought AnimationCurves are nice and intuitive to use, especially for artists, but also whenever you want more flexibility than a fixed easing function. I assumed, though, that this flexibility would come with a small performance penalty.
Assumptions are never a good idea, so I did some testing and found something that really surprised me: while AnimationCurve.Evaluate takes 10-30 times as long as Mathf.Lerp on the first call (though we're talking 0.1 ms vs. 0.01 ms, so nothing dramatic in absolute numbers), over subsequent loops it is actually a little faster than Mathf.Lerp, even with a somewhat complex curve.
Does anyone know how this is achieved? Mathf.Lerp seems trivial, certainly much less work than evaluating a Bézier segment, so this is completely counter-intuitive. Either there's a mistake in my test script, or there's some magic going on.
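For reference, Unity's documentation says Mathf.Lerp clamps t to [0, 1] and blends linearly, so a plain C# sketch of the equivalent math (no Unity dependency; "LerpRef" and "Clamp01Ref" are my own names, not Unity's actual implementation) looks like this:

```csharp
// Sketch of what Mathf.Lerp computes according to Unity's docs:
// a linear blend between a and b, with t clamped to [0, 1].
static float Clamp01Ref(float t)
{
    if (t < 0f) return 0f;
    if (t > 1f) return 1f;
    return t;
}

static float LerpRef(float a, float b, float t)
{
    return a + (b - a) * Clamp01Ref(t);
}
```

There is very little work here (one clamp, one multiply, two adds), which is why the benchmark result above is so surprising.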
Here’s the test-script:
using UnityEngine;

public class SimpleCurve : MonoBehaviour {

    // Assign a curve in the inspector; an empty AnimationCurve evaluates to 0.
    public AnimationCurve curve = new AnimationCurve();
    public int counter = 1000000;

    public float Evaluate(float t) {
        return curve.Evaluate(t);
    }

    public void Awake() {
        // Run both tests several times so the first-call overhead
        // shows up separately from the steady-state timings.
        for (int i = 0; i < 10; i++) {
            TestA();
            TestB();
        }
    }

    private void TestA() {
        var sw = System.Diagnostics.Stopwatch.StartNew();
        float result = 0;
        for (int i = 0; i < counter; i++) {
            result = curve.Evaluate((float)i / counter);
        }
        sw.Stop();
        // Log the result so the compiler can't optimize the loop away.
        Debug.LogFormat("Using curve for {3} iterations took:"
            + " {0} ms, {1} ticks, result was: {2}",
            sw.ElapsedMilliseconds, sw.ElapsedTicks,
            result, counter);
    }

    private void TestB() {
        var sw = System.Diagnostics.Stopwatch.StartNew();
        float result = 0;
        for (int i = 0; i < counter; i++) {
            result = Mathf.Lerp(0.5f, 1.0f, (float)i / counter);
        }
        sw.Stop();
        Debug.LogFormat("Using lerp for {3} iterations took:"
            + " {0} ms, {1} ticks, result was: {2}",
            sw.ElapsedMilliseconds, sw.ElapsedTicks,
            result, counter);
    }
}
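One thing worth ruling out before crediting AnimationCurve with any magic: under Mono, the first call into each method also pays JIT-compilation and lazy-initialization costs, which inflates the first measurement. A common micro-benchmark pattern is to run one untimed warm-up pass before measuring; a minimal sketch of how Awake could be restructured (this is my suggestion, not part of the original test):

```csharp
public void Awake() {
    // Untimed warm-up pass so JIT compilation and any lazy
    // initialization inside Evaluate/Lerp don't skew the first result.
    TestA();
    TestB();

    // The timed runs now measure steady-state performance only.
    for (int i = 0; i < 10; i++) {
        TestA();
        TestB();
    }
}
```

If the 10-30x gap on the first call disappears with a warm-up pass, it was method-initialization cost rather than anything specific to curve evaluation.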