This is the one I always get into arguments about on forums:
“Myth 4: Every Optimization Yields Some Performance Gain”
People will agonize over optimizing a FOR loop in some obscure routine, and no matter how hard you try to convince them that it doesn’t really matter, they are still committed to spending at least two man-days refactoring it over and over until it’s optimal.
Bah!
With ‘Shooter in the Abstract’ I took a very pragmatic approach. I got the game working using very easy-to-read, easy-to-follow code. Then I saved off the code branch and started to look for slowdowns. I optimized the biggest offenders, and when it was fast enough, I released it. You shouldn’t need to do more than that unless you’re writing something mission-critical for NASA.
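To make that concrete, here’s a rough sketch of the kind of measurement I mean (the routine and the numbers are made up, it just shows getting a figure before deciding anything is worth refactoring):

using System;
using System.Diagnostics;

static class ProfilingSketch
{
    static void Main()
    {
        // Measure first, optimize second: time the suspect routine.
        var sw = Stopwatch.StartNew();
        for (int i = 0; i < 1000; i++)
            SuspectRoutine();          // hypothetical routine under suspicion
        sw.Stop();
        Console.WriteLine($"SuspectRoutine: {sw.ElapsedMilliseconds} ms over 1000 calls");
        // Only the routines that dominate this kind of measurement are
        // worth refactoring; everything else stays easy to read.
    }

    static void SuspectRoutine()
    {
        // Stand-in for real game logic.
        double acc = 0;
        for (int i = 1; i < 10000; i++) acc += Math.Sqrt(i);
    }
}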
I think this is the hardest part of optimization: finding the real bottlenecks. Apple has done a really good job here with their OpenGL Profiler (part of the Xcode tools). I think this tool should also be able to profile Unity apps.
I think it depends on your game. ‘Shooter’ wasn’t GPU-bound in any way whatsoever, so I didn’t really need to know what OpenGL was doing. All of my optimization was on the C# script level.
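For a flavor of what script-level optimization looks like (an illustrative Unity snippet, not code from ‘Shooter’): the classic fix is caching a component lookup instead of repeating it every frame.

using UnityEngine;

public class EnemyMover : MonoBehaviour   // hypothetical class name
{
    private Rigidbody body;               // cached reference

    void Start()
    {
        body = GetComponent<Rigidbody>(); // one lookup, not one per frame
    }

    void FixedUpdate()
    {
        // Before: GetComponent<Rigidbody>().AddForce(...) on every physics step.
        body.AddForce(Vector3.forward * 5f);
    }
}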
Optimisation isn’t what it was when Knuth was writing, either. Caches, pipelines and automatic memory management are major performance issues nowadays. Knuth’s work was largely about analysing algorithms mathematically and showing how an efficient algorithm ultimately executed fewer instructions than an inefficient one. However, it’s much less clear-cut now (e.g. it’s rare to see search trees and binary search in professional code these days because of their poor cache usage; optimising them is quite tricky).
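As a sketch of the cache argument (illustrative code, not from any real project): a node-based tree scatters its nodes across the heap, so every hop risks a cache miss, while a sorted array keeps everything in one contiguous block.

using System;

class TreeNode                       // classic pointer-chasing structure
{
    public int Key;
    public TreeNode Left, Right;
}

static class CacheSketch
{
    static bool TreeContains(TreeNode node, int key)
    {
        while (node != null)         // each hop may touch a cold cache line
        {
            if (key == node.Key) return true;
            node = key < node.Key ? node.Left : node.Right;
        }
        return false;
    }

    static bool ArrayContains(int[] sorted, int key)
    {
        // One contiguous allocation; for small collections a plain linear
        // scan over the same array is often faster still.
        return Array.BinarySearch(sorted, key) >= 0;
    }

    static void Main()
    {
        var root = new TreeNode { Key = 5, Left = new TreeNode { Key = 2 }, Right = new TreeNode { Key = 8 } };
        int[] sorted = { 2, 5, 8 };
        Console.WriteLine(TreeContains(root, 8) && ArrayContains(sorted, 8)); // True
    }
}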
For anyone who doesn’t want to study CPU architecture in depth, I think the best optimisation advice is to code for clarity and aim to make thorough use of standard library code. Oh, and take all classical optimisation advice with a pinch of salt. A lot of obsolete “wisdom” is still taught and propagated by people who should know better!
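A trivial illustration of the “use the standard library” point (a made-up example): the hand-rolled loop and the library call do the same job, but the library call is clearer and is maintained by people who do worry about the hardware details.

using System;
using System.Linq;

static class StdLibSketch
{
    static void Main()
    {
        int[] scores = { 42, 7, 99, 13 };

        // Hand-rolled maximum:
        int max = scores[0];
        for (int i = 1; i < scores.Length; i++)
            if (scores[i] > max) max = scores[i];

        // Standard library:
        int max2 = scores.Max();

        Console.WriteLine($"{max} == {max2}");
    }
}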
Knuth is still correct.
Optimization techniques change with hardware (and software), but that hardly makes Knuth’s fundamental point obsolete: write correct code first, optimize second or not at all. The fact that some of Knuth’s algorithms are no longer efficient on current hardware is quite beside the point.
Incidentally, Knuth’s remark is often taken to mean that thinking about performance when designing software is a bad idea. The actual quotation is this:
“We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil. Yet we should not pass up our opportunities in that critical 3%.”
This sounds to me like saying that carefully crafting every for-loop to shave instructions (or unrolling loops, as in the sketch below) is usually a waste of time, or worse (especially if you don’t know that the code is performance-critical), but very worthwhile in specific cases; and not that you shouldn’t consider performance when designing a software system.
I don’t think any changes in technology will affect the correctness of these sentiments, because they’re not that specific.
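For what it’s worth, the “small efficiency” in question looks something like manual loop unrolling. This is an illustrative sketch only: the plain loop is what most code should contain, and the unrolled version only earns its ugliness inside a measured hot spot.

using System;

static class UnrollSketch
{
    static long SumPlain(int[] a)
    {
        long sum = 0;
        for (int i = 0; i < a.Length; i++) sum += a[i];
        return sum;
    }

    static long SumUnrolled(int[] a)
    {
        long sum = 0;
        int i = 0;
        int limit = a.Length - (a.Length % 4);
        for (; i < limit; i += 4)            // four elements per iteration
            sum += a[i] + a[i + 1] + a[i + 2] + a[i + 3];
        for (; i < a.Length; i++)            // handle the leftovers
            sum += a[i];
        return sum;
    }

    static void Main()
    {
        int[] data = { 1, 2, 3, 4, 5, 6, 7 };
        Console.WriteLine(SumPlain(data) == SumUnrolled(data)); // True: same result
    }
}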