I had heard a long time ago that using < was computationally faster than <=. The reasoning was that <= was just a shorthand for < and =, which meant that the program would have to perform two comparison checks. Can anyone confirm or deny this?
Stuff like < vs <= is pretty far down on the list of things to worry about. The usual rules, in order, are:
o Don’t optimize early. Always choose to write a readable program over a “fast” one. There’s a good chance things will change and your optimized code section will be a pain to update.
o Don’t bother with infrequent code sections. Monsters only die once, so no point making “die” run faster. Focus on things that happen every frame.
o Look at the big picture. Does target acquisition need to run every frame? Does every monster keep a local copy of common data that could be stored in just one place? (Bigger programs don't fit in the L1 cache, so they run slower.) Shouldn't that giant nested-if/switch be turned into an array lookup?
o Don’t assume you’re smarter than the compiler. It’s likely that if (a <= b) do C; else do D; will get flipped around during compilation to if (a > b) do D; else do C; anyway.
<= is not any faster or slower than <: on modern CPUs both compile to a single compare-and-branch instruction. You can confirm this by running a simple benchmark, or see the question about this on Stack Overflow.