Where does it actually give you the FPS count? I've tried both the wiki FPS counter scripts and the internal profiler. Currently my game is pretty slow, around 10-15 FPS at most, but the FPS counter from the wiki says it's running at 33.33 FPS (set to 30 FPS in Xcode). Can someone tell me how to get an accurate readout?
In the profiler output example on that page you will see the following:
frametime> min: 31.9 max: 37.8 avg: 34.1
This is how much time (in ms) it takes to render one frame. So if we take the average there of 34.1 ms, it only takes some simple math to get your FPS (there are 1000 ms in 1 second): 1000 / 34.1 ≈ 29.33 FPS.
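If it helps, here is a tiny sketch of that conversion in plain Python (not engine code, just the arithmetic above), using the averaged frametime from the profiler line:

```python
def fps_from_ms(frame_ms: float) -> float:
    """Convert a per-frame time in milliseconds to frames per second."""
    return 1000.0 / frame_ms

def ms_from_fps(fps: float) -> float:
    """Convert frames per second back to milliseconds per frame."""
    return 1000.0 / fps

# Average frametime reported by the profiler above:
print(fps_from_ms(34.1))  # ~29.33 FPS
print(ms_from_fps(30.0))  # ~33.33 ms per frame (your 30 FPS cap)
```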
It's actually better when measuring performance to think in terms of milliseconds per frame rather than frames per second. The reason is that FPS isn't a linear measurement: 30 FPS is 2x slower than 60 FPS (a 30 FPS difference), and 15 FPS is 2x slower than 30 FPS (a 15 FPS difference). This means that saying "this fix gained 10 FPS" represents entirely different improvements depending on what your FPS was before. Gaining 10 FPS when you were running at 120 is small potatoes, but gaining 10 FPS when you were at 15 FPS is MASSIVE.
Milliseconds per frame, however, is a linear measurement: a 2 ms improvement is a 2 ms improvement everywhere, which makes differences between builds/optimizations/logic blocks much easier to compare accurately.
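To make that concrete, here is a small sketch (plain Python again, numbers taken from the examples above) showing that the same "+10 FPS" gain means very different savings in milliseconds per frame:

```python
def ms_per_frame(fps: float) -> float:
    """Milliseconds spent on one frame at the given frame rate."""
    return 1000.0 / fps

# Gaining 10 FPS when you were at 120 FPS:
saved_at_120 = ms_per_frame(120) - ms_per_frame(130)  # ~0.64 ms saved per frame

# Gaining 10 FPS when you were at 15 FPS:
saved_at_15 = ms_per_frame(15) - ms_per_frame(25)     # ~26.67 ms saved per frame

print(f"{saved_at_120:.2f} ms vs {saved_at_15:.2f} ms")
```

Same FPS delta, but roughly a 40x difference in actual per-frame time saved, which is why comparing builds in milliseconds is less misleading.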