For the current mobile game I’m developing, the FPS from the Unity stats window was between 250 and 300… and it was running smoothly.
Some people kept telling me to keep the frame rate at around 30 FPS… but when I run an empty scene, the stats window already shows a value between 350 and 400 FPS…
I’m kind of confused about the FPS topic. Can someone explain the difference between the FPS in the Unity stats window and the FPS on an iOS device? Or the relation between them?
250 to 300 is epic. But most bigger games require more resources and can’t reach rates that high, so they aim for at least 24 frames per second, since roughly 24 FPS is what the human eye needs to perceive fluid motion. If it’s higher, there’s no problem with it. 
Keep going.
Cheers
Yatekii
When running in the Unity editor, everything runs on your computer, which is a lot faster than an iOS device even with the editor overhead. There is no attempt to emulate actual iOS device speed in the editor, nor would something like that be feasible. The stats window is also not an accurate count of the frames per second you’re actually getting overall; use something like the FramesPerSecond script instead.

On the device, 30fps is the default cap, but you can change it to 60 using Application.targetFrameRate (in Unity 3.5). Since the screen on iOS devices is always vsynced, 30 and 60 are the only values that make sense (you could go below 30, but that would be bad). 60 is twice as fluid as 30, but uses more power. The more fps the better, though anything above the screen refresh rate is wasted, since those frames will never be seen. When vsync is on, the fps can’t go higher than the refresh rate anyway.
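For reference, here’s a rough sketch of that kind of setup, combining the targetFrameRate cap with an averaged FPS counter; the class name, update interval, and on-screen label are just illustrative, not the exact wiki FramesPerSecond script:

using UnityEngine;

// Rough sketch: averages the frame rate over an interval instead of
// showing the noisy instantaneous value from the stats window.
public class FPSCounter : MonoBehaviour
{
    // How often (in seconds) to refresh the displayed value.
    public float updateInterval = 0.5f;

    float accum;    // accumulated FPS over the interval
    int frames;     // frames drawn over the interval
    float timeLeft; // time remaining in the current interval
    float fps;      // last computed average

    void Start()
    {
        // 30 is the iOS default; 60 matches the vsynced screen refresh.
        Application.targetFrameRate = 60;
        timeLeft = updateInterval;
    }

    void Update()
    {
        timeLeft -= Time.deltaTime;
        accum += Time.timeScale / Time.deltaTime;
        ++frames;

        if (timeLeft <= 0f)
        {
            fps = accum / frames;
            timeLeft = updateInterval;
            accum = 0f;
            frames = 0;
        }
    }

    void OnGUI()
    {
        // Display the averaged FPS rather than the per-frame value.
        GUI.Label(new Rect(10, 10, 120, 25), fps.ToString("F1") + " FPS");
    }
}

Attach it to any GameObject in the scene; the averaged number on the device is a much better indication of real performance than the editor stats readout.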
–Eric
I see, I see… Thanks, I’ll have a look at that FramesPerSecond script~ 