Unity 3.2 frame rate display

I use the frame rate indicator in the Stats pop-up in the Game window to measure the performance of my iPhone games.

Before the 3.2 update my game was typically getting 1000 FPS, and I could see that go up or down depending on the changes I made. Since the update it is stuck at about 80 FPS, and I see no difference from any changes I make. This still happens if I switch the platform away from iOS.

Any ideas on what has changed? Or what I can do to get the old situation back?

That’s not a good idea; it’s only there for general editor info, not as a performance tool. Use the profiler instead to see what actual performance on the device is.

–Eric

It is not the only thing I use, but I find it good for testing things very quickly. More precise testing comes later.

I’ve been searching for an answer with no luck. Perhaps I shouldn’t have mentioned the iOS platform, as some of my games may end up on other platforms. I’ve tried multiple projects and they all display frame rates of around 80 FPS. I tried the frame rate display code from the wiki and it showed around 60 FPS.

Has the frame rate been limited in 3.2? Is there any way to remove the limitation?
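For reference, the script I was testing with is along these lines (a rough sketch from memory, with my own class and variable names, not the exact wiki code):

```csharp
using UnityEngine;

// Rough sketch of a frame-rate readout in the spirit of the wiki script.
public class FPSDisplay : MonoBehaviour
{
    public float updateInterval = 0.5f; // how often the readout refreshes

    private float accum;    // FPS accumulated over the interval
    private int frames;     // frames drawn over the interval
    private float timeLeft;
    private string text = "";

    void Start()
    {
        timeLeft = updateInterval;
    }

    void Update()
    {
        timeLeft -= Time.deltaTime;
        accum += Time.timeScale / Time.deltaTime;
        ++frames;

        // Refresh the readout once per interval.
        if (timeLeft <= 0f)
        {
            float fps = accum / frames;
            text = System.String.Format("{0:F1} FPS ({1:F1} ms)", fps, 1000f / fps);
            timeLeft = updateInterval;
            accum = 0f;
            frames = 0;
        }
    }

    void OnGUI()
    {
        GUI.Label(new Rect(10, 10, 200, 20), text);
    }
}
```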

3.2 doesn’t have any frame rate limitations compared to 3.1; everything from 3.1 that I have is the same or faster in 3.2.

–Eric

Thanks Eric,

Something has changed though. By limitation, I meant I was wondering whether an Application.targetFrameRate-style value has been added as a default.

Would someone mind checking if they can get more than 80 FPS on a very simple project, maybe just an empty scene? Then I would know if it is just me.
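If it helps, this is the kind of default I am wondering about. A quick script like the sketch below (my own names, purely a diagnostic) would log whether any targetFrameRate cap is in effect and clear it:

```csharp
using UnityEngine;

public class FrameRateCapCheck : MonoBehaviour
{
    void Start()
    {
        // -1 means "no cap / platform default"; anything else means a limit
        // has been set somewhere (a script, a stale preference, etc.).
        Debug.Log("Application.targetFrameRate = " + Application.targetFrameRate);
        Application.targetFrameRate = -1; // clear any cap, just in case
    }
}
```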

As I said, Unity 3.2 is as fast or faster than Unity 3.1, in all projects. A simple scene is thousands of FPS, going by the built-in stats anyway.

–Eric

I have been doing more tests: changing various settings (graphics emulation, etc.), deleting and reinstalling Unity, rolling back to 3.1 and then upgrading again to 3.2, but no luck. I am still getting a restricted FPS display. This is happening on my previous project, my current project, and an empty test project.

The Stats readout in the Game view shows a high number from when I open Unity until I press Play and the game is running. It then settles at around 80 FPS if I have Maximise on Play selected; if it is not selected, I get around 100-110 FPS.

I have checked on another computer running 3.2 and I get around 1000 FPS on my previous project and 4000 for the empty project, which are the figures I get on my own computer if I run my backup of 3.1.

The 3.2 readout on this machine should vary between the two projects, but it doesn’t. I am running a 24" iMac with a 2.16 GHz Core 2 Duo, 2 GB RAM, over 100 MB of free hard drive space (one of the many things I tried was clearing more space), and the latest version of Snow Leopard.

Not only am I missing something that works on other computers (and did previously on this one), but I am worried that something has been set that may have other effects. Any ideas?

Quality Settings → Sync to VBL?

Thanks, but that is one of the settings I have tried. I just tried it again, turning it on at all quality levels, then off at all levels, but no luck.

EDIT: Could it be that Sync to VBL has been enabled in some other way to cause this issue?

EDIT 2: If it were Sync to VBL related, there would be no difference between Maximise on Play turned off or on, but there is. With it turned on, the ms/frame increases by 50%.

Solved:

I deleted the Unity app, and all references to Unity in Library/Preferences (including the com.unity ones, but all were backed up just in case), then reinstalled Unity. Now I am getting the correct FPS display.

The strange thing is that prior to 3.2 I would get a slightly higher frame rate when maximising on play, but now it is much lower.

EDIT: This happened a second time. All I needed to do to correct it was delete all the Unity references in Library/Preferences. It does mean losing any custom window layouts in Unity, but they are a small job to set up again.

Hi, I just wanted to add that the in-editor FPS display is not really that accurate and shouldn’t be taken too seriously when investigating performance. Use the profiler, or even an FPS script.

Also, you should be doing real performance testing by building your project on the intended platform, and not in the editor.

As I mentioned earlier, I use the FPS indicator for quick testing when I am trying different things, and I have found it accurate enough for the broad strokes. Later in the project, when I am looking at optimising, is when I will do more accurate measurements. My current requirement is fast iteration to progress through different experiments, not absolute accuracy.

In 3.1 it was quite accurate. In 3.2 it is accurate too: it displays exactly half of what the profiler shows :wink:

Moonjump, did you by accident change the graphics settings on your computer?
That could have a heavy impact on performance and you may have forgotten about it.
Not that I think you are dumb, but these things happen; it happened to me once :smile:

I don’t think so. The fact that I solved it by removing and replacing Unity and its preferences suggests something was set within Unity. Also, it wasn’t that there was an impact on performance; the frame rate display would still have varied according to load, but it didn’t. I did have an Application.targetFrameRate line within one of my scripts, so maybe that stuck somehow (a big guess there).
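If that script was the culprit, I will guard the line in future so the cap only applies to device builds, something like this sketch (the 30 FPS value is just an example, not my actual setting):

```csharp
using UnityEngine;

public class FrameRateSetup : MonoBehaviour
{
    void Awake()
    {
#if UNITY_IPHONE && !UNITY_EDITOR
        // Cap the frame rate only in the device build.
        Application.targetFrameRate = 30;
#else
        // Leave the editor and other platforms uncapped.
        Application.targetFrameRate = -1;
#endif
    }
}
```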