I just built a new PC with an Intel Core i7-6700K CPU and a GeForce GTX 1080, and inexplicably I'm getting much lower frame rates than expected. They're even lower than on my MacBook Pro with its AMD Radeon R9 M370X.
I'm testing this with a default new project and a default empty scene on a fresh install of Unity 5.4.2f2. No scripts or anything have been added, and I've used the exact same setup on both computers for comparison, including identical quality settings and graphics emulation turned off.
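In case it helps, this is roughly how I could dump those settings to the Console to confirm they really are identical on both machines (just a quick throwaway sketch; the SettingsCheck name is my own placeholder):

```csharp
using UnityEngine;

// Throwaway check: logs the settings that most directly affect frame pacing,
// so I can confirm the PC and the MacBook really are configured the same.
public class SettingsCheck : MonoBehaviour
{
    void Start()
    {
        Debug.Log("Quality level: " + QualitySettings.names[QualitySettings.GetQualityLevel()]);
        Debug.Log("vSyncCount: " + QualitySettings.vSyncCount);
        Debug.Log("targetFrameRate: " + Application.targetFrameRate);
    }
}
```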
My PC is rendering between 30-48 fps, while my MacBook is rendering 75-82 fps. While playing the scene on my PC, HWMonitor shows the GPU running at only 23%, and my CPU cores are all below 10% except one at around 60%. In Unity, the reported CPU time is averaging over 40 ms, which should be much, much lower; that appears to be the bottleneck in the frame rate.
What's super weird is that when I open the Profiler to see what's going on, the frame rate suddenly jumps up to around 70 fps and the CPU time drops to ~17 ms. When I close or hide the Profiler, it goes back to ~35 fps and ~42 ms.
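To make sure the Stats overlay and the Profiler window aren't skewing the numbers themselves, I could measure frame times with a tiny script like this instead (again just a sketch; FrameTimeLogger is a placeholder name):

```csharp
using UnityEngine;

// Accumulates unscaled frame times and logs the average once per second,
// so frame times can be compared with and without the Profiler open.
public class FrameTimeLogger : MonoBehaviour
{
    float accum;
    int frames;

    void Update()
    {
        accum += Time.unscaledDeltaTime;
        frames++;

        if (accum >= 1f)
        {
            float avgMs = accum / frames * 1000f;
            Debug.Log(string.Format("Avg frame time: {0:F1} ms ({1:F0} fps)", avgMs, frames / accum));
            accum = 0f;
            frames = 0;
        }
    }
}
```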
Temps and stats for my hardware are all totally fine. For some reason Unity just doesn't seem to be fully utilizing my hardware, and I don't understand why. It handles more complex scenes okay, but the frame rate still isn't reaching the 60 fps I'd expect.
I've run several 3DMark tests and received good scores for my setup; for example, my Fire Strike Extreme score was 9588. I've overclocked both my GPU and CPU, well within acceptable tolerances, but the frame rate problems were occurring before I overclocked anyway.
As I've been researching this, it seems many others have had frame rate issues with the GTX 1080, but none of the solutions I've found have worked for me yet. I can't figure out what I'm missing. Is it something in Unity? An Nvidia setting? Or is there something wrong with my hardware?
By the way, my BIOS and drivers have all been updated to the latest versions as of today, and I'm running the latest Windows 10.
Any suggestions much appreciated. Thanks!