I’m having some trouble understanding how to effectively turn off VSync in Unity and/or at the machine level. I did the obvious thing first, of course, and set VSync Count to “Don’t Sync” in Unity’s Quality Settings:
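For reference, the same thing can also be forced from script through Unity’s standard QualitySettings API. A minimal sketch (the component name is just a placeholder):

```csharp
using UnityEngine;

// Placeholder component: forces VSync off at startup,
// overriding whatever the active quality level says.
public class DisableVSync : MonoBehaviour
{
    void Awake()
    {
        // 0 = don't wait for the vertical blank at all
        QualitySettings.vSyncCount = 0;

        // -1 = no frame rate cap; render as fast as possible
        Application.targetFrameRate = -1;
    }
}
```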
Then, realizing that VSync can also be forced on applications by the graphics driver, I went to the control panel for my video card as well. It’s one of those integrated Intel graphics processors built into the Core i7, an HD Graphics 4000, to be precise. Under its 3D settings I found the Vertical Sync option and changed it from “On” to “Use Application Settings”:
With these settings, I’d expect the driver to obey Unity’s request not to sync, and the game to just race ahead and blast out frames with no concern for screen tearing. Yet when I run an external build against the profiler, I’m seeing this:
Notice the huge amount of time spent on Device.Present in the GPU profiler. That is usually the “wait for VSync” task. Why is it still on? Are there other places to disable it that I just haven’t looked?
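One thing worth ruling out: a standalone build can start on a different quality level than the editor, where VSync might still be enabled. A quick runtime check (again, the class name is just a placeholder):

```csharp
using UnityEngine;

// Placeholder component: logs which quality level the built player
// actually runs with, and whether that level still has VSync enabled.
public class VSyncSanityCheck : MonoBehaviour
{
    void Start()
    {
        int level = QualitySettings.GetQualityLevel();
        Debug.Log("Quality level: " + QualitySettings.names[level] +
                  ", vSyncCount = " + QualitySettings.vSyncCount);
    }
}
```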
It’s worth noting that this only happens under certain conditions. If I start up an empty scene, everything works as expected and the task takes < 1 ms. I just don’t understand why, under some workloads, it suddenly kicks in, even when disabled in both Unity and the driver software. And when it does kick in, why 33 ms? Along with the ~8 ms from the other, actual GPU tasks, that lands the GPU at ~41 ms per frame. What does that accomplish? On a 60 Hz display, VSync waits should land on multiples of ~16.7 ms, and ~41 ms isn’t one of them, so it doesn’t correspond to any syncing frequency. It’s certainly not waiting for the CPU, either: the sum of all CPU tasks comes to < 5 ms.
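To cross-check the profiler’s numbers against actual frame pacing, the average frame time can also be logged from script. A rough sketch (assumes Time.timeScale is 1; the class name is a placeholder):

```csharp
using UnityEngine;

// Placeholder component: reports the average frame time once per second,
// so the profiler's Device.Present figure can be compared with real pacing.
public class FrameTimeLogger : MonoBehaviour
{
    float accumulated; // summed frame times since the last report
    int frames;        // frame count since the last report

    void Update()
    {
        accumulated += Time.deltaTime; // assumes Time.timeScale == 1
        frames++;

        if (accumulated >= 1f)
        {
            float avgMs = accumulated / frames * 1000f;
            Debug.Log("Avg frame time: " + avgMs.ToString("F1") + " ms (" +
                      (frames / accumulated).ToString("F1") + " FPS)");
            accumulated = 0f;
            frames = 0;
        }
    }
}
```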
I’m on Windows 7, and I even considered that the OS could be forcing VSync to support its 3D-accelerated Aero interface. But disabling that and switching to the Windows Basic theme does nothing for this issue. I’m out of ideas. O_O Anybody know what exactly this is, and, if it’s VSync, why it seems to aim for something that isn’t an integer division of 60 FPS?