How does vsync for WebGL work?

Hi, my team and I have been working on a WebGL build for a demo of an upcoming game, and we're confused about the way vsync works in WebGL. With it enabled, the frame rate doesn't seem to reliably match the refresh rate of the screen/monitor.

Say I'm on a 60 Hz monitor with vsync enabled. On two different PC setups (one laptop and one desktop) we see FPS anywhere from 100+ down to around 30 on the low end. I also tried forcing vsync on for the browser running the WebGL build via the NVIDIA Control Panel, with no difference in the result.

So how does vsync work on the WebGL platform?

How do you measure fps?

Vsync will of course still allow FPS to drop below the monitor refresh rate, so the only real issue here seems to be FPS greater than 60 on a 60 Hz monitor. That could simply be an inaccurate measurement. Or the monitor can indeed go over 60 Hz, perhaps it even supports G-Sync.

Or vsync for some reason isn't working on that system/browser. That should be obvious visually, as you would notice screen tearing while moving the camera. Is that the case? If there is no obvious tearing, then it can't be a vsync issue (though I'm wary of saying that, because some people simply can't or won't notice screen tearing).
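
If you want to sanity-check the numbers independently of any overlay, you can log the timestamps the browser passes to requestAnimationFrame on the page hosting the build; with vsync working on a 60 Hz display, the reported rate should hover around 60. A minimal sketch (plain browser TypeScript/JavaScript, nothing Unity-specific; the logging format is just for illustration):

```ts
// Rough frame-rate probe based on requestAnimationFrame timestamps.
// With vsync working on a 60 Hz display, this should report roughly 60 fps.
let frameCount = 0;
let windowStart = performance.now();

function probe(timestamp: number): void {
  frameCount++;
  const elapsed = timestamp - windowStart;
  if (elapsed >= 1000) {
    console.log(`~${((frameCount * 1000) / elapsed).toFixed(1)} fps over the last second`);
    frameCount = 0;
    windowStart = timestamp;
  }
  requestAnimationFrame(probe);
}

requestAnimationFrame(probe);
```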

I'm using Graphy to see the FPS. Here are some screenshots of the game's WebGL build running above the refresh rate of the monitor. The first two are at 60 Hz and the third is at 144 Hz, but even then it still jumps above that.

My teammate has G-Sync enabled in the first three images, which could be a factor like you mentioned, but even with it disabled it still runs too fast in the fourth image.




As far as I know, the vsync option in Unity does nothing for web games.
The browser itself manages vsync. Chrome does it by presenting the frame two refresh cycles after the draw commands (Safari used to do it after one, but according to the specs it should be two; I don't know if they fixed it).
In the browser, Unity uses the requestAnimationFrame() JavaScript method to drive the update cycle; the browser uses this method to time smooth animations based on the display refresh rate. So in an optimal scenario, on a 60 Hz monitor the browser will run this callback 60 times a second, and on a 144 Hz display it would be 144 times.
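
Conceptually, the loop the browser drives looks something like this. This is a simplified sketch of the requestAnimationFrame pattern, not Unity's actual generated loader code, and updateAndRender is just a hypothetical stand-in for the engine's per-frame work:

```ts
// Simplified sketch of a requestAnimationFrame-driven main loop (not Unity's
// actual generated code). The browser invokes the callback once per display
// refresh (~60/s on a 60 Hz monitor, ~144/s on a 144 Hz one), so frame pacing
// follows the display rather than any engine-side vsync setting.
function mainLoop(timestamp: number): void {
  updateAndRender(timestamp);      // advance the simulation and issue draw commands
  requestAnimationFrame(mainLoop); // schedule the callback for the next refresh
}

// Hypothetical stand-in for whatever per-frame work the engine does.
function updateAndRender(timestamp: number): void {
  // ... game update + WebGL draw calls ...
}

requestAnimationFrame(mainLoop);
```
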
I don't know if one of the latest Unity versions added support for targetFrameRate, but in the past changing that value on the WebGL platform did nothing.

I'm not sure what you are using to monitor the FPS, but if it's something external to Unity or the browser, it might be showing you the refresh rate of the browser or of the HTML canvas element.

There are a couple of articles and videos that better explain how the rAF method and the refresh cycle in browsers work.
In this one there's a diagram that explains it.


What @De-Panther said. For targetFrameRate, Unity uses a timer instead of requestAnimationFrame, which is not recommended for browsers.
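
Roughly, the difference between the two scheduling strategies looks like this. This is just a sketch of the general pattern, not Unity's actual loop code; the function names and the target value are placeholders:

```ts
// Two ways to schedule frames. A timer fires on its own clock, so frames are
// not aligned to the display refresh and the browser may throttle the callbacks.
// requestAnimationFrame is driven by the display, so callbacks stay in step
// with the refresh rate.

// Timer-based loop (roughly what a fixed targetFrameRate implies):
const targetFps = 30; // hypothetical target, for illustration only
function timerLoop(): void {
  updateAndRender(performance.now());
  setTimeout(timerLoop, 1000 / targetFps); // fires when the timer elapses, not on refresh
}

// Display-driven loop (the recommended pattern in browsers):
function rafLoop(timestamp: number): void {
  updateAndRender(timestamp);
  requestAnimationFrame(rafLoop);
}

// Hypothetical per-frame work.
function updateAndRender(timestamp: number): void {
  // ... update + render ...
}
```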
