Graphics API at runtime

Hi!
I noticed that WebGL2 builds perform really poorly on Safari 15 (even on desktop).
If I enable the Auto Graphics API option (or manually add WebGL2 and WebGL1), it will still use WebGL2.

Is there a way to set the Graphics API at runtime, before the Unity instance is created?
That way I could check which browser the game is running in and select the best API accordingly, without impacting the experience in other browsers.

Thanks!

  • David

The WebGL2 issues are currently being addressed by Apple. There’s not really a way of downgrading the WebGL version at runtime. This isn’t something we’d likely add an API or interface for, as we’re working with browsers to ensure WebGL2 is viable so we can finally move away from the limited WebGL1.

If it’s something that you need addressed faster than Apple can get it fixed, then at your own risk you could work out how to replace the _JS_SystemInfo_HasWebGL function defined in Build/*.framework.js. This is the function that tells Unity the maximum WebGL version available. If you detect Safari 15, you could force it to return 1, and Unity would think only WebGL1 is available. This could be done by defining the function in a .jspre plugin, which would overwrite the built-in one. Like I said… at your own risk; Apple will be fixing their bad WebKit release.
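For reference, a minimal sketch of what such a .jspre override might look like, assuming the .jspre definition does take precedence over the built-in function as described above. The file name and the user-agent check are illustrative assumptions, not something verified against a specific Unity version:

```javascript
// Assets/Plugins/WebGL/ForceWebGL1OnSafari15.jspre  (hypothetical file name)
// Sketch: redefine _JS_SystemInfo_HasWebGL so Safari 15 is told that only
// WebGL1 is available, while other browsers keep WebGL2 if they expose it.
var _JS_SystemInfo_HasWebGL = function () {
  var ua = navigator.userAgent || "";

  // Chrome and Edge also contain "Safari" in their UA strings, so exclude them.
  var isSafari = /Safari\//.test(ua) && !/Chrome|Chromium|Edg\//.test(ua);
  var isVersion15 = /Version\/15\./.test(ua);

  if (isSafari && isVersion15) {
    return 1; // Report WebGL1 as the maximum supported version on Safari 15
  }

  // Otherwise report WebGL2 if the browser exposes it, WebGL1 if not.
  return typeof WebGL2RenderingContext !== "undefined" ? 2 : 1;
};
```

Placing the file somewhere under Assets with the WebGL platform enabled in the plugin importer should get it included in the build, but again, treat this as a temporary workaround rather than a supported path.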

Thanks for the quick reply!!
I’ll give it a try; it should be pretty safe since this isn’t intended for a production environment just yet.