WebGL and AntiAliasing

Hey guys. I’m currently building for WebGL and I’ve noticed something a bit frustrating:
WebGL doesn’t seem to respect the Quality Settings anti-aliasing. Look at these images, one from WebGL (a development build) and the other from the editor.


Here the difference isn’t anything to raise alarm over. But in my other project, which uses LineRenderers and jagged geometry, it looks absolutely disgusting.

I hope I didn’t miss anything, but why doesn’t WebGL use anti-aliasing, and how can this be fixed? Thanks.

You aren’t alone on this. I haven’t heard a clear explanation other than how the video card parses the info. But I haven’t been able to get anti-aliasing to work since day one with WebGL. If you find a solution, please post it here.

FWIW, anti-aliasing should work in WebGL (it does for me in Safari, Chrome, and Firefox). However, you cannot change anti-aliasing at runtime, so the default quality level for WebGL must have anti-aliasing enabled, so that it is enabled at startup.

But, considering the reports of anti-aliasing not working at all, maybe some combinations of browsers and GPUs don’t support it? Does this simple, non-Unity WebGL sample anti-alias for you: Triangle Egg - JSFiddle - Code Playground?
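For anyone who wants to run this kind of check outside Unity, a minimal browser-side sketch (plain WebGL 1.0 API, nothing Unity-specific) is to request an antialiased context and read back what the browser actually granted:

```javascript
// Ask the browser for an antialiased WebGL context, then check whether
// the request was actually honoured. On blacklisted GPU/OS/browser
// combinations the browser silently returns a non-multisampled context.
const canvas = document.createElement('canvas');
const gl = canvas.getContext('webgl', { antialias: true });

if (!gl) {
  console.log('WebGL is not available at all');
} else {
  const attrs = gl.getContextAttributes();
  console.log('Anti-aliasing granted:', attrs.antialias);
  console.log('Sample count:', gl.getParameter(gl.SAMPLES));
}
```

If `attrs.antialias` comes back `false` even though it was requested, no WebGL content (Unity-made or not) will be multisampled on that machine.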

@jonas-echterhoff_1 - are you saying in the Quality Settings? I haven’t noticed a big difference with those settings in WebGL. What would be great is camera anti-aliasing, but I seem to remember it isn’t compatible with the latest version… or at least the script gives me errors.

Hey guys. Thanks Jonas, you’re the first Unity dev I’ve had in a thread. Anyway, the example link you sent doesn’t anti-alias. The effect is subtle because of its size, but it’s a no.

I was doing some research on the topic and found out that anti-aliasing is disabled on some OS and GPU combinations. Essentially, browsers like Chrome and Firefox have blacklists where they disable the effect. I tried overriding Firefox’s force-antialiasing flag, but to no avail. I use Unity on OS X Yosemite, with Chrome and Firefox, on an Intel HD 4000. I also tried running the WebGL app on my Windows machine (also with an IGP), but there is still no anti-aliasing. From what I read, though, anti-aliasing might work on Windows systems with discrete cards.

By the way, I notice something interesting with WebGL builds: they seem to take some time - however little - to rev up to their target frame rate.

Anyway, I’ve decided not to use anti-aliasing at all. I’m targeting computers used in educational institutions (so no fancy performance) and I need to maintain 60 FPS throughout. The anti-aliasing image effect kills performance (13 FPS), and Chman’s SMAA port for Unity kills performance just as badly. It seems to me that WebGL is such novel technology that many things are still shaky.

Oh, and since you’ve noted that it works for you, I’ll keep it enabled just in case it works on some target computers.

Ah, that is interesting to know. If the example link does not do AA, then, yeah, there is no AA for you in any WebGL content (Unity-made or not) - but you seem to have figured out the same.

About “revving up to the target frame rate”: that could be because the JIT takes some time to optimize the code. In Firefox, which uses asm.js (where all the code is compiled ahead of time), that should not be the case. In the future, when we have WebAssembly, this should no longer happen.

Okay I see. Thanks Jonas! Is there a way to close the thread?

At Triangle Egg - JSFiddle - Code Playground, anti-aliasing appears to work fine for me (latest 64-bit Chrome), but I can’t get anti-aliasing to work in my build - not through the Quality Settings, nor with an image effect. I am not changing AA at runtime, and the quality settings are 100% confirmed to have AA turned on. What could I be doing wrong?

If image effect antialiasing doesn’t work, then you have a big issue. The image effect should work.


Threads aren’t closable at the OP’s whim; they’re closed if there’s a problem.

Maybe worth noting:

If you have multiple quality presets (some with AA, some without) and you switch between them with QualitySettings.SetQualityLevel(x, true), then after reloading the page the anti-aliasing setting of that quality level will be applied.
So you can hack together a working anti-aliasing switch: use PlayerPrefs or LocalStorage to remember the new quality level and set it again after the reload (otherwise the default quality level is restored on load, and the next time you start the WebGL app your AA setting will be reset).

This worked great for me, @dark_end - thanks for that tip. Has anyone been able to get high-quality anti-aliasing with sprites? Or text rendering that is super clean?

Hi,

not sure if this is the same problem, but the behaviour I’ve observed since the launch of WebGL is that Quality Settings anti-aliasing works as long as you do not use any image effects on your camera. Many (if not all - I haven’t tested them all) image effects, such as Bloom or SSAO, cancel the Quality Settings AA the moment they are activated in a scene! So for us this has been a decision we had to make from project to project: is good AA more important, or do we need image effects for our scene, in which case we fall back on the image-effect-based AA solution that produces much poorer results.

Steps to reproduce:

  • Create a simple scene with some cubes in it
  • Attach, for example, SSAO to the camera but leave it disabled
  • Use a simple script to activate SSAO in the scene at the press of a button
  • Set Quality-Settings AntiAliasing to something like x4

Behaviour:
Quality Settings anti-aliasing works when the scene first starts, but is instantly deactivated/overwritten the moment I press the button to enable SSAO.
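The toggle script from step 3 can be as simple as the sketch below (the component and field names are made up for this example; any OnRenderImage-based image effect on the camera shows the same behaviour):

```csharp
using UnityEngine;

// Minimal repro helper: toggles an image effect component on the camera.
// Drag the disabled effect (e.g. the Standard Assets SSAO component)
// onto the "effect" field in the Inspector.
public class EffectToggle : MonoBehaviour
{
    public MonoBehaviour effect;

    void Update()
    {
        // Built-in MSAA is lost the moment the effect turns on.
        if (Input.GetKeyDown(KeyCode.Space))
            effect.enabled = !effect.enabled;
    }
}
```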

Is this supposed behaviour, or is it a bug?
Will we see a fix for that anytime in the future?
Could someone clarify please?

This is indeed a bug with WebGL (see Unity Issue Tracker - [WebGL] Builtin Anti-Aliasing does not work in scenes with Image Effects) but I can’t say when we’ll have a fix for it.

Any news on this?

This requires support from the browsers for multisampled renderbuffers. WebGL 1.0 does not support this. WebGL 2.0 will, in Firefox 47: 1094458 - Implement RenderbufferStorageMultisample (not sure if/when this will be supported in Chrome).
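For reference, the WebGL 2.0 call in question looks like this (plain JavaScript sketch of allocating a multisampled renderbuffer; this is the capability an engine needs in order to anti-alias when rendering offscreen, e.g. with image effects - not something you call from Unity directly):

```javascript
// WebGL 2.0 only: allocate a multisampled colour renderbuffer for
// offscreen MSAA. In WebGL 1.0 there is no equivalent call, which is
// why antialiasing there only works on the default framebuffer.
const canvas = document.createElement('canvas');
const gl = canvas.getContext('webgl2');

if (!gl) {
  console.log('WebGL 2.0 not supported in this browser');
} else {
  const rb = gl.createRenderbuffer();
  gl.bindRenderbuffer(gl.RENDERBUFFER, rb);
  // Request 4x MSAA, clamped to what the implementation supports.
  const samples = Math.min(4, gl.getParameter(gl.MAX_SAMPLES));
  gl.renderbufferStorageMultisample(gl.RENDERBUFFER, samples, gl.RGBA8, 256, 256);
}
```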

As far as I know, PlayCanvas offers FXAA together with bloom and other post effects.
In Unity, both SMAA by Thomas Hourdel and the AA from Cinematic Effects (Alpha) do not work.
Why is this possible in PlayCanvas but impossible in Unity?

Oh, that is a different question - you are not talking about multisampled rendering, but about AA done using post-processing. I will need to look at each of those shaders to be able to answer that. I might get a chance to do that later this week and will get back to you.

Thanks @dark_end for your tip!

I don’t understand why, but Unity-made WebGL always starts with anti-aliasing turned off… (I use Unity 5.3.4f1)
I used PlayerPrefs to save a flag the first time the quality setting was set to activate anti-aliasing (in the Start method). Then, when I reload the page, anti-aliasing is activated…

@jonas-echterhoff_1, is there any way to set the default quality setting of Unity-made WebGL to the highest, so it always starts with anti-aliasing activated? I set that in Edit → Project Settings → Quality,
and I tried the code in the Start() and Awake() methods with QualitySettings.antiAliasing = 8; and QualitySettings.SetQualityLevel(5, true); … but I always have to refresh the page to see anti-aliasing activated…

Thanks in advance for any help with this!

Has Unity fixed anti aliasing with WebGL?