Hi
Does anyone know if it is possible to antialias 3D objects in WebGL please?
If so, how do I go about enabling it?
I have tried setting the anti-aliasing multi-sample settings under Edit > Project Settings > Quality, but that doesn’t seem to do anything for WebGL builds.
Thank you!
Thank you both very much. I’ll check these things out and get back.
Update: I looked into this a little more and checked the settings you mentioned. I also checked the Unity documentation and it does appear to say that WebGL supports antialiasing. However, the setting that is mentioned has no impact when I compile to WebGL. So either I’m missing something or the documentation is inaccurate at this point in time.
“WebGL supports anti-aliasing on most (but not all) combinations of browsers and GPUs. To use it, anti-aliasing must be enabled in the default Quality Setting for the WebGL platform. Switching quality settings at runtime will not enable or disable anti-aliasing - it has to be set up in the default Quality Setting loaded at player start-up. Note that the different multi-sampling levels have no effect in WebGL.”
I can’t actually see antialiasing listed in the WebGL player settings: Unity - Manual: WebGL Player settings
Clearly WebGL CAN support anti-aliasing, since PlayCanvas does it very nicely: https://playcanvas.com/
Any thoughts or feedback on this would be most welcome.
Thank you!
Update again:
OK, so I think I’ve cracked it! I had a look into the way that quality settings are applied and I realised that the default quality setting for WebGL was “Fastest”, which doesn’t have AA enabled.
I made the default for WebGL “Good” and now the AA is pretty nice. Setting it to “Fantastic” makes it really really nice.
The interesting thing (and I wonder if this is a Unity bug) is that if I build as a Windows exe, the quality level, which is also set to “Fastest”, moves up to “Fantastic” automatically. I guess it is picking up the performance of the PC and adjusting accordingly. This doesn’t happen for WebGL. So it seems that with WebGL we just get whatever we set as the default, and it doesn’t adjust itself.
So basically, if you want WebGL anti-aliasing, get to know the way the quality settings work (Unity - Manual: Quality) and ensure that the default setting (the green tick) is a quality level that uses AA.
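As a quick sanity check, here is a small sketch (a hypothetical Unity C# script, not anything from the posts above) that logs which quality level the player actually loaded with, so you can confirm the default tier is one that has MSAA on:

```csharp
using UnityEngine;

// Hypothetical startup check: logs the quality level the build actually
// started with, and the current MSAA sample count.
// Attach to any object in the first scene.
public class QualityCheck : MonoBehaviour
{
    void Start()
    {
        int level = QualitySettings.GetQualityLevel();
        Debug.Log("Quality level: " + QualitySettings.names[level] +
                  ", MSAA samples: " + QualitySettings.antiAliasing);
    }
}
```

If the log shows 0 MSAA samples in a WebGL build, the default quality tier (the green tick) is the thing to change, since runtime switching won’t enable AA per the documentation quoted above.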
Hope this helps someone else out there who is struggling with this.
Thanks everyone!
This one was in the forums (I didn’t understand what it does, though).
It’s a big Unity issue.
I hope they fix this in the next release.
Do you have any image effects applied to your main camera? If so, disable them; I had the same problem and it worked only after I disabled every image effect on my main camera. Make sure you use forward rendering instead of deferred, because deferred doesn’t support MSAA. And finally, set your WebGL memory to 512 MB or lower. Tell me how it goes.
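The two conditions mentioned above (deferred rendering and image effects on the camera) can be checked at runtime with a sketch like this (a hypothetical helper script, assuming classic image effects implemented as components on the camera):

```csharp
using UnityEngine;

// Hypothetical check for the two MSAA blockers mentioned above:
// deferred rendering, and image-effect scripts attached to the camera.
public class WebGLMsaaCheck : MonoBehaviour
{
    void Start()
    {
        Camera cam = Camera.main;
        if (cam == null) return;

        // Deferred shading does not support MSAA.
        if (cam.actualRenderingPath == RenderingPath.DeferredShading)
            Debug.LogWarning("Deferred rendering active - MSAA will not apply.");

        // Any MonoBehaviour on the camera *might* be an image effect.
        foreach (MonoBehaviour mb in cam.GetComponents<MonoBehaviour>())
            Debug.Log("Component on camera (possible image effect): " + mb.GetType().Name);
    }
}
```

This only flags candidates; you’d still disable the image-effect components by hand in the Inspector to confirm which one breaks AA.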
Anti-aliasing is supported, but it only works if no image effects are enabled.
With WebGL 2.0 there is no such limitation, so image effects and AA can be used together.
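If you want to confirm at runtime whether the build actually got a WebGL 2.0 context, a sketch like the following may help (assumption: Unity reports a WebGL 2.0 context as `GraphicsDeviceType.OpenGLES3` and WebGL 1.0 as `OpenGLES2`):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Assumption: in Unity WebGL builds, a WebGL 2.0 context is reported
// via SystemInfo as OpenGLES3, and WebGL 1.0 as OpenGLES2.
public class ContextCheck : MonoBehaviour
{
    void Start()
    {
        bool webgl2 = SystemInfo.graphicsDeviceType == GraphicsDeviceType.OpenGLES3;
        Debug.Log("Graphics device: " + SystemInfo.graphicsDeviceType +
                  " (WebGL 2.0: " + webgl2 + ")");
    }
}
```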
Been a while since this was posted. If you are building a WebGL app in Unity 6 and your build appears low-resolution and not anti-aliased, ensure that the quality settings in “Player Settings” look like this: