Hey guys. I'm currently building for WebGL and I've noticed something a bit frustrating: WebGL doesn't seem to use the QualitySettings anti-aliasing value. Look at these images, one from WebGL (the one with the development build tag) and the other from the Editor. Here the difference isn't anything to raise alarm over, but in my other project, where I use LineRenderers and unsmooth geometry, it looks absolutely disgusting. I hope I didn't miss anything, but why doesn't WebGL use anti-aliasing, and how can this be rectified? Thanks
You aren't alone on this. I haven't heard a clear explanation other than how the video card parses the info, but I haven't been able to get anti-aliasing to work since day one with WebGL. If you find a solution, please post it here.
FWIW, anti-aliasing should work in WebGL (it does for me in Safari, Chrome and Firefox). However, you cannot change anti-aliasing at runtime, so the default quality level for WebGL must have anti-aliasing enabled, so that it is enabled at startup. But, considering the reports of anti-aliasing not working at all, maybe some combinations of browsers and GPUs don't support it? Does this simple, non-Unity WebGL sample anti-alias for you: http://jsfiddle.net/59v3101e/17/ ?
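If you want to check programmatically (rather than eyeballing the fiddle) whether the browser actually granted an antialiased context, the standard way is to request one and then read back `getContextAttributes()`. A minimal sketch — the helper name and the injected `getContext` parameter are mine, used so the logic is testable outside a browser:

```javascript
// Ask for an antialiased WebGL context and check whether the browser
// actually honored the request. Browsers on blacklisted GPU/driver
// combinations may silently hand back a non-antialiased context.
function contextHasAA(getContext) {
  const gl = getContext('webgl', { antialias: true });
  if (!gl) return false; // WebGL unavailable at all
  return !!gl.getContextAttributes().antialias;
}

// In a real page you would pass the canvas's own getContext:
// contextHasAA((type, attrs) =>
//   document.createElement('canvas').getContext(type, attrs));
```

If this returns false even though you requested `antialias: true`, no amount of Unity-side settings will help — the browser has overridden you.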
@jonas echterhoff - are you saying in the Quality Settings? I haven't noticed a big difference with those settings in WebGL. What would be great is camera anti-aliasing, but I seem to remember it isn't compatible with the latest version... or at least the script gives me errors.
Hey guys. Thanks Jonas, you're the first Unity dev I've had in a thread. Anyway, the example link you sent doesn't anti-alias for me. The effect would be negligible because of its size anyway, but it's a no. I did some research on the topic and found out that anti-aliasing is disabled on some OS and GPU combinations: browsers like Chrome and Firefox have blacklists on which they disable the effect. I tried overriding Firefox's force-antialiasing flag, but to no avail. I use Unity on OS X Yosemite with Chrome and Firefox, on an Intel HD 4000. I also tried running the WebGL app on my Windows machine (also with an IGP), but there is still no anti-aliasing. From what I read, though, anti-aliasing might work on Windows systems with discrete cards.

By the way, I noticed something interesting with WebGL builds: they seem to take some time - however little - to rev up to their target frame rate.

Anyway, I've decided not to use anti-aliasing at all. I'm targeting computers used in educational institutions (so no fancy performance) and I need to maintain 60 FPS throughout. The anti-aliasing image effect kills performance (13 FPS), and Chman's SMAA port for Unity kills it just as badly. It seems to me that WebGL is such novel technology that many things are still shaky. Oh, and because you've noted that it works for you, I'll keep it enabled just in case it works on some target computers.
Ah, that is interesting to know. If the example link does not do AA, then, yeah, there is no AA for you in any WebGL content (Unity-made or not) - but you seem to have figured out the same. As for "revving up to the target frame rate", that could be because the JIT takes some time to optimize the code. In Firefox, which has asm.js (where all the code is compiled ahead of time), that should not be the case. In the future, when we have WebAssembly, this should no longer happen.
At http://jsfiddle.net/59v3101e/17/ anti-aliasing looks like it's working fine for me (latest 64-bit Chrome), but I can't get anti-aliasing to work in Unity - not through the Quality Settings, nor with an image effect. I am not changing AA at runtime, and the quality settings are 100% confirmed to have AA turned on. What could I be doing wrong?
Maybe worth noting: if you have set up multiple quality presets (some with AA, some without) and you change between them with SetQualityLevel(x, true), then after reloading the page the anti-aliasing setting of that quality level will be applied. So you can hack together a working anti-aliasing switch by using PlayerPrefs or LocalStorage to remember the new quality level and set it again on startup (otherwise the default quality level is restored after a reload, and the next time you start the WebGL app your AA setting will be reset).
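The workaround described above could be sketched roughly like this (class and key names are mine, not a standard API; QualitySettings and PlayerPrefs are the real Unity calls):

```csharp
using UnityEngine;

// Remember the chosen quality level across page reloads, so the MSAA
// setting of that level is applied at startup.
public class QualityLevelPersistence : MonoBehaviour
{
    const string Key = "savedQualityLevel";

    void Awake()
    {
        // On startup, re-apply the level saved during a previous visit.
        if (PlayerPrefs.HasKey(Key))
            QualitySettings.SetQualityLevel(PlayerPrefs.GetInt(Key), true);
    }

    // Call this instead of SetQualityLevel directly; the new AA setting
    // only takes effect after the page is reloaded.
    public void SetAndRemember(int level)
    {
        QualitySettings.SetQualityLevel(level, true);
        PlayerPrefs.SetInt(Key, level);
        PlayerPrefs.Save();
    }
}
```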
This worked great for me, @dark_end - thanks for that tip. Has anyone been able to get high-quality anti-aliasing with sprites? Or text rendering that is super clean?
Hi, not sure if this is the same problem, but the behaviour I've observed since the launch of WebGL is that Quality-Settings anti-aliasing works as long as you do not use any image effects on your camera. Many (if not all - I haven't tested them all) image effects, such as Bloom or SSAO, cancel the Quality-Settings AA the moment they are activated in a scene! So for us this has been a decision we had to make from project to project: is good AA more important, or do we need image effects for our scene and can then fall back on the image-effect-based AA solutions, which produce much poorer results?

Steps to reproduce:
- Create a simple scene with some cubes in it
- Attach, for example, SSAO to the camera but leave it disabled
- Use a simple script to activate SSAO in the scene at the press of a button
- Set Quality-Settings anti-aliasing to something like 4x

Behaviour: Quality-Settings anti-aliasing works when the scene first starts, but is instantly deactivated/overwritten/whatever once I press the button to enable SSAO.

Is this intended behaviour, or is it a bug? Will we see a fix for it anytime in the future? Could someone clarify, please?
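For reference, the toggle script from the repro steps above could look something like this (a minimal sketch; `ssaoEffect` stands in for whatever image-effect component you attach to the camera, started disabled in the Inspector):

```csharp
using UnityEngine;

// Minimal repro: Quality-Settings MSAA works until the image effect is
// enabled; under WebGL 1.0 it is lost as soon as the effect kicks in.
public class ImageEffectToggle : MonoBehaviour
{
    // Assign the (disabled) SSAO/Bloom/etc. component here.
    public MonoBehaviour ssaoEffect;

    void Update()
    {
        if (Input.GetKeyDown(KeyCode.Space))
            ssaoEffect.enabled = !ssaoEffect.enabled;
    }
}
```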
This is indeed a bug with WebGL (see https://issuetracker.unity3d.com/is...ng-does-not-work-in-scenes-with-image-effects), but I can't say when we'll have a fix for it.
This requires support for multisampled renderbuffers from the browser. WebGL 1.0 does not support this; WebGL 2.0 will, in Firefox 47: https://bugzilla.mozilla.org/show_bug.cgi?id=1094458 (not sure if/when this will be supported in Chrome).
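Concretely, the missing piece is `renderbufferStorageMultisample`, which only exists on a WebGL 2.0 context. A sketch of what engines need to be able to call (the function name and the injected `gl` parameter are mine; `gl` is a WebGL2RenderingContext in a real page):

```javascript
// Allocate a multisampled renderbuffer - a WebGL 2.0-only operation,
// which is why MSAA into offscreen targets is impossible on WebGL 1.0.
function createMSAARenderbuffer(gl, samples, width, height) {
  const rb = gl.createRenderbuffer();
  gl.bindRenderbuffer(gl.RENDERBUFFER, rb);
  // Clamp the requested sample count to what the driver supports.
  const max = gl.getParameter(gl.MAX_SAMPLES);
  gl.renderbufferStorageMultisample(
      gl.RENDERBUFFER, Math.min(samples, max), gl.RGBA8, width, height);
  return rb;
}
```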
As far as I know, PlayCanvas offers FXAA together with bloom and other post-effects. In Unity, both SMAA by Thomas Hourdel and the AA from Cinematic Effects (alpha) do not work. Why is this possible in PlayCanvas but impossible in Unity?
Oh, that is a different question - you are not talking about multisampled rendering, but about AA done in postprocessing. I will need to look at each of those shaders to be able to answer that. I might get a chance to do that later this week, and will get back to you.
Thanks @dark_end for your tip! I don't understand why, but Unity-made WebGL always starts with anti-aliasing set to off... (I use Unity 5.3.4f1.) I used PlayerPrefs to save a flag the first time the quality setting was set to activate anti-aliasing (in the Start method). Then, when I reload the page, it activates anti-aliasing... @jonas echterhoff, is there any way to set the default quality setting of a Unity-made WebGL build to the highest, so it always starts with anti-aliasing activated? I set that in Edit -> Project Settings -> Quality, and also used QualitySettings.antiAliasing = 8; and QualitySettings.SetQualityLevel(5, true); in the Start() and Awake() methods, but I always have to refresh the page to see anti-aliasing activated. Thanks in advance for any help with this!
The Unity 5.5 docs contain updated information about built-in anti-aliasing support and its limitations:
- AA works as you would expect when using WebGL 2.0.
- On WebGL 1.0, built-in AA works as long as there are no image effects.

But you should also consider the following:
- At this time, WebGL 2.0 is enabled by default in Unity as of 5.5 (it can be manually enabled in 5.4).
- However, your content won't use WebGL 2.0 unless it's supported and enabled by the browser.
- WebGL 2.0 can be enabled in Firefox and Chrome via flags.
- WebGL 2.0 will be enabled by default in Firefox 51 and Chrome 56 (current versions are 50 and 56 respectively, so that will happen soon).
I tried it today with Firefox 53 and Unity 5.5, and I see no change. I enabled WebGL 2.0 in the Player Settings, switched on 8x multisampling in the project's Quality Settings, and have a tonemapping/color-grading script on the camera. Firefox 53 has WebGL 2.0 enabled by default, so it should be okay? Still no AA - the same behaviour as before. Did I miss something?
I have updated my project from 5.5 (or 5.4 - I am not sure) to 5.6.1. AA works in Firefox, as it did before the update, but it no longer works in Edge, where it worked before.
@Marco-Trivellato, can I enable AA partially, for specific RenderTextures or cameras, in WebGL? I need all of my RenderTextures to render without AA (blending artifacts), but the final render to the screen needs to be anti-aliased (via a final RenderTexture or camera).
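In general Unity (leaving aside whether WebGL supports it at all - on WebGL 1.0 it will not, per the posts above), AA is set per RenderTexture via its `antiAliasing` property, so the offscreen passes and the on-screen camera can differ. A minimal sketch of that split; the class name and sizes are mine:

```csharp
using UnityEngine;

// Offscreen passes render without MSAA; the main camera keeps the
// Quality-Settings MSAA level when rendering to the screen.
public class SplitAASetup : MonoBehaviour
{
    public Camera offscreenCamera;

    void Start()
    {
        // antiAliasing = 1 means one sample, i.e. no MSAA, which avoids
        // the blending artifacts in the offscreen pass.
        var rt = new RenderTexture(1024, 1024, 24);
        rt.antiAliasing = 1;
        offscreenCamera.targetTexture = rt;
    }
}
```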
For novices who might be going crazy over this: it is really simple, just edit the index.html file in the generated web template and remove the CSS that puts the canvas in pixelated mode:

image-rendering: optimizeSpeed;
image-rendering: -webkit-crisp-edges;
image-rendering: -moz-crisp-edges;
image-rendering: -o-crisp-edges;
image-rendering: crisp-edges;
image-rendering: -webkit-optimize-contrast;
image-rendering: optimize-contrast;
image-rendering: pixelated;

I wonder why this was added to the default minimal template (and maybe others) in the first place. Maybe as a defensive measure for under-performing WebGL implementations.