Hi everyone,
I have had this issue ever since I started working with Unity about 2 years ago. It has followed me through numerous computers, with different brands of graphics cards as well as driver revisions.
I generally set all my graphics quality settings to “Fantastic”, but for some reason they never take effect in play mode or in my builds. I can see them in the Scene view, but when I switch to the Game view it's back to basics: no AA shows up, and you can tell the lighting, shadows, and shaders aren't the same. The one exception I've found after hours of fiddling is this: if I manually set the graphics emulation to Shader Model 2.0, the Game view in the editor updates and correctly shows the AA level and the other settings that should be much higher than they are. That's unfortunate, because I'd really like to let Unity default to the highest shader model the current hardware supports and then have it pick up those “Fantastic” settings and use them correctly.
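In case it's useful for comparing the editor against a build, something like this rough script (the QualityLogger name is just my own placeholder) should log which quality level, AA setting, and shader level are actually active at runtime when attached to any GameObject:

```csharp
using UnityEngine;

// Throwaway diagnostic: logs the quality settings actually in effect at runtime.
public class QualityLogger : MonoBehaviour
{
    void Start()
    {
        int level = QualitySettings.GetQualityLevel();

        // Name and index of the currently active quality level.
        Debug.Log("Active quality level: " + QualitySettings.names[level]
                  + " (index " + level + ")");

        // Anti-aliasing as configured by that quality level (0 = off).
        Debug.Log("Anti-aliasing: " + QualitySettings.antiAliasing + "x");

        // Shader level the hardware reports, e.g. 20 for SM2.0, 30 for SM3.0.
        Debug.Log("Graphics shader level: " + SystemInfo.graphicsShaderLevel);
    }
}
```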
The card I'm currently seeing this on is an nVidia 570; the other machine has an ATI 4870. Both have up-to-date drivers, not that it matters much, since this has happened across different hardware over the last few years.
Am I really the only one having this problem? What am I missing, some setting somewhere? I generally build for the web player, if that has anything to do with it.
Thank you guys very much for your help. Let me know what you think about this!!