We are creating an app that relies heavily on the deferred rendering lighting system. We are developing on a Windows system, although we will be building for both Mac and PC.
On my Mac Pro running XP with a Quadro card I have no problem in either the editor or the PC build. However, on my Fujitsu Siemens laptop (Amilo?) with an ATI Mobility Radeon X1800 (used at night), also running XP, deferred rendering works great in the editor but not at all in the PC build. Is it possible that the editor emulates something the build can't? I'm probably not understanding something about SM3 and emulation, but I need to get a handle on the specifications required for our application's release.
Can anyone shed any light? (didn't realise the pun, apologies)
It's possible that you just have lower quality settings in the editor than in the build. The X1800 is actually an extremely weak card to use with deferred; I doubt it will work at any quality beyond Good (on the default quality settings).
Thanks dreamora, I hadn't thought of setting it lower. I had actually upgraded Unity's standard Good settings to get what I thought would be slightly better results, so that obviously didn't work when running the build. However, when I dropped to Simple, everything was fine. It might have been the AA I upgraded, which is a known problem.
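For anyone hitting the same thing, here's a minimal sketch of detecting a weak card at startup and dropping the quality level or rendering path accordingly. The class name, the VRAM threshold, and the quality-level index are all illustrative assumptions, not from this thread; check them against your own project's Quality Settings order.

```csharp
using UnityEngine;

// Hypothetical startup helper: falls back on hardware that can't
// handle deferred rendering at higher quality presets.
public class DeferredFallback : MonoBehaviour
{
    void Awake()
    {
        // graphicsShaderLevel reports e.g. 30 for Shader Model 3.0.
        if (SystemInfo.graphicsShaderLevel < 30)
        {
            // No SM3 at all: deferred won't work, use forward rendering.
            Camera.main.renderingPath = RenderingPath.Forward;
        }
        else if (SystemInfo.graphicsMemorySize < 256)
        {
            // Low-VRAM SM3 cards (like the Mobility X1800): drop to a
            // lower preset. The index 2 assumes Simple is third in your
            // project's quality settings list.
            QualitySettings.SetQualityLevel(2, true);
        }
    }
}
```

Attaching this to an object in your first scene means the decision is made once at startup, before the expensive deferred buffers get allocated.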
Many thanks for this and while I’m at it thanks for all your other responses to others over the last few years which have certainly helped me along the way.
Glad that helped.
AA should not have an impact, as it does not work with deferred (normally it would only raise VRAM and fillrate usage).
But chances are that the number of textures it needs to generate is the cause; especially at higher shadow ranges the hit can become pretty beefy.