I've never really understood these checkboxes when building my project, what impact they have on performance, or when I should tick them. I currently have both ticked, but I'd like to know more.
From the manual: "Use 32-bit Display Buffer: Specifies if Display Buffer should be created to hold 32-bit color values (16-bit by default). Use it if you see banding, or need alpha in your ImageEffects, as they will create RTs in same format as Display Buffer."
That doesn't tell me much. Can anyone explain these checkboxes to me, please?
On iOS you usually get a 16-bit colour buffer (typically RGB565: 5 bits each for red and blue, 6 bits for green). Fewer bits per colour channel can introduce banding, in which case you probably want a 32-bit colour buffer, with 8 bits each for R, G, B and A. Since a 32-bit colour buffer is twice the size of a 16-bit one, you're spending more of that limited iOS memory on the colour buffer.
Similarly, the z-buffer is 16 bits by default. If you need more depth precision, use a 24-bit one; that will also use more memory.
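To put rough numbers on that, here's a back-of-the-envelope sketch (the class name is made up, and real drivers double- or triple-buffer and usually pad a 24-bit depth buffer to 32 bits, so actual usage will be higher):

```csharp
using UnityEngine;

// Back-of-the-envelope estimate of framebuffer memory at the current resolution.
// Assumes a single colour buffer and a single depth buffer; real drivers
// double- or triple-buffer and usually pad 24-bit depth to 32 bits (with stencil),
// so actual usage is higher.
public class FramebufferMemoryEstimate : MonoBehaviour
{
    void Start()
    {
        long pixels = (long)Screen.width * Screen.height;

        long colour16 = pixels * 2; // RGB565: 2 bytes per pixel
        long colour32 = pixels * 4; // RGBA8888: 4 bytes per pixel
        long depth16  = pixels * 2; // 16-bit depth
        long depth24  = pixels * 4; // 24-bit depth, typically stored as 24 + 8-bit stencil

        Debug.Log("16-bit colour + 16-bit depth: " + ToMB(colour16 + depth16));
        Debug.Log("32-bit colour + 24-bit depth: " + ToMB(colour32 + depth24));
    }

    static string ToMB(long bytes)
    {
        return (bytes / (1024f * 1024f)).ToString("F1") + " MB";
    }
}
```

At a Retina iPad resolution (2048x1536) that works out to roughly 12 MB vs 24 MB, which is the kind of difference you're trading against banding.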
Considering we are speaking now (summer 2014), I'd say both 32-bit color and 24-bit depth should always be on. The saving doesn't really matter on newer devices, and on older ones you'll have bigger problems than that (performance-wise).
As for visual quality, well, the difference is obvious.
Reminds me of the old TNT vs Voodoo days. TNT ended up being just as fast in 32 bit (eventually). That time is pretty much here for mobile too.
Now, textures are another story. There are a lot of situations where 16-bit textures make a big difference to your RAM, though I'm not sure whether that's always a difference in RAM on the actual device; perhaps Unity can clarify. In any case, they have limited use without better options for dithering.
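To put some rough numbers on the texture side, here's a hypothetical calculation for a single uncompressed 2048x2048 texture (compressed formats like ETC/ASTC/PVRTC change the picture considerably):

```csharp
using UnityEngine;

// Rough comparison of raw memory for one uncompressed 2048x2048 texture
// with a full mipmap chain (~1.33x the base size). Compressed formats
// are far smaller than either of these figures.
public class TextureMemoryEstimate : MonoBehaviour
{
    void Start()
    {
        const float mipFactor = 1.33f;
        long texels = 2048L * 2048L;

        float rgba32Mb = texels * 4 * mipFactor / (1024f * 1024f); // 32-bit RGBA8888
        float rgb565Mb = texels * 2 * mipFactor / (1024f * 1024f); // 16-bit RGB565 / ARGB4444

        Debug.Log("2048x2048, 32-bit: ~" + rgba32Mb.ToString("F1") + " MB"); // ~21 MB
        Debug.Log("2048x2048, 16-bit: ~" + rgb565Mb.ToString("F1") + " MB"); // ~11 MB
    }
}
```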
I remember the "32-bit depth buffer" option was always my first go-to for solving Android issues: sometimes disabling it would solve my problem, sometimes the opposite. So I was wondering if there's anything I need to know about it, especially now that so much time has passed. At the moment, enabling the option solves some issues in my main project.
Also, no, phones don't have enough RAM. My game still gets killed by the activity manager when put in the background, and that's on an 8 GB RAM phone with no other apps in the background, so, at least as a temporary fix, disabling this option could save someone's day.
What was discussed here relates to the display buffer, so it only concerns the screen colour. I don't think there is a similar setting for the depth buffer in the Android Player Settings; there's only "Use 32-bit Display Buffer". I also don't think anything said earlier in this thread needs updating. If the question was about 16-bit textures in general, the answer is yes, they do make a difference, usually with the caveat that memory goes up again when a copy of the texture is needed on the CPU side.
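If it helps, here's a small hypothetical check for that CPU-side copy (Read/Write Enabled roughly doubles a texture's footprint, whatever its bit depth; the class name and Inspector setup are made up for illustration):

```csharp
using UnityEngine;
using UnityEngine.Profiling;

// Checks whether a texture keeps a readable CPU-side copy (Read/Write Enabled),
// which roughly doubles its memory footprint regardless of 16-bit vs 32-bit format.
public class TextureCopyCheck : MonoBehaviour
{
    public Texture2D texture; // assign any texture in the Inspector

    void Start()
    {
        if (texture == null) return;

        long bytes = Profiler.GetRuntimeMemorySizeLong(texture);
        Debug.Log(texture.name + " readable on CPU: " + texture.isReadable +
                  ", approx. memory: " + (bytes / (1024f * 1024f)).ToString("F1") + " MB");
    }
}
```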
Have you profiled to see which parts of your game are using more RAM than you expect? I doubt that reducing just the screen buffer's bit depth would be enough, since it only saves a dozen MB at best.
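For a quick first look before reaching for the full Memory Profiler, a sketch along these lines logs the coarse totals Unity tracks (the class name and key binding are made up, and some counters report 0 outside development builds):

```csharp
using UnityEngine;
using UnityEngine.Profiling;

// Logs a few coarse memory totals so you can see whether the problem is
// managed allocations, native allocations or graphics memory before
// blaming the display buffer. A rough diagnostic sketch, not a full profile.
public class MemorySnapshotLogger : MonoBehaviour
{
    void Update()
    {
        if (Input.GetKeyDown(KeyCode.M))
        {
            Debug.Log("Reserved (native):  " + ToMB(Profiler.GetTotalReservedMemoryLong()));
            Debug.Log("Allocated (native): " + ToMB(Profiler.GetTotalAllocatedMemoryLong()));
            Debug.Log("Mono heap used:     " + ToMB(Profiler.GetMonoUsedSizeLong()));
            // May report 0 on some platforms / non-development builds.
            Debug.Log("Graphics driver:    " + ToMB(Profiler.GetAllocatedMemoryForGraphicsDriver()));
        }
    }

    static string ToMB(long bytes)
    {
        return (bytes / (1024f * 1024f)).ToString("F1") + " MB";
    }
}
```

If the big numbers turn out to be textures, meshes or the managed heap, that points to where the real savings are rather than the screen buffer.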