Unite Now Session — Feel free to post your questions before, during, and after the session.
Unity evangelist @hdgam3r shares real-world best practices for optimizing game performance. He’ll provide useful tips on Graphics, Audio, and UI, so you can get the most out of Unity and your game.
A team of product experts from across Unity will be available during and after the session. Our Community team will continue to field questions to foster an ongoing discussion with the community. Please feel free to ask your questions in this thread.
Some basic rules
Only questions related to the topics of the session are permitted.
All questions will be fielded by our Community Managers (@LeonhardP and @AskCarol).
Replies will have to be approved by the moderators to show up in the thread.
Once approved, the questions will be forwarded to the relevant experts.
We really look forward to hearing from the community. Thank you!
I’ve experimented with this a while ago and I found some issues that I wasn’t able to solve. Perhaps you’ve solved them.
Question 1
How can I render a native-resolution UI on top of a lower-res 3D scene? Just using Screen.SetResolution affected both the 3D scene and the UI in my tests.
I then started to come up with a custom render setup. I kept the screen at native resolution, but rendered the 3D camera to a lower-res RenderTexture. I then stretch-blit the lower-res RenderTexture and draw the UI afterwards.
This did work, but (1) it didn’t make anything faster and (2) it caused all sorts of physics screen-to-world conversion issues.
Question 2
I noticed that using Screen.SetResolution to lower the resolution actually caused performance problems on some Android devices. Or let’s say it made things slower, not faster.
It’s been a while and I can’t remember why. My guess was that Unity just renders to a lower-res RenderTexture and then stretch-blits it to the native-res framebuffer, like the setup I described above. Is that the case? Have you seen worse performance on some devices too?
Use the built-in render pipeline. Render the “main” camera to a RenderTexture with a resolution of your choosing, then render the UI camera on top normally, without any post-processing. This is still viable, even though it is becoming slower to do in later versions of Unity (maybe to prove their point that stacking cameras is slow?).
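A minimal sketch of that two-camera setup, assuming the built-in render pipeline. The component name, camera wiring, and the 0.7 scale factor are illustrative, not from the post:

```csharp
using UnityEngine;

// Sketch: render the 3D scene into a low-res RenderTexture, stretch-blit it
// to the native-res backbuffer, then let a separate UI camera draw on top.
// Attach to the UI camera (set its depth higher than the main camera's).
[RequireComponent(typeof(Camera))]
public class LowResSceneFullResUI : MonoBehaviour
{
    public Camera mainCamera;                        // renders the 3D scene
    [Range(0.25f, 1f)] public float renderScale = 0.7f;

    RenderTexture sceneRT;

    void OnEnable()
    {
        int w = Mathf.RoundToInt(Screen.width * renderScale);
        int h = Mathf.RoundToInt(Screen.height * renderScale);
        sceneRT = new RenderTexture(w, h, 24);
        mainCamera.targetTexture = sceneRT;          // 3D renders at reduced res

        var uiCam = GetComponent<Camera>();
        uiCam.clearFlags = CameraClearFlags.Nothing; // keep the blitted image
    }

    // Runs on the UI camera just before it renders: stretch-blit the low-res
    // scene to the screen, so the UI is then drawn over it at native res.
    void OnPreRender()
    {
        Graphics.Blit(sceneRT, (RenderTexture)null);
    }

    void OnDisable()
    {
        mainCamera.targetTexture = null;
        if (sceneRT != null) { sceneRT.Release(); Destroy(sceneRT); }
    }
}
```

Note that once the main camera targets a RenderTexture, calls like Camera.ScreenPointToRay operate in the texture’s pixel space rather than native screen space, which is one likely source of the screen-to-world issues mentioned in the question.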
With that said, if you are mindful of your fillrate (which generally means avoiding any Unity-provided shaders and post effects), you can render everything at native resolution just fine, even with AA on. Just turn off as many Unity features as you can.
Then you were not fillrate bound. On fillrate-bound devices (like, say, the Samsung Galaxy S6), limiting the resolution for the main camera was a clear win for us. If you are not seeing performance gains, you were not fillrate bound, which means lowering the resolution does little for you (and is why “don’t use native resolution” is not very useful as general advice).
On ultra-low-end devices (like the granny Mali 400), blitting off-screen can have a significant overhead, which can outweigh the cost of rendering at native res. But those GPUs are usually far too slow to run any effect that would benefit from a lower resolution anyway.
The advice is valid for more modern, but not premium, GPUs, especially if your game is in the business of using per-pixel lighting, shadows, and post effects.
Mobile games that target “console-like” graphics simply cannot do native rendering even on the very best mobile GPUs, because those are often paired with ludicrously high-resolution screens.
Using dynamic resolution on your camera and setting your UI canvases to overlay did the trick for us: 3D rendering resolution can be changed at any time and UI is always at full res. Works like a charm on consoles too.
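A minimal sketch of that approach, assuming a platform where Unity’s dynamic resolution is supported (consoles, or Vulkan/Metal on mobile); the 0.75 scale value is an arbitrary example:

```csharp
using UnityEngine;

// Sketch: scale only the 3D render targets via dynamic resolution.
// Canvases set to Screen Space - Overlay are composited after the scaled
// buffers, so the UI always stays at native resolution.
public class DynamicResScaler : MonoBehaviour
{
    [Range(0.5f, 1f)] public float scale = 0.75f; // illustrative value

    void Start()
    {
        // Cameras must have "Allow Dynamic Resolution" enabled.
        foreach (var cam in Camera.allCameras)
            cam.allowDynamicResolution = true;

        // Resizes all dynamically-scaled render targets; can also be
        // called every frame to adapt the resolution to GPU load.
        ScalableBufferManager.ResizeBuffers(scale, scale);
    }
}
```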
Why are you recommending Low Quality texture compression? My understanding (which may be wrong) is that the textures will be the same disk/file size, but that the texture importer will use different settings depending on the quality setting. Certain quality settings may select a texture compression format that is unsupported on the target (e.g. BC7 vs. DXT5), but I am not aware of why the compression quality would matter for any optimization other than editor import times, as long as the target format is supported on the target device.
Disk space is saved if you use a more aggressive setting (Low Quality), and on mobile devices (sorry for the generalization) runtime performance can also benefit from storing fewer bits per pixel. But as you mention, it’s not a silver bullet, because it depends on formats, platforms, etc.
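To make the bits-per-pixel point concrete, here is a hedged editor-side sketch that forces a more aggressive format on one platform. The choice of ASTC 8×8 is illustrative, and the exact enum names can vary between Unity versions:

```csharp
using UnityEditor;

// Sketch: override Android texture compression during import.
// ASTC blocks are 128 bits regardless of size, so ASTC 8x8 stores
// 2 bits/pixel vs. 8 bits/pixel for ASTC 4x4 -- less memory bandwidth
// at the cost of image quality.
public class AndroidTexturePostprocessor : AssetPostprocessor
{
    void OnPreprocessTexture()
    {
        var importer = (TextureImporter)assetImporter;
        var settings = importer.GetPlatformTextureSettings("Android");
        settings.overridden = true;
        settings.format = TextureImporterFormat.ASTC_8x8;
        importer.SetPlatformTextureSettings(settings);
    }
}
```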