4x memory usage in Xcode GPU Frame Capture

I’m looking at the frame capture tool right now. I went to look at these heaps here and noticed they each store one texture, but their allocated size is 4x the contained texture size. For example, a heap will be allocated for 40 MB, but only contain a 10 MB texture. Or it will be allocated to 32 MB, but only contain an 8 MB texture. You can see this in the screenshot attached.

Any ideas why this could be?

We are trying to maximize memory use right now and this looks suspicious.

After further testing, this appears to be due to using dynamic resolution. It will allocate the memory required for 100% screen scaling, then the actual texture gets dynamically allocated to be whatever % you set the dynamic scaling at.
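If it helps to sanity-check the numbers: dynamic resolution scales both width and height, so memory goes with the *square* of the scale factor. A quick sketch (the 2048x1024 target below is hypothetical, not from the capture) reproduces the 4x ratio you'd see at 50% scale, matching the 40 MB heap / 10 MB texture example:

```python
# Sketch: why a heap sized for 100% scale looks "4x too big" at a lower scale.
# Dynamic resolution scales width AND height, so memory use scales with scale^2.

def texture_bytes(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel

full_w, full_h = 2048, 1024   # hypothetical 100%-scale render target
scale = 0.5                   # dynamic resolution at 50%

full = texture_bytes(full_w, full_h)
scaled = texture_bytes(int(full_w * scale), int(full_h * scale))

print(full // scaled)   # 4 -> the heap sized for 100% holds a texture 1/4 its size
print(scale ** 2)       # 0.25 -> only a quarter of the allocation is actually used
```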

Is there a way I can force the memory to be lowered if I know the dynamic resolution will only ever be 75% or whatever?

Found a bug in the Unity memory profiler. Look at the sizes of the textures in the Unity profiler compared to the sizes Xcode reports. The sizes Unity reports are always smaller than the ones shown in Xcode, so Unity will report a lower memory usage than is correct.

I have one example selected in the screenshot, but if you study it, you will see a few examples of the texture sizes not lining up in those two screenshots.

@aleksandrk

What’s the texture size and format?

Lots of varying sizes, so I'm not sure what you mean. Also, those formats should for the most part be PVRTC or ASTC.

I mean in that screenshot there are varying formats and sizes. They are all POT though.

I meant the texture that’s highlighted :slight_smile:

Ah hmm this one is actually a bit odd. It is 300x300, so not POT. I forgot that our UI textures tend to not be POT. Also PVRTC. But this is supposed to be on an atlas, so odd to me that this is using gpu memory separate from the atlas. Also we are using half res in the quality settings for textures.
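For what it's worth, the NPOT size may matter more for PVRTC than for ASTC: PVRTC on iOS requires square power-of-two textures, so a 300x300 texture typically gets padded or scaled up on import, while ASTC has no such restriction. A rough size sketch (the 4 bpp PVRTC rate and 6x6 ASTC block size below are assumptions for illustration):

```python
import math

def pvrtc_bytes(w, h, bpp=4):
    # PVRTC (iOS) requires square power-of-two textures; assume the importer
    # pads/scales up to the next power of two covering both dimensions.
    side = 2 ** math.ceil(math.log2(max(w, h)))
    return side * side * bpp // 8

def astc_bytes(w, h, block=6):
    # ASTC stores 16 bytes per block and supports non-power-of-two sizes.
    return math.ceil(w / block) * math.ceil(h / block) * 16

print(pvrtc_bytes(300, 300))  # 131072 -> padded to 512x512 at 4 bpp
print(astc_bytes(300, 300))   # 40000  -> 50x50 blocks of 16 bytes each
```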

Here is a screenshot:

It looks like it’s showing you the original texture memory in the profiler, which is around 350 KB for an RGBA 300x300 texture.
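That figure checks out: at 4 bytes per pixel, a 300x300 RGBA32 texture works out to about 351.6 KB.

```python
w, h, bpp = 300, 300, 4        # RGBA32: 4 bytes per pixel
base = w * h * bpp
print(base, base / 1024)       # 360000 bytes ~= 351.6 KB -> the "around 350 KB"

# With a full mip chain, GPU memory is roughly 4/3 of the base level:
print(round(base * 4 / 3))     # 480000
```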

Yeah, good point. I noticed that after I posted the image, but shouldn’t this show the size in memory? Or else what is the point of using it to profile runtime memory?

I appreciate your help so far btw, thanks :slight_smile:


I guess it’s difficult to show per-texture memory usage when a texture is in an atlas :slight_smile:
There might be a way to highlight that it’s actually part of an atlas, but that’s a question to the people working on the profiler :slight_smile:

What does the profiler show as total texture memory used? Is it different from what Xcode reports?

Yeah, I could understand that if it didn’t also list the atlas. But both Xcode and Unity list both the atlas’s texture memory and the individual textures that should be inside it.

The Xcode profiler in-game shows 1.02 GB total, and texture memory in Xcode is 155.1 MB; however, this also includes textures Unity doesn’t include, such as some temporary render textures. Unity ends up showing 0.79 GB total memory, with 190.6 MB for texture memory (from the built-in Unity profiler).

So it is all kinds of weirdness. This is the part that makes it really hard for us to debug Unity memory usage on iOS.
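Laying the two reports side by side makes the weirdness concrete: the totals and the texture numbers disagree in *opposite* directions (Unity's total is lower than Xcode's, but its texture figure is higher, even though Xcode's texture figure includes extra render textures).

```python
# Figures from the thread: Xcode vs Unity built-in profiler.
xcode_total_gb, unity_total_gb = 1.02, 0.79
xcode_tex_mb, unity_tex_mb = 155.1, 190.6

# Unity's overall total is ~236 MB lower than Xcode's...
print(round((xcode_total_gb - unity_total_gb) * 1024))  # 236

# ...yet Unity's texture memory figure is 35.5 MB *higher* than Xcode's.
print(round(unity_tex_mb - xcode_tex_mb, 1))            # 35.5
```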

(Two screenshots attached: Unity profiler and Xcode capture.)

Are those textures marked as readable by chance (the Read/Write Enabled checkbox)?

The UI textures are not marked Read/Write. The model textures are marked to use Streaming Mipmaps, but are also not marked as Read/Write.

Oh wait, I take it back: those UI textures actually weren’t in an atlas. Now that I put them in an atlas, they no longer show up separately. The memory discrepancy between textures remains, though…

OK, good, one part figured :slight_smile:

Regarding the discrepancy, how large is the remaining difference in the stats?

Memory discrepancy should be about the same as shown in the most recent comment with screenshots.

The Memory Profiler backend relies on native UnityEngine.Objects to report their memory size through Profiler.GetRuntimeMemorySizeLong(), which needs to be implemented per object type and then needs to report the correct size. In some cases this may do a slightly wrong calculation of the size. Whenever such a discrepancy comes up (like here, where a platform-specific profiler reports a different size than Unity’s Memory Manager), please report it as a bug, with the project attached (or at least a stripped/empty project that loads the offending asset, with the right import settings, in a scene, so this can be reproduced).

Yeah, I can definitely do that, but I find it hard to believe this is only from textures. There is a few-hundred-MB discrepancy in some cases, so it leads me to believe this is a much bigger problem than some textures being off by 50 KB or so.

I’ve noticed other things too, such as some textures for post-processing not showing up in the Unity profiler while they do show up in Xcode. Unity seems to catch most of them, but there was at least one render texture in particular I couldn’t find a reference to in Unity’s profiler, whereas I could in Xcode.

It’s extremely frustrating, as I’m sure you can imagine, because I am forced to look at Xcode Instruments to get an idea of where allocations are happening, and then, to understand that, I have to have some understanding of the Unity source code, or at least guess from the C++ method names (which is what I’ve been trying to do as of late).