Performance - Unpack Normals vs. Texture Swaps?

Before sinking any time into benchmarking this myself, I thought maybe someone here already knows the answer. Scenario: hundreds of objects in the scene with only 2 textures total in use - one for albedo, one for normals - with each object sampling whatever parts of the textures it needs based on geometry UVs and/or shader offset/tiling.

As there is an expense to each texture swap needed for the render (at least on mobile / low-end devices, if not everywhere), we generally use a sprite sheet / atlas to eliminate that hit wherever possible, aside from whatever other benefits. However, there is a problem in the above scenario. Since these objects sample from both textures, for, say, 100 objects there could be up to 200 texture swaps per frame!

We can’t [generally] have both albedo textures and normals in the same sprite sheet. Normal textures are imported with a different flag, and in the shader we specify the “Normal” type for that texture input - and there may be other differences I’ve read about but haven’t confirmed.

So given a worst-case scenario of 200 swaps for 100 objects, wouldn’t that have a significant performance hit? What if we use an Unpack Normals node in Shader Graph to allow normals to come from the albedo sheet, or even just multiply the color by 2 and subtract 1 to get something pretty similar (i.e. remapping the color range from [0, 1] to [-1, 1]), thereby allowing all of the albedo and normal textures to be on the same sprite sheet? Then we’re down to absolutely ZERO texture swaps each frame for all those objects, not counting when it first swaps to the atlas.

But what kind of a hit will Unpack Normals (or a lower-quality alternative) have, compared with the texture swapping cost?

Does anyone know if this is worth looking into further, or it’s pointless, or ?

Color textures are in sRGB color space and normal maps are in linear, so you’d have to also include the correction.
