In Unity 2017, how would one determine performance for different material/mesh/texture set-ups? Is there a built-in tool for this? Is it a Pro-only feature?
Which of the following would make the most efficient use of CPU/GPU capabilities?
1. Assume a mesh object has 100 verts. Display 100 mesh objects, each with a unique material and a unique texture. That’s 100 materials and 100 textures over 10,000 verts.
2. Assume a mesh object has 100 verts. Display 100 mesh objects, each with 5 materials and 5 textures, but many are shared. That’s 20 unique materials and 20 unique textures over 10,000 verts.
3. Assume a mesh object has 20 verts. Display 500 mesh objects, each with 1 material and 1 texture, but many are shared. That’s 20 unique materials and 20 unique textures over 10,000 verts.
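For reference, a minimal sketch of how one of these set-ups could be spawned for comparison (`DrawCallTest`, `prefab`, and `uniqueMaterials` are placeholder names of my own, not Unity API):

```csharp
using UnityEngine;

// Hypothetical test harness: spawns 'count' copies of a prefab in a grid.
// Toggle 'uniqueMaterials' to compare shared materials (options 2/3)
// against one unique material per object (option 1).
public class DrawCallTest : MonoBehaviour
{
    public GameObject prefab;    // e.g. a simple 100-vert mesh with one Renderer
    public int count = 100;
    public bool uniqueMaterials; // true approximates option 1

    void Start()
    {
        for (int i = 0; i < count; i++)
        {
            Vector3 pos = new Vector3(i % 10, 0f, i / 10);
            GameObject go = Instantiate(prefab, pos, Quaternion.identity);

            if (uniqueMaterials)
            {
                // Cloning the material makes every instance unique,
                // which prevents batching and raises the draw-call count.
                Renderer rend = go.GetComponent<Renderer>();
                rend.material = new Material(rend.sharedMaterial);
            }
        }
    }
}
```

Running the scene once with `uniqueMaterials` off and once with it on, and comparing the batch count, would show the difference between the scenarios.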
Performance depends on the number of vertices, the number of GameObjects those vertices are distributed over, and the number of draw calls.
To render a GameObject, the engine has to issue a draw call to the graphics API, and as the number of draw calls increases, performance decreases.
Every unique material requires its own draw call as well, so using many different materials means more draw calls. The same goes for textures: rendering with a single texture file is far more efficient than with many different ones, because each texture change breaks batching and effectively adds draw calls.
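A related gotcha is how materials are assigned from script. Here is a small sketch (my own example, not from the question) of the difference between sharing a material asset and accidentally creating per-instance copies:

```csharp
using UnityEngine;

public class MaterialSharingExample : MonoBehaviour
{
    // Assign one Material asset here and reuse it across many objects.
    [SerializeField] Material sharedMat;

    void Start()
    {
        Renderer rend = GetComponent<Renderer>();

        // Good: every renderer pointing at the same Material asset
        // can be batched together, keeping draw calls low.
        rend.sharedMaterial = sharedMat;

        // Bad: reading renderer.material silently clones the material
        // for this renderer, making it unique and breaking batching.
        // rend.material.color = Color.red;
    }
}
```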
Of these, option 2 is the most efficient: it has as few unique materials and textures as option 3 (20 each), but distributes them over only 100 GameObjects instead of 500.
You can test this yourself using the Stats overlay in the Game view, which shows the FPS (not always accurate) and the number of draw calls (batches). If you want even more control, use the Profiler, which is more accurate; since Unity 5 the Profiler ships with the free Personal edition, so in Unity 2017 it is not a Pro-only feature.
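If you want your own code to show up as a named block in the Profiler's CPU view, you can wrap it in profiler samples. A minimal sketch (`SpinExample` is a placeholder name):

```csharp
using UnityEngine;
using UnityEngine.Profiling;

public class SpinExample : MonoBehaviour
{
    void Update()
    {
        // Everything between BeginSample and EndSample appears
        // as a "Spin" entry in the Profiler's CPU timeline.
        Profiler.BeginSample("Spin");
        transform.Rotate(0f, 90f * Time.deltaTime, 0f);
        Profiler.EndSample();
    }
}
```

These sample calls are compiled away in non-development builds, so they are safe to leave in shipping code.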