How to interpret internal profiler data?

I am trying to track down performance issues in our game, and from crawling some forum threads I saw that there is an internal profiler you can enable.

But I have a hard time finding any information on how to read the numbers.

Anyone care to explain, so others might be able to use this too?

The numbers I get:

iPhone Unity internal profiler stats:
cpu-player>    min: 12.5   max: 17.0   avg: 15.3
cpu-ogles-drv> min:  1.4   max:  4.1   avg:  2.1
cpu-present>   min:  0.9   max:  2.9   avg:  1.1
frametime>     min: 30.5   max: 36.2   avg: 33.3
draw-call #>   min: 11   max: 11   avg: 11
tris #>        min: 278   max: 278   avg: 278
verts #>       min: 520   max: 520   avg: 520
player-detail> physx:  1.1 animation:  0.0 skinning:  0.0 render: 12.8 fixed-update-count: 1 .. 2
mono-scripts>  update:  0.1   fixedUpdate:  0.9 coroutines:  0.0 
mono-memory>   used heap: 2072576 allocated heap: 2740224
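For reference, a quick way to sanity-check numbers like these (assuming the time columns are per-frame milliseconds and the mono-memory figures are bytes, which matches how the internal profiler reports them):

```python
# Interpreting the internal profiler dump above.
# Assumptions: cpu-*/frametime columns are milliseconds per frame,
# mono-memory figures are bytes.

avg_frametime_ms = 33.3
fps = 1000.0 / avg_frametime_ms
print(f"~{fps:.0f} fps")  # an average frametime of 33.3 ms is roughly 30 fps

used_heap = 2072576
allocated_heap = 2740224
print(f"Mono used heap: {used_heap / 1024 / 1024:.1f} MB")            # ~2.0 MB
print(f"Mono allocated heap: {allocated_heap / 1024 / 1024:.1f} MB")  # ~2.6 MB
```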

Thanks a bunch!

/Thomas

Have you seen this page: file:///Applications/Unity%20iPhone/Documentation/Components/Optimizing%20Performance%20on%20the%20iPhone.html

It tells you what all the numbers mean.

Damn - missed that one. Sorry.

Anyways - now everyone searching on the forums will find it too. Thanks!

Wow, how did I not know this was in our documentation this whole time? Thanks for pointing it out.

Very valuable thread, thanks guys. I missed this in the docs and have been pulling my hair out trying to figure out how to correctly debug some performance problems.

The internal profiler documentation does not mention what “batched” means…

Is it the number of drawcalls after batching or the number of drawcalls that are batched?

It means the total number of objects being batched. For example, fire up a new empty scene and add 4 cubes with material A: you should see 1 drawcall, 4 batched. Add 4 more cubes with material B and you'll get 2 drawcalls, 8 batched.
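The cube example above can be sketched as a little conceptual model (not Unity API; it just assumes every group of objects sharing one material collapses into one draw call):

```python
# Conceptual model of material-based batching as described above:
# one draw call per distinct material, "batched" = total objects batched.
from collections import Counter

def profile(materials):
    """materials: one material name per rendered object."""
    by_material = Counter(materials)
    draw_calls = len(by_material)        # one draw call per material group
    batched = sum(by_material.values())  # total objects that went through batching
    return draw_calls, batched

print(profile(["A"] * 4))              # 4 cubes, material A -> (1, 4)
print(profile(["A"] * 4 + ["B"] * 4))  # plus 4 with material B -> (2, 8)
```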