All I want to know is how long it took the GPU to render a given frame. That's it. I know Unity has an API for this; I've seen the property right there in the class definition: UnityEditor.Profiling.HierarchyFrameDataView.frameGpuTimeMs
What I don't know is: how do I get my hands on an instance of HierarchyFrameDataView that contains this one piece of information? The only thing I need is a float (or double) telling me how long the GPU took to render the previous frame, from within my own code. Not the UI. Not the full-blown Profiler tool. Just my own code.
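For reference, the closest sketch I've pieced together from the docs goes through ProfilerDriver (which lives in UnityEditorInternal, so it's editor-only, which is part of my problem). I haven't verified the thread index or whether the profiler needs to be actively recording, so treat this as a guess, not working code:

```csharp
using UnityEditor.Profiling;
using UnityEditorInternal; // ProfilerDriver lives here (editor-only)

public static class GpuFrameTimeProbe
{
    // Sketch: returns the GPU time of the most recent profiled frame,
    // or -1 if no frame data is available. Assumes the Profiler is recording.
    public static float GetLastFrameGpuTimeMs()
    {
        int frame = ProfilerDriver.lastFrameIndex;
        if (frame < 0)
            return -1f;

        using (HierarchyFrameDataView view = ProfilerDriver.GetHierarchyFrameDataView(
                   frame,
                   0, // thread index; 0 is presumably the main thread
                   HierarchyFrameDataView.ViewModes.Default,
                   HierarchyFrameDataView.columnDontSort, // sorting is irrelevant here
                   false))
        {
            return view.frameGpuTimeMs;
        }
    }
}
```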
I would rather use FrameTimingManager, but that piece of s**t doesn't work on Windows platforms.
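For completeness, this is roughly what I tried with FrameTimingManager (with "Enable Frame Timing Stats" turned on in Player Settings, which I believe it requires). On Windows it just gives me back zero captured frames or all-zero timings:

```csharp
using UnityEngine;

public class GpuTimingBehaviour : MonoBehaviour
{
    readonly FrameTiming[] timings = new FrameTiming[1];

    void Update()
    {
        // Ask Unity to capture timing data for recent frames...
        FrameTimingManager.CaptureFrameTimings();

        // ...then read back the latest one. On my Windows builds this
        // either returns 0 frames or a gpuFrameTime of 0.
        uint count = FrameTimingManager.GetLatestTimings(1, timings);
        if (count > 0)
            Debug.Log($"GPU frame time: {timings[0].gpuFrameTime} ms");
    }
}
```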