Unity FPS / frame counters all lie!!! (at super low values)

NOTE: Every single basic Unity FPS counter is inaccurate at low FPS (with default Time settings)

[[[[ UPDATE: This was news to me after 3 years of Unity development: Update() etc. DO NOT run once per frame draw. By default they run on frame draws OR, much like FixedUpdate, 3x per second even if the actual FPS = 1. This surprised me and invalidates the low-end accuracy of FPS counters. ]]]]


I am simply trying to display an ACCURATE frames-per-second value to my players. Normally our project runs at 60 fps on almost everything; I AM NOT asking how to speed it up.
But some of our users are desperate and willing to play on old hardware. As such, I need to make it clear whether they are getting 1 fps or 3 fps, as there is a huge difference, and this info helps them select better settings, etc., to hopefully get into a usable 10-15 fps range.

PROBLEM: Unity has a minimum timestep! The default works out to 3 fps: Update() etc. are all called 3 times per second even if the screen doesn't actually update. I would prefer to LEAVE the minimum timestep in place, as it prevents many tricky errors when deltaTime > 1.
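(As far as I can tell, the setting in question is "Maximum Allowed Timestep" under Edit > Project Settings > Time, exposed in script as Time.maximumDeltaTime; its default of 0.3333 s is where the 3 fps floor comes from. A quick sketch to log it in your own project:)

    using UnityEngine;

    // Sanity check: log the "Maximum Allowed Timestep" value and the
    // logical-frame floor it implies (default 0.3333s -> ~3 per second).
    public class TimestepProbe : MonoBehaviour
    {
        void Start()
        {
            Debug.Log("maximumDeltaTime: " + Time.maximumDeltaTime + "s -> floor of ~"
                      + (1f / Time.maximumDeltaTime) + " logical updates per second");
        }
    }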

Unity’s STATS display correctly shows 1 fps if I set the target framerate (Application.targetFrameRate) to 1.
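In case anyone wants to reproduce the test, I am forcing the framerate down with something like the following sketch (vSync has to be off, or the target framerate is ignored):

    using UnityEngine;

    // Test harness: force the whole game down to ~1 rendered frame per second.
    public class ForceLowFps : MonoBehaviour
    {
        void Awake()
        {
            QualitySettings.vSyncCount = 0;   // targetFrameRate is ignored while vSync is on
            Application.targetFrameRate = 1;  // ask Unity to render at most once per second
        }
    }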

HOWEVER: the following ALL still run 3x per second even though the screen CLEARLY only renders once per second:

Update()

OnPreRender()

OnPostRender()

WaitForEndOfFrame()

Time.frameCount

ALL of these sources show 3 fps rather than the actual 1 fps!!
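For reference, here is a minimal sketch of the kind of counter I mean (the class name and the UI Text hookup are just for illustration). It counts Update() calls against real time, and with default Time settings it reads 3 fps even when the screen is clearly drawing once per second:

    using UnityEngine;
    using UnityEngine.UI;

    // Naive FPS counter: counts Update() calls per real-time second.
    // With the default maximum timestep this reports ~3 fps even when
    // the screen only actually renders once per second.
    public class NaiveFpsCounter : MonoBehaviour
    {
        public Text label;   // assumed UI hookup, purely for display
        int calls;
        float windowStart;

        void Start()
        {
            windowStart = Time.realtimeSinceStartup;
        }

        void Update()
        {
            calls++;
            float now = Time.realtimeSinceStartup;
            if (now - windowStart >= 1f)
            {
                if (label != null)
                    label.text = (calls / (now - windowStart)).ToString("0.##") + " fps";
                calls = 0;
                windowStart = now;
            }
        }
    }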

Does anyone have a proper source for detecting when an ACTUAL frame is rendered?

Right now, my users are tweaking settings and seeing no benefit, because the counter will say their FPS is 3 whether it is actually 3 or (gasp) 0.25 (which I want to display properly).
-Joe

The original Atari in 1977 ran at 60 fps, with a processor thousands of times slower than the average processor today.

If you have an Update function that is running three times per second, you have something very, very wrong with your code that needs to be fixed.

Anyway, let me explain some things.

Code within the Update function's brackets simply runs as fast as the user's machine allows.
A fast machine calls it more often and a slow machine calls it less often. Time.deltaTime measures how much time the current frame is taking, so if you want things to happen once per second (or at any specific rate) you can use this number, or other counter techniques, to limit things on a time basis so every device processes the code evenly. There are also timers in the C# language that will trigger code at even time intervals.
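For example, something like this (just a sketch; the class and field names are mine) runs a chunk of logic once per second no matter how often Update() is called:

    using UnityEngine;

    // Accumulate deltaTime and fire once every second, regardless of
    // how often Update() itself gets called on this machine.
    public class OncePerSecond : MonoBehaviour
    {
        float accumulated;

        void Update()
        {
            accumulated += Time.deltaTime;
            if (accumulated >= 1f)
            {
                accumulated -= 1f;
                // do your once-per-second stuff here
            }
        }
    }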

In game development, I agree that it is often good practice to make games run on slow machines and keep things “fair” for all. But I don't think the setting you are talking about is what you want, unless you want super choppy, jittery movement for everyone.

If you could give more detail about what you are trying to do, or where your code is failing on slower machines, we can help.

If you really, really, really want to slow down your frames per second in Update for some reason, this should mess things up for you:

	void Update () {
		// Time.time is frozen for the duration of a frame, so it cannot be
		// used for a busy-wait; Time.realtimeSinceStartup keeps advancing.
		float tt = Time.realtimeSinceStartup;

		while (tt + 1.0f > Time.realtimeSinceStartup) {
			// I'm freezing your entire game until one second passes
		}

		// do stuff here
	}
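And if what you actually want is code that runs once per second without stalling the frame, a coroutine is the sane way to do it. A minimal sketch:

    using UnityEngine;
    using System.Collections;

    // Less destructive alternative: tick once per second
    // without ever blocking the main thread.
    public class TickEverySecond : MonoBehaviour
    {
        IEnumerator Start()
        {
            while (true)
            {
                // do stuff here
                yield return new WaitForSeconds(1f);
            }
        }
    }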