Windows UPS (updates per second) always capped to 65 (Windows 7 or XP, NVIDIA or ATI)

Hi All!
Unity 3.4, tested on Windows 7 64-bit and Windows XP, with both NVIDIA and ATI cards.
I made a simple FPS (UPS) counter and noticed that on Windows (VBlank sync disabled) Unity always runs at around 65 fps. The same project tested on a Mac gives 200-300 fps.
Why?

This is the script:

using UnityEngine;
using System.Collections;

public class FPScounter : MonoBehaviour
{
    private float elapsed;   // time accumulated since the last report
    private int counter;     // frames counted since the last report

    void Start ()
    {
        counter = 0;
    }

    void LateUpdate ()
    {
        elapsed += Time.deltaTime;
        counter++;
        if (elapsed >= 1.0f)
        {
            // Report the number of frames counted over the last second.
            Debug.Log(counter.ToString());
            elapsed -= 1.0f;   // keep the fractional remainder for accuracy
            counter = 0;
        }
    }
}
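
If you prefer an on-screen readout instead of console spam, here is a minimal variation of the same idea (my own sketch, not part of the original script) that averages over the last second and draws the value with OnGUI:

using UnityEngine;

// Hypothetical variant of the counter above: shows a smoothed FPS value on screen.
public class FPSDisplay : MonoBehaviour
{
    private float elapsed;
    private int frames;
    private float fps;

    void Update ()
    {
        elapsed += Time.deltaTime;
        frames++;
        if (elapsed >= 1.0f)
        {
            fps = frames / elapsed;   // average frame rate over the last second
            elapsed = 0f;
            frames = 0;
        }
    }

    void OnGUI ()
    {
        GUI.Label(new Rect(10, 10, 150, 25), fps.ToString("F1") + " fps");
    }
}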

That’s a good point! I ran this on my PC (XP, Intel Core Duo 1.8 GHz) and the result was the same: 64 to 65 fps. But when I clicked the Stats button over the Game view, Unity displayed about 2.0 ms / 500 fps!

I then switched the sync mode to Every VBlank: your script reported 61 fps, Unity showed about 14 ms / 70 fps.
Changing the sync mode to Every Second VBlank: your script reported 31 fps, Unity showed about 30 ms / 33 fps.

It seems that Unity measures only the time taken by the scene rendering process and displays it in the Stats panel; the fps it shows is simply 1 / rendering time, just to give us an idea of the frame rate. So the fps you counted does not mean the Windows version of Unity is slower (it actually seems faster in this case: 500 fps on my slow machine versus 300 fps on your Mac).
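
As a sanity check on that reading (my own back-of-the-envelope numbers, not taken from the Stats source): the fps shown is just the reciprocal of the render time, so 2.0 ms gives 1000 / 2.0 = 500 fps, 14 ms gives roughly 71 fps, and 30 ms gives roughly 33 fps, which matches the panel. A script, by contrast, can only see the full frame time via Time.deltaTime, which includes any wait for VBlank or an internal cap:

using UnityEngine;

// Sketch: logs the full frame time (what a script can measure) in both ms and fps.
// This includes any wait for VBlank or an internal cap, unlike the Stats panel,
// which reports only the rendering time.
public class FrameTimeLogger : MonoBehaviour
{
    void Update ()
    {
        if (Time.deltaTime <= 0f)
            return;   // skip the very first frame if deltaTime is not valid yet

        float ms = Time.deltaTime * 1000f;
        float fps = 1f / Time.deltaTime;
        Debug.Log(ms.ToString("F1") + " ms / " + fps.ToString("F0") + " fps");
    }
}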

I suppose the mismatch between the PC and Mac versions of Unity is due to hardware differences: Macs are uniform and predictable, while PCs look more like a Frankenstein army, each one an unpredictable combination of parts from different brands and models. Maybe letting the frame rate run free on PCs caused headaches for the Unity team, so they decided to cap it in the Windows implementation.

You are right. The FPS shown in Stats relates only to the rendering process; it’s not a real fps indicator. Unity measures the time taken by all your renderings and calculates a theoretical framerate based only on that.
I’ve used many different engines on PC and this is the first time I’ve seen something like this.
It’s not really a problem, but I’d like to have the same behaviour on PCs and Macs.
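
For what it’s worth, two settings I would double-check when the frame rate looks capped (a hedged sketch; QualitySettings.vSyncCount may not be scriptable in every Unity version, in which case VSync is set in the Quality Settings inspector instead):

using UnityEngine;

// Sketch: make sure nothing in the project itself is capping the frame rate.
public class UncapFrameRate : MonoBehaviour
{
    void Awake ()
    {
        // Don't wait for VBlank (equivalent to "Don't Sync" in Quality Settings).
        QualitySettings.vSyncCount = 0;

        // -1 means "render as fast as possible"; a positive value caps the rate.
        // Note: targetFrameRate is only honoured when VSync is off.
        Application.targetFrameRate = -1;
    }
}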

Same thing here. Need to know why!