Real-time factor (RTF) in Unity?

Hi all. When robotic simulators like Gazebo, V-REP, or Webots are benchmarked and compared in academia or research, the real-time factor (RTF) is used as one of the primary metrics. This factor is usually easy to access in those simulators, or they report it directly to the user during simulation. Is it possible to get the real-time factor from a Unity simulation so that I can compare its performance with other simulators?

RTF = simulated_time / real_time. If RTF > 1, the simulation is running faster than real time. If RTF < 1, the simulation is running slower than real time (e.g., your hardware takes longer to compute the physics, or you're doing some complex fluid simulation).
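As far as I know, Unity doesn't expose an RTF out of the box, but you can measure one yourself by comparing how much simulated (fixed-step) time advances against wall-clock time. Here is a minimal sketch; the class and field names (RtfMonitor, reportInterval) are my own, only the UnityEngine Time and Debug calls are actual Unity API:

```csharp
using UnityEngine;

// Minimal sketch: estimate the real-time factor (RTF) of a Unity simulation
// by comparing simulated (fixed-step) time against wall-clock time.
// Attach to any GameObject in the scene.
public class RtfMonitor : MonoBehaviour
{
    [SerializeField] private float reportInterval = 1.0f; // wall-clock seconds between reports

    private float simTime;               // simulated time accumulated in FixedUpdate
    private float simTimeAtLastReport;   // simulated time at the last report
    private float realTimeAtLastReport;  // wall-clock time at the last report

    private void Start()
    {
        realTimeAtLastReport = Time.realtimeSinceStartup;
    }

    private void FixedUpdate()
    {
        // Each physics step advances simulated time by the fixed timestep (0.02 s by default).
        simTime += Time.fixedDeltaTime;
    }

    private void Update()
    {
        float realNow = Time.realtimeSinceStartup;
        float realElapsed = realNow - realTimeAtLastReport;
        if (realElapsed >= reportInterval)
        {
            float simElapsed = simTime - simTimeAtLastReport;
            float rtf = simElapsed / realElapsed; // RTF = simulated_time / real_time
            Debug.Log($"RTF over the last {realElapsed:F2} s: {rtf:F3}");

            simTimeAtLastReport = simTime;
            realTimeAtLastReport = realNow;
        }
    }
}
```

With a light scene you should see values close to 1; once the physics load exceeds what your machine can compute in real time, the measured RTF drops below 1.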

Or the other way around. If I wanted to compare Gazebo and Unity, would it be correct to multiply Gazebo's FPS by its RTF to get a scaled FPS that I could compare with Unity's FPS?

I ask because Unity always stays in real time, decreasing its FPS if the physics engine is struggling to keep up. Gazebo, on the other hand, runs as two processes: the simulation and the GUI. The GUI's FPS is only affected by mesh complexity; when the physics simulation struggles, Gazebo decreases the RTF instead. You see this in the GUI as the simulation time slowing down, but it has no effect on the FPS.

Coming from the game-engine industry, Unity works the other way around: the RTF is not managed; instead, some physics calculations are skipped to keep the game running in real time. So from my point of view, Unity cannot be compared in such a way.
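For reference, this behaviour is governed by Unity's time settings: Time.fixedDeltaTime sets the physics step, and Time.maximumDeltaTime caps how much simulated time a single frame may advance, which limits how many FixedUpdate catch-up steps can run. A small sketch of inspecting and adjusting these (the assigned values are only examples, not recommendations):

```csharp
using UnityEngine;

// Sketch of the time settings that control Unity's physics catch-up behaviour.
public class TimeSettingsExample : MonoBehaviour
{
    private void Start()
    {
        // Physics (FixedUpdate) step size, 0.02 s (50 Hz) by default.
        Debug.Log($"fixedDeltaTime: {Time.fixedDeltaTime}");

        // Upper bound on how much simulated time one frame may advance.
        // If a frame takes longer than this in real time, Unity stops running
        // extra FixedUpdate catch-up steps and simulated time falls behind
        // wall-clock time (i.e. the effective RTF drops below 1).
        Debug.Log($"maximumDeltaTime: {Time.maximumDeltaTime}");

        // Both can be changed from code or in Project Settings > Time.
        Time.fixedDeltaTime = 0.02f;
        Time.maximumDeltaTime = 0.1f; // example value
    }
}
```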

FPS reflects the rendering speed of Unity. In your case, I think you should look at FixedUpdate. All the physics properties and functions are updated in FixedUpdate. The default delta time of FixedUpdate is 0.02 seconds, but whether it can be maintained depends on how complex the computation is.
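To make that rendering-vs-physics separation visible, here is a small sketch that logs how many FixedUpdate (physics) steps run per rendered frame; the class name PhysicsStepCounter is illustrative, only the Time and Debug calls are Unity API:

```csharp
using UnityEngine;

// Sketch: count how many FixedUpdate steps run per rendered frame.
public class PhysicsStepCounter : MonoBehaviour
{
    private int fixedStepsThisFrame;

    private void FixedUpdate()
    {
        fixedStepsThisFrame++; // one physics step of Time.fixedDeltaTime (0.02 s by default)
    }

    private void Update()
    {
        // Unity runs all pending FixedUpdate steps before Update in each frame.
        // At high FPS many frames contain 0 physics steps; at 25 FPS you will
        // typically see 2 steps per frame with the 0.02 s default timestep.
        Debug.Log($"Frame time: {Time.deltaTime:F4} s, physics steps this frame: {fixedStepsThisFrame}");
        fixedStepsThisFrame = 0;
    }
}
```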