Unity Web Player timing is always different (Resolved... explained)

I am dividing the time that passed each frame by (1/60) and then using the resulting floating-point value (e.g. 1.234, 0.894) to scale the values I modify on each update. I even created an animation that rotates at a specific FPS in time with music and watched it while forcing the framerate down and up, and it never lost its synchronization.
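Roughly, the idea looks like this (a minimal sketch, not my exact code; the class and field names are just placeholders):

```csharp
using UnityEngine;

public class FrameRatioRotator : MonoBehaviour
{
    // Degrees per frame, tuned against a 60 fps baseline.
    public float degreesPerFrameAt60 = 3f;

    const float targetFrameTime = 1f / 60f;

    float lastFrameTime;

    void Start()
    {
        lastFrameTime = Time.time;
    }

    void Update()
    {
        // Measure the time passed since the last Update myself
        // (this is effectively what Time.deltaTime already reports).
        float elapsed = Time.time - lastFrameTime;
        lastFrameTime = Time.time;

        // Ratio of the actual frame time to the 60 fps frame time:
        // 1.0 at 60 fps, roughly 0.5 at 120 fps, roughly 2.0 at 30 fps.
        float frameRatio = elapsed / targetFrameTime;

        // Scale the per-frame value by that ratio so the motion covers
        // the same ground per second regardless of frame rate.
        transform.Rotate(0f, 0f, degreesPerFrameAt60 * frameRatio);
    }
}
```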

I built to the Unity web player, opened in Internet Explorer and the animation timing is all wrong.

Guys, I have to say… I think it’s a lost cause.

.exe is slower, web player is faster. I don’t get it.

It is close enough, but still quite tragic.

Top number is the actual frame rate, derived from the smoothed delta time;
bottom number is the ratio I am using to try to balance the time.
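For reference, the two numbers come from something along these lines (a sketch; the on-screen layout is just illustrative):

```csharp
using UnityEngine;

public class TimingDebugLabel : MonoBehaviour
{
    const float targetFrameTime = 1f / 60f;

    void OnGUI()
    {
        // Top number: actual frame rate, derived from the smoothed delta time.
        float fps = 1f / Time.smoothDeltaTime;

        // Bottom number: the correction ratio I multiply per-frame values by.
        float frameRatio = Time.smoothDeltaTime / targetFrameTime;

        GUI.Label(new Rect(10, 10, 200, 25), fps.ToString("F2"));
        GUI.Label(new Rect(10, 35, 200, 25), frameRatio.ToString("F3"));
    }
}
```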

Left is standalone, right is web.

The web player runs faster…

EDIT

It really was displaying differently in different windows, but it turned out to be just the animation on my wheels. Everything else moves lazily about (in space), so I couldn't tell the relative difference, but the wheels were what was throwing me off. They were the only constant-rate thing I could actually see, and they were WAY faster than they should have been, like exponentially faster, but also several times slower when the frame rate was higher.

What had happened was…

I was setting the frame rate of the animation from the horizontal input axis and applying the framerate fix to that, and then further down, where I apply that frame rate and advance the animation, I was applying the fix again, so the correction was effectively applied twice. Since I had only tested it in one environment with a fairly constant framerate, I never noticed the bug.
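In rough form, the bug looked something like this (a sketch, not my actual code; the names are made up):

```csharp
using UnityEngine;

public class WheelAnimator : MonoBehaviour
{
    const float targetFrameTime = 1f / 60f;

    public float baseFramesPerSecond = 12f;
    float animationFrame;

    void Update()
    {
        float frameRatio = Time.deltaTime / targetFrameTime;

        // BUG: the correction applied once when deriving the animation
        // frame rate from the input axis...
        float animFps = Input.GetAxis("Horizontal") * baseFramesPerSecond * frameRatio;

        // ...and again when advancing the animation, so the compensation
        // ends up squared: too fast at low frame rates, too slow at high ones.
        animationFrame += animFps * frameRatio;

        // Fix: apply the correction in exactly one place, e.g. only here:
        //   float animFps = Input.GetAxis("Horizontal") * baseFramesPerSecond;
        //   animationFrame += animFps * frameRatio;
    }
}
```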

Thanks everyone for the replies!

I guess the webplayer doesn't sync up to 30 fps. Even AAA developers have the same problem; see Arkham Knight, which went to shit when they tried to make it run at 60 fps.

The fact that I’m close makes me feel good, then.

I haven’t seen any cases where using Time.deltaTime properly resulted in any differences based on platform. The web player executable is basically the same Unity engine code that runs in the standalone. You should never rely on the framerate being exactly 60 (or any other number).

–Eric

Forgive me for asking, but how would one use it properly?

Have you tried simply using Time.deltaTime instead of calculating the differences between frames yourself? Your math is assuming that the frame rate is exactly 60, but that won’t necessarily be the case.
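In other words, define motion in units per second and multiply by Time.deltaTime once per frame. A minimal sketch:

```csharp
using UnityEngine;

public class DeltaTimeRotator : MonoBehaviour
{
    // Define motion in degrees per second, not degrees per frame.
    public float degreesPerSecond = 180f;

    void Update()
    {
        // Multiplying by Time.deltaTime once per frame makes the result
        // frame-rate independent: the same rotation per second whether
        // the game runs at 30, 60, or 200 fps.
        transform.Rotate(0f, 0f, degreesPerSecond * Time.deltaTime);
    }
}
```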

I haven’t this time; I had some issues with it before, specifically with large numbers of particles being fired one after the other. There would be speed differences between individual projectiles if the framerate changed.

EDIT: I just replaced my value with Time.deltaTime.
Unless I’m doing something wrong, Time.deltaTime has the exact same result.

Not sure what I’ve missed.

No, my math is assuming that the framerate will be unknown…

And that I will determine the actual time between the last update and the current update, and find the ratio of that to my desired frame time (in this case, 1/60 of a second). That’s why when things run at 60 fps my framerate correction is 1; if it were running at 120 fps my adjustment would be approximately 0.5. Obviously it’s not perfect, though. It reads 1.02 above when it should be less than 1, but I think that’s a rounding issue.
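As a quick sanity check on the arithmetic (a standalone sketch, not game code):

```csharp
public static class FrameTiming
{
    const float TargetFrameTime = 1f / 60f; // 60 fps baseline

    // Ratio of the measured frame time to the 60 fps frame time.
    public static float CorrectionRatio(float frameTime)
    {
        return frameTime / TargetFrameTime;
    }

    // CorrectionRatio(1f / 60f)  == 1.0  (running at 60 fps)
    // CorrectionRatio(1f / 120f) == 0.5  (running at 120 fps)
    // CorrectionRatio(1f / 30f)  == 2.0  (running at 30 fps)
}
```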

I’ll keep reading and messing with it. But I’m definitely getting weird results and I’m not sure if it’s me or not yet.