Are there any known issues with this? I’m having trouble keeping audio synced with animation in the web player, at least on Mac. It syncs fine in the editor and in a standalone build, as far as I can see.
Maybe there is some workaround for now? It’s almost like audio.time is running at a slower speed than the actual playback.
Can you be a bit more specific about what you are doing? I have encountered some limitations where various time rates are not equal to audio.time, but this kind of drift shows up in anything that is not well synced. If you trigger parts of your animation based on audio.time values, you might be safe*. But, given an animation and an audio clip of the same length, starting them at the same time and expecting them to end at the same time, without any further help, is not a realistic goal in anything I have ever heard of.
*One of the limitations I am referring to: trying to play audio clips based on audio.time positions is not realistic either. Humans are good at differentiating the attack times of sounds down to around 5-10 milliseconds. Since we’re limited to Update() and FixedUpdate() without plug-ins, unless you can guarantee your game is going to run at about 200 fps, the timing of your triggered samples may be noticeably bad…
However, eyes work at a much slower “framerate”: it takes about 40-50 milliseconds to differentiate the beginnings of motion. If you can get a short animation to be triggered in an Update() function, based on audio.time, that will probably look fine, given a framerate of at least 10-15 frames per second.
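To make the safe case concrete, here’s a minimal sketch of triggering an animation off audio.time in Update(). All the names (cueTime, the animation name, the fields) are placeholders for illustration, not from anyone’s actual project:

```csharp
using UnityEngine;

// Sketch: fire a short animation the first frame audio.time passes a cue point.
// Even at a modest framerate this lands within a few tens of milliseconds
// of the cue, which is roughly the visual threshold discussed above.
public class AnimationCue : MonoBehaviour
{
    public AudioSource music;     // assign in the Inspector
    public Animation target;      // legacy Animation component, assign in the Inspector
    public float cueTime = 4.0f;  // seconds into the clip (illustrative value)

    private bool triggered = false;

    void Update()
    {
        // Trigger once, as soon as playback has reached the cue time.
        if (!triggered && music.time >= cueTime)
        {
            target.Play("cueAnimation"); // hypothetical animation name
            triggered = true;
        }
    }
}
```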
After more testing: this works fine in the editor, in a Mac Intel standalone, and in the Windows web player, but it isn’t even close to working in the Mac web player for some reason. (The rest aren’t sufficiently tested.) And I even tested this feature during the beta. Not well enough.
I’ll just have to file a bug on it and live with it for now, I’m pretty sure. And just blindly blame Apple’s OpenAL for the heck of it.
First, I ask why you are not using audio.time as a master clock. And if your goal is to set gameTime to audio.time at the beginning of every frame, why don’t you just explicitly set
gameTime = audio.time;
And I’ve definitely had issues with audio.time that were Mac webplayer only myself, if that’s any consolation.
Because that didn’t work very well. Every few seconds gameTime would stutter, so the notes wouldn’t move for a frame or two or, worse, would jump backwards a bit. Smoothing it out got rid of that.
Maybe it’s because the audio runs on a different thread. But yeah, that’s what I tried initially, and ideally it should work.
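For reference, the smoothing being described could look something like this: advance a local clock by Time.deltaTime every frame, then pull it toward audio.time a little at a time, so coarse or backwards steps in the reported position get spread over several frames instead of showing up as a stutter. The correction factor is a made-up tuning value, not something from this thread:

```csharp
using UnityEngine;

// Sketch of a smoothed song clock: free-run on Time.deltaTime, then
// exponentially correct toward audio.time. Direct assignment
// (gameTime = music.time) can stutter or step backwards when the audio
// thread updates its position in coarse chunks; blending hides that.
public class SmoothedSongClock : MonoBehaviour
{
    public AudioSource music;        // assign in the Inspector
    public float correction = 0.1f;  // fraction of the drift removed per frame (illustrative)

    public float gameTime { get; private set; }

    void Update()
    {
        // Assume playback advanced in real time since last frame.
        gameTime += Time.deltaTime;

        // Nudge toward the reported position. Large jumps in music.time
        // become many small per-frame adjustments, so notes never visibly
        // freeze or leap backwards.
        gameTime += (music.time - gameTime) * correction;
    }
}
```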
The playhead can get broken and stuck at the end of the slider, but only in the Mac web player. There are other issues that may be related to audio.time, but I’m not sure about those yet.
I’d be interested to see what you’re doing exactly that causes problems. I stopped working on my own rhythm game for now, and although it was impossible to trigger samples accurately with Unity, everything else worked fine, including, most importantly, recognizing if timing was accurate.
I was using an array of floats for the times that the player should be hitting a button, based on the rhythms I wanted the player to perform. Comparing those floats to audio.time worked great. Here’s a really terrible prototype that I made with love:
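The array-of-floats approach could be sketched roughly like this. The field names, timing window, and button name are all illustrative assumptions, not the prototype’s actual code:

```csharp
using UnityEngine;

// Sketch: compare button presses against an array of target times
// (seconds from the start of the song) using audio.time as the clock.
public class RhythmJudge : MonoBehaviour
{
    public AudioSource music;                        // assign in the Inspector
    public float[] hitTimes = { 1.0f, 1.5f, 2.0f };  // when the player should hit
    public float window = 0.1f;                      // +/- seconds counted as a hit

    private int next = 0;  // index of the next unjudged hit time

    void Update()
    {
        if (next >= hitTimes.Length)
            return;

        // The window for this note has passed without a press: a miss.
        if (music.time > hitTimes[next] + window)
        {
            Debug.Log("Miss");
            next++;
            return;
        }

        if (Input.GetButtonDown("Fire1"))
        {
            float error = music.time - hitTimes[next];
            if (Mathf.Abs(error) <= window)
            {
                Debug.Log("Hit, off by " + error + " s");
                next++;
            }
        }
    }
}
```

Note that this judges input accuracy against audio.time, which works well; it’s triggering *samples* at precise times that the thread says is impractical without plug-ins.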