Hi!
I'm working on a musical instrument app (Unity + iOS).
I'm trying to achieve the lowest possible latency between the user's touch and the audio actually playing (and the graphics updating, of course).
The total latency is the sum of the following steps (correct me if something is wrong or incomplete):
1. hardware touch detection
2. OS event detection
3. Unity input processing (Input.touches)
4. app processing and logic
5. audio output (OS)
6. audio output (hardware)
I have already worked on steps 5 (and 4), rolling my own Audio Unit with a very short buffer: 2 ms works without clicks; 1 ms still works, but with clicks and artifacts. See http://forum.unity3d.com/threads/202526-Very-low-latency-audio-output-on-iOS
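For context, here is a minimal sketch of that kind of setup (simplified, not my actual code): a RemoteIO Audio Unit with a ~2 ms preferred IO buffer. The render callback is just a silent placeholder.

#import <AudioToolbox/AudioToolbox.h>
#import <AVFoundation/AVFoundation.h>
#import <string.h>

// Render callback: runs on the audio thread once per buffer (~2 ms here).
// Keep it allocation-free and lock-free; this placeholder outputs silence.
static OSStatus RenderCallback(void *inRefCon,
                               AudioUnitRenderActionFlags *ioActionFlags,
                               const AudioTimeStamp *inTimeStamp,
                               UInt32 inBusNumber,
                               UInt32 inNumberFrames,
                               AudioBufferList *ioData)
{
    for (UInt32 i = 0; i < ioData->mNumberBuffers; ++i)
        memset(ioData->mBuffers[i].mData, 0, ioData->mBuffers[i].mDataByteSize);
    return noErr;
}

void StartLowLatencyOutput(void)
{
    // Ask for a ~2 ms hardware IO buffer; the OS may grant a larger one.
    AVAudioSession *session = [AVAudioSession sharedInstance];
    [session setCategory:AVAudioSessionCategoryPlayback error:nil];
    [session setPreferredIOBufferDuration:0.002 error:nil];
    [session setActive:YES error:nil];

    // Create a RemoteIO output unit.
    AudioComponentDescription desc = {0};
    desc.componentType = kAudioUnitType_Output;
    desc.componentSubType = kAudioUnitSubType_RemoteIO;
    desc.componentManufacturer = kAudioUnitManufacturer_Apple;

    AudioUnit unit;
    AudioComponentInstanceNew(AudioComponentFindNext(NULL, &desc), &unit);

    // Hook our callback to the output bus.
    AURenderCallbackStruct cb = { RenderCallback, NULL };
    AudioUnitSetProperty(unit, kAudioUnitProperty_SetRenderCallback,
                         kAudioUnitScope_Input, 0, &cb, sizeof(cb));

    AudioUnitInitialize(unit);
    AudioOutputUnitStart(unit);
}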
Steps 1 and 6 are out of our reach; we cannot do anything about them.
But the main problem is that input seems to be handled and processed in sync with the main loop frame rate. So, running at 60 FPS (in the best case), we still get up to 1/60 s (~17 ms) of latency at step 3, on top of the others.
How can we optimize step 3 (and possibly step 2)?
In Classes/UnityAppController.mm, we can find:

#define kInputProcessingTime 0.001

But this doesn't seem to be used anywhere.
Input seems to be handled in repaint():
- (void)repaint
{
    EAGLContextSetCurrentAutoRestore autorestore(_mainDisplay->surface.context);
    SetupUnityDefaultFBO(_mainDisplay->surface);
    CheckOrientationRequest();
    [GetAppController().unityView recreateGLESSurfaceIfNeeded];

    Profiler_FrameStart();
    UnityInputProcess();
    UnityPlayerLoop();
}
So, yes, as we can see, input is processed in sync with rendering, i.e. at 60 FPS in the best case.
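One idea I'm toying with (an untested sketch, not a confirmed fix; LowLatencyUnityView and TriggerNoteImmediately() are names I made up): intercept touches at the UIKit level, in touchesBegan:, and poke the audio engine immediately instead of waiting for the next UnityInputProcess():

#import "UnityView.h"

// Hypothetical fast path from UIKit touch delivery to the audio engine,
// e.g. a lock-free flag that the Audio Unit render callback reads.
extern void TriggerNoteImmediately(CGPoint point);

@interface LowLatencyUnityView : UnityView
@end

@implementation LowLatencyUnityView

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Fire the sound as soon as UIKit delivers the touch...
    for (UITouch *touch in touches)
        TriggerNoteImmediately([touch locationInView:self]);

    // ...then let Unity queue the touch as usual for game logic/graphics.
    [super touchesBegan:touches withEvent:event];
}

@end

Graphics would still lag by up to a frame, but the audio could skip the ~17 ms wait of step 3. Has anyone tried something like this, or does it conflict with Unity's own touch handling?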
Any ideas?
Ideally, I would like to reach 10 ms of total latency, as this is commonly cited as the threshold below which you feel "no latency".
Thanks for any feedback!