How to achieve the lowest latency between inputs (touches) and outputs (mainly audio)

Hi!

I’m working on a musical instrument app (Unity + iOS).
I’m trying to achieve the lowest possible latency between the user’s touch and the audio actually playing (and the graphics, of course).

Total latency is the sum of the following steps (correct me if something is wrong or incomplete):

  1. hardware touch detection
  2. OS event detection
  3. Unity process (Input.touches)
  4. App process and logic
  5. Audio output (OS)
  6. Audio output (hardware)

I worked on step 5 (and 4), rolling my own Audio Unit with a very short latency (a 2 ms buffer plays without clicks; 1 ms still works but with clicks and artifacts). See http://forum.unity3d.com/threads/202526-Very-low-latency-audio-output-on-iOS
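For reference, a buffer’s contribution to latency is simply its length in frames divided by the sample rate. A quick helper in plain C# (the frame counts below are just example values, not my exact configuration):

public static class AudioMath
{
    // Latency contributed by one audio buffer, in milliseconds.
    public static double BufferLatencyMs(int bufferFrames, int sampleRate)
    {
        return 1000.0 * bufferFrames / sampleRate;
    }
}

// At the usual iOS rate of 44100 Hz:
//   AudioMath.BufferLatencyMs(128, 44100) ≈ 2.9 ms
//   AudioMath.BufferLatencyMs(64, 44100)  ≈ 1.45 ms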

Steps 1 and 6 are out of our reach; we cannot do anything about them.

But the main problem is that inputs seem to be handled and processed in sync with the main loop frame rate. So, running at 60 FPS (the best case), we still have this 1/60 s (about 17 ms) of latency at step 3 adding to the others.

How can we optimize step 3 (and possibly step 2)?
In Classes/UnityAppController.mm, we can find:

#define kInputProcessingTime                    0.001

But this doesn’t seem to be used anywhere.
Inputs seem to be handled in repaint():

- (void)repaint
{
    EAGLContextSetCurrentAutoRestore autorestore(_mainDisplay->surface.context);
    SetupUnityDefaultFBO(_mainDisplay->surface);

    CheckOrientationRequest();
    [GetAppController().unityView recreateGLESSurfaceIfNeeded];

    Profiler_FrameStart();
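    // Touch events queued by the OS are consumed here, right before the
    // player loop runs, so input is only sampled once per rendered frame.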
    UnityInputProcess();
    UnityPlayerLoop();
}

So, yes, as we can see, inputs are processed in sync with the rendering, i.e. at 60 FPS in the best case.

Any ideas?
Ideally, I would like to reach 10 ms of total latency, as this is commonly cited as the highest latency at which you still feel “no latency”.

Thanks for any feedback!

I like how you have assessed and laid out your issue. My suggestion, however, is in no way technical, but illusional, if that makes sense. I am aware the mind can be fooled quite easily. Look at any magician. My point is this…

First, a question… Is the latency static for every input? Or is it higher when triggering an audio clip to play vs. recording a position?

If it is not static, perhaps a visual cue, like turning on a renderer, would catch the eye and distract the viewer, so to speak, and then only 10 ms later the audio plays… Would this type of trick have any real impact? Perhaps.

So again, while the delay may not be improved, the presentation may be.

Yes, that could be an idea, but I really need truly lower latency: this app is a musical instrument that has to be played live, so any delay between the user’s touch and the sound output breaks the interaction.
And yes, the latency is static. There is no loading during play (all the sounds are preloaded) to ensure the shortest delay. But it is still not enough 🙂
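For anyone wondering how the preloading can be done on the Unity side, here is a minimal sketch (assuming the clips are assigned in the Inspector; AudioClip.LoadAudioData() is the stock Unity call for decoding a clip up front):

using UnityEngine;

public class SoundPreloader : MonoBehaviour
{
    public AudioClip[] clips; // assigned in the Inspector

    void Awake()
    {
        // Decode every clip into memory before play starts, so that
        // triggering a sound never has to load or decompress anything.
        foreach (var clip in clips)
        {
            if (clip.loadState != AudioDataLoadState.Loaded)
                clip.LoadAudioData();
        }
    }
}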

More specifically, is there any way to make Unity iOS process the inputs more often than at 60 FPS (currently the best case)?
Processing them before each FixedUpdate() would be great, allowing the app to be very audio-responsive!

Any ideas? A lot of the main loop timing options seem to have disappeared recently (the “Tuning Main Loop Performance” section of the docs):
http://docs.unity3d.com/351/Documentation/Manual/iphone-Optimizing-MainLoop.html
Display Link, NSTimer, Thread, Event pump, …

So is there anything we can do to get a better input processing rate on iOS?

I compared Input.GetTouch() and Input.touches, and they seem to return exactly the same data (the docs say that one is the last frame and the other the current frame, but that doesn’t seem to be the case anymore).
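For anyone who wants to reproduce the comparison, a minimal test script (note that Input.touches allocates a new array on every call, which is one reason Input.GetTouch() is usually preferred anyway):

using UnityEngine;

public class TouchCompare : MonoBehaviour
{
    void Update()
    {
        Touch[] copied = Input.touches; // allocates a fresh array each call
        for (int i = 0; i < Input.touchCount; i++)
        {
            Touch direct = Input.GetTouch(i);
            Debug.Log("touch " + i +
                      ": GetTouch=" + direct.position +
                      " touches=" + copied[i].position);
        }
    }
}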

Up!

Is there any way to have the inputs processed faster than the rendering loop?

I don’t think that’s possible. That’s the reason why a slow frame rate automatically makes controls unresponsive.

Yes, but in my case the frame rate is capped at 60 FPS not because of “bad” performance, but because of VSync. So I thought there might be a way to process the inputs out of sync with the rendering.
Graphically, a 60 Hz refresh rate is fine.
But when it comes to audio, a 1/60 s latency is not that good.

Agreed. I do think they are directly linked though.

Bump - did you find a way to get an acceptable sound latency (as in a musical instrument)? I have been trying a lot of things for days with no luck… Thanks

Hi jonlab – have you tried turning off VSync? (In Project Settings > Quality).

On iOS, VSync is forced regardless of the setting.

The problem with musical instrument apps comes from two things: Unity adds audio latency and ALSO input latency. Your sound is the result of tapping the instrument, so while audio latency is one thing, input latency also indirectly increases the perceived audio latency. (You can look at my research here: GitHub - 5argon/UnityiOSNativeAudio: This project confirms that the Unity’s audio problem is not just audio latency, but also input latency.)

To fix audio latency:
Go to Project Settings > Audio > DSP Buffer Size and set it to Best Latency (the smallest buffer size). As of today, this setting produces glitched sound on Windows builds, while macOS, Android, and iOS are completely fine. You might want a larger buffer size on Windows (at the expense of more latency).
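The same setting can also be changed from a script, which makes it easy to pick a different DSP buffer size per platform. A minimal sketch (the buffer sizes are just example values):

using UnityEngine;

public static class AudioLatencySetup
{
    public static void Apply()
    {
        AudioConfiguration config = AudioSettings.GetConfiguration();

        // Small DSP buffer on mobile for low latency; a larger one on
        // Windows to avoid the glitches mentioned above. Example values.
#if UNITY_STANDALONE_WIN
        config.dspBufferSize = 1024;
#else
        config.dspBufferSize = 256;
#endif

        // Reset() restarts the audio engine with the new configuration.
        AudioSettings.Reset(config);
    }
}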

If that is not enough, you can use the native methods of each platform. I just made Native Audio, an Asset Store plugin which can make a native call to both iOS’s and Android’s fastest native paths from one central interface. (Unity Asset Store - The Best Assets for Game Making)

There are various ways of playing audio on the native side; here’s my choice (a generic sketch of the bridging pattern follows the list):

  • On iOS it uses OpenAL. It is faster than AVAudioPlayer, AudioToolbox, and SystemSound.
  • On Android it uses AudioTrack; I confirmed it to be faster than SoundPool, with no meaningful difference from the C++-side OpenSL ES of the NDK.
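To give an idea of what a central interface over those two native paths looks like from C#, here is a generic sketch of the usual Unity bridging pattern. This is not Native Audio’s actual API; the extern names and the Java class are hypothetical:

using System.Runtime.InteropServices;
using UnityEngine;

public static class NativeAudioBridge
{
#if UNITY_IOS && !UNITY_EDITOR
    // Implemented in the Xcode project (e.g. on top of OpenAL).
    // Hypothetical extern names.
    [DllImport("__Internal")] static extern int  _LoadSound(string path);
    [DllImport("__Internal")] static extern void _PlaySound(int soundId);
#endif

    public static int Load(string path)
    {
#if UNITY_IOS && !UNITY_EDITOR
        return _LoadSound(path);
#elif UNITY_ANDROID && !UNITY_EDITOR
        // On Android the call goes through JNI to an AudioTrack wrapper
        // (hypothetical class name).
        using (var player = new AndroidJavaClass("com.example.NativeAudioPlayer"))
            return player.CallStatic<int>("load", path);
#else
        return -1; // fall back to Unity's AudioSource in the editor
#endif
    }

    public static void Play(int soundId)
    {
#if UNITY_IOS && !UNITY_EDITOR
        _PlaySound(soundId);
#elif UNITY_ANDROID && !UNITY_EDITOR
        using (var player = new AndroidJavaClass("com.example.NativeAudioPlayer"))
            player.CallStatic("play", soundId);
#endif
    }
}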

I have compiled all of my findings here: Native Audio - Unity Plugins by Exceed7 Experiments
PS: I have used FMOD for Unity before. Getting the best settings out of it, in addition to choosing the best file format, requires editing FMOD Unity’s source code to use a very low buffer size. Even with that, the latency is only about equal to Unity’s “Best Latency” (and the sound cracks more too, due to the low buffer size).

To fix input latency:
This is much more difficult, as the path by which Unity receives touches from the Xcode project is almost hardwired and not meant to be replaced easily. (Unlike audio, where we can just leave Unity’s path alone and use our own native method.)

I made iOS Native Touch, which can reduce this input latency. But you will lose many of the conveniences that Unity provides, including finger ID tracking, the stationary state, etc.

http://exceed7.com/ios-native-touch/
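For reference, the way a native touch plugin typically hands touches back to C# looks something like this. This is a generic sketch, not iOS Native Touch’s actual API; the extern and the callback wiring are hypothetical:

using System;
using System.Runtime.InteropServices;
using AOT;

public static class NativeTouchBridge
{
    // Signature the native side invokes for every touch event.
    delegate void TouchCallback(float x, float y, int phase);

#if UNITY_IOS && !UNITY_EDITOR
    // Hypothetical extern; the native side would call the registered
    // callback from the touch handler in the Xcode project.
    [DllImport("__Internal")] static extern void _RegisterTouchCallback(TouchCallback callback);
#endif

    public static event Action<float, float, int> OnTouch;

    // Keep a reference so the garbage collector never frees the delegate
    // while native code still holds its function pointer.
    static TouchCallback _keptAlive;

    // iOS AOT compilation requires native-to-managed callbacks to be
    // static methods marked with MonoPInvokeCallback.
    [MonoPInvokeCallback(typeof(TouchCallback))]
    static void HandleTouch(float x, float y, int phase)
    {
        // Depending on how the native side dispatches, this may run off
        // Unity's main thread; queue any engine work accordingly.
        OnTouch?.Invoke(x, y, phase);
    }

    public static void Start()
    {
#if UNITY_IOS && !UNITY_EDITOR
        _keptAlive = HandleTouch;
        _RegisterTouchCallback(_keptAlive);
#endif
    }
}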