Low latency for music-based game

Hi,

I’m building a music/rhythm-based game that requires players to tap out drum and other sounds while keeping time. This becomes very hard with even a little bit of latency. I’ve tried the ‘Best Latency’ option in Unity, which does improve things a little, but even this combined with several permutations of compression/quality settings etc. has not helped. I’ve been researching the use of an external audio engine capable of achieving these ultra-low latency levels, but I can’t find any proper tutorials. I found this engine called Superpowered, but again I’m really confused about how to set it up to work with Unity. I’m hoping someone here can help me or point me in the right direction to set up low latency audio. I’m really stuck at the moment, as I cannot proceed with the game unless I solve this issue.

If you’re not targeting Android, latency is generally pretty good on that setting. Make sure you aren’t trying to play MP3s for sound effects, as those take about 20 times longer to start playing. Play only WAV and OGG files, without streaming.
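If you want to enforce that automatically, a minimal editor script can force the import settings for short SFX clips. This is only a sketch: the "Assets/Audio/SFX" folder path is an assumption, and whether Decompress On Load plus PCM is right for you depends on how much memory your clips can afford.

```csharp
// Assets/Editor/SfxImportSettings.cs
// Sketch: force short SFX clips (assumed to live under Assets/Audio/SFX) to import
// as uncompressed PCM, fully decompressed on load, with no streaming, so there is
// no decode delay when a clip starts.
using UnityEditor;
using UnityEngine;

public class SfxImportSettings : AssetPostprocessor
{
    void OnPreprocessAudio()
    {
        if (!assetPath.StartsWith("Assets/Audio/SFX"))
            return;

        var importer = (AudioImporter)assetImporter;
        var settings = importer.defaultSampleSettings;
        settings.loadType = AudioClipLoadType.DecompressOnLoad; // keep the clip in memory, no streaming
        settings.compressionFormat = AudioCompressionFormat.PCM; // no decompression cost at play time
        importer.defaultSampleSettings = settings;
    }
}
```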

Superpowered, I have no idea how to use it, so I’m wary.

@jerotas thanks for the reply. Since it’s a music game where one has to keep time, even the iOS latency is quite noticeable.

BUMP
Is there anyone on these forums who can help me set up Superpowered with Unity?

Hello, I have experience trying to set up Superpowered. Their website is informative about audio latency, but what’s not obvious is that the most important thing that makes it work is their special OEM audio server. They run a business where you can order custom Android builds with their findings built in, and those work with their library. Imagine you want to build an interactive desk with Android inside: if you order hardware with their modifications, you could make a sick music application. But this is not going to work for normal Google Play distribution, since everyone else has just a normal phone.

You can still use the free Superpowered library, but it just provides general audio tasks without much improvement. (Still an improvement compared with Unity audio, though, since it interfaces with the Android audio library directly.)

Next, I am also developing a music game, so I can feel your pain. I have researched many ways to make the game as responsive as possible.

Your problem comes from two things: Unity adds audio latency and ALSO input latency. Your sound is a result of pressing a button, so audio latency is one thing, but input latency also indirectly increases the perceived audio latency. (You can look at my research here: GitHub - 5argon/UnityiOSNativeAudio: This project confirms that the Unity's audio problem is not just audio latency, but also input latency.)

To fix audio latency:
Project Settings > Audio > DSP Buffer Size > set it to Best Latency (small buffer size). As of today, with this setting you get glitched sound in Windows builds, while macOS, Android, and iOS are completely fine. You might want a larger buffer size on Windows (at the expense of more latency).
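If you prefer to pick the buffer size per platform from script rather than a single project-wide setting, it can also be changed at runtime through AudioSettings. A minimal sketch, assuming 256 samples as a "Best Latency"-like value and a larger 1024-sample buffer on Windows (the numbers are just example values):

```csharp
using UnityEngine;

public class DspBufferSetup : MonoBehaviour
{
    void Awake()
    {
        AudioConfiguration config = AudioSettings.GetConfiguration();
#if UNITY_STANDALONE_WIN
        config.dspBufferSize = 1024; // larger buffer to avoid the glitched sound on Windows
#else
        config.dspBufferSize = 256;  // small buffer for lower latency on macOS/iOS/Android
#endif
        // Reset reinitializes the whole audio system, so do this before any clips are playing.
        AudioSettings.Reset(config);
    }
}
```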

If that is not enough, you can use the native methods of each platform. I just made the Native Audio asset store plugin, which can make a native call to the fastest native method on both iOS and Android from one central interface. Unity Asset Store - The Best Assets for Game Making

There are various ways of playing audio on the native side; here are my choices:

  • On iOS it uses OpenAL. It is faster than AVAudioPlayer, AudioToolbox, and SystemSound.
  • On Android it uses AudioTrack. I confirmed it to be faster than SoundPool, with no meaningful difference from OpenSL ES on the C++/NDK side. (A rough sketch of the AudioTrack approach follows below.)
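For reference, here is roughly what the AudioTrack route looks like when driven from C# over JNI. This is not the Native Audio plugin's actual code, just a sketch of the idea; the PCM data source, sample rate, and retrigger handling are assumptions, and a real implementation would do this on the Java/NDK side with proper error checking.

```csharp
using UnityEngine;

// Sketch: play a preloaded 16-bit little-endian PCM buffer through android.media.AudioTrack.
// MODE_STATIC keeps the whole clip in the track's own buffer, so play() can start immediately.
public class AndroidAudioTrackSketch
{
    AndroidJavaObject track;

    public void Load(byte[] pcm16Bytes, int sampleRate)
    {
#if UNITY_ANDROID && !UNITY_EDITOR
        var audioFormat  = new AndroidJavaClass("android.media.AudioFormat");
        var audioManager = new AndroidJavaClass("android.media.AudioManager");
        var audioTrack   = new AndroidJavaClass("android.media.AudioTrack");

        track = new AndroidJavaObject("android.media.AudioTrack",
            audioManager.GetStatic<int>("STREAM_MUSIC"),
            sampleRate,
            audioFormat.GetStatic<int>("CHANNEL_OUT_MONO"),
            audioFormat.GetStatic<int>("ENCODING_PCM_16BIT"),
            pcm16Bytes.Length,                      // buffer size in bytes
            audioTrack.GetStatic<int>("MODE_STATIC"));

        track.Call<int>("write", pcm16Bytes, 0, pcm16Bytes.Length);
#endif
    }

    public void Play()
    {
#if UNITY_ANDROID && !UNITY_EDITOR
        if (track == null) return;
        track.Call("stop");                   // a static track must be stopped before rewinding
        track.Call<int>("reloadStaticData");  // rewind the playback head to the start
        track.Call("play");
#endif
    }
}
```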

I have compiled all of my findings here: Native Audio - Unity Plugins by Exceed7 Experiments
PS. I have used FMOD for Unity before. The best setup I could achieve, in addition to using the best file format, requires editing FMOD Unity's source code to use a very low buffer size. Even with that, the latency is only about equal to Unity's “Best Latency” (and the sound crackles more, too, due to the low buffer size).
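For anyone going the FMOD route anyway: the edit amounts to calling setDSPBufferSize on the core system before it is initialized, which in FMOD for Unity means touching RuntimeManager. A hedged sketch only; the field name (coreSystem vs. lowlevelSystem) and the exact place to call it differ between integration versions, and 256 samples / 2 buffers are just example values:

```csharp
// Inside FMOD for Unity's RuntimeManager initialization, before the studio/core
// system is initialized. Lower values mean lower latency but more crackling risk.
FMOD.RESULT result = coreSystem.setDSPBufferSize(256, 2);
if (result != FMOD.RESULT.OK)
    UnityEngine.Debug.LogWarning("setDSPBufferSize failed: " + result);
```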

To fix input latency:
This is much more difficult, as the path by which Unity receives touches from the Xcode project is almost hardwired and not meant to be replaced easily. (Unlike audio, where we can just leave the Unity system alone and use our own native method.)

I made iOS Native Touch, which can reduce this input latency. But you will lose many conveniences that Unity provides, including finger ID tracking, stationary state, etc.

http://exceed7.com/ios-native-touch/