Been dealing with an ongoing issue with sound not playing on some iOS devices. I'd been writing it off as device-specific or user error, but my boss insisted I look into it.
I found this thread from 5 years ago;
and it looks like I was partially right.
The TL;DR is that I need to set the audio session's category to Playback (AVAudioSessionCategoryPlayback), which is the correct setting for our app - sound only plays when the user is actively looking at an Augmented Reality target image.
Has Unity added this functionality to the engine yet, or do I still need to write a plugin to do it? Can someone provide such a plugin, or talk me through in simple terms how I would write one? I have no Xcode knowledge - it's a huge black box to me. I'm only vaguely aware that I can stick .mm files in the project and they'll copy over, but I have no idea how such a file should be structured.
Thanks for any insight you might be able to provide!
Otherwise, if it is not that important to set it during startup (i.e. if you have no problem with muting other audio sources or something like that), you can write your own plugin. Here is the snippet I made once for my app:
#import <Foundation/Foundation.h>
#import <AVFoundation/AVFoundation.h>

void muteOtherSources(BOOL mute)
{
    AVAudioSession *session = [AVAudioSession sharedInstance];
    NSError *setCategoryError = nil;
    if (mute)
    {
        // SoloAmbient silences audio from other apps while ours plays
        if (![session setCategory:AVAudioSessionCategorySoloAmbient
                            error:&setCategoryError]) {
            NSLog(@"Failed to set audio session category: %@", setCategoryError);
        }
    }
    else
    {
        // Ambient mixes our audio with whatever else is playing
        if (![session setCategory:AVAudioSessionCategoryAmbient
                            error:&setCategoryError]) {
            NSLog(@"Failed to set audio session category: %@", setCategoryError);
        }
    }
}

// When a native code plugin is implemented in a .mm / .cpp file, the functions
// should be wrapped in an extern "C" block to conform to C function naming rules
extern "C" {
    void _MuteOtherSources(bool mute)
    {
        muteOtherSources(mute);
    }
}
Where is UnityAppController.mm? I didn't see it in the project settings. Is this something in the final Xcode output? If so, doesn't that mean I have to change it literally every time I make a new build?
I was doing that for Info.plist keys too for a bit, before I stumbled on a way to get them to build automatically.
Thanks for this info regardless - it might help me get going in the right direction. I'm not at the computer today, but maybe when I look at it again at home what you wrote will be self-evident.
Yes, that is in the Xcode project. What I do is copy the file once into my Unity project, apply the described changes, and then hook into PostProcessBuild (Unity - Scripting API: PostProcessBuildAttribute); there I replace the generated file with the file from my Unity project. This is okay to do, since the file does not really change that often.
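A minimal sketch of such a hook (the class name and the Assets/Editor path are just placeholders I picked; the script has to live in an Editor folder since it uses UnityEditor):

using System.IO;
using UnityEditor;
using UnityEditor.Callbacks;

public static class AppControllerPostProcess
{
    // Hypothetical location of the patched copy inside the Unity project
    private const string PatchedFile = "Assets/Editor/iOS/UnityAppController.mm";

    [PostProcessBuild]
    public static void OnPostprocessBuild(BuildTarget target, string pathToBuiltProject)
    {
        if (target != BuildTarget.iOS)
            return;

        // Overwrite the file Unity generated in the exported Xcode project
        string generated = Path.Combine(pathToBuiltProject, "Classes/UnityAppController.mm");
        File.Copy(PatchedFile, generated, true);
    }
}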
Okay, I tried the first thing and it didn't seem to work - my boss still had to turn on the ringer on her phone for the audio to play (I don't have an iPhone, so I can't test it myself).
The second bit I'm not sure what to do with. Is this something I'd need to call from within Unity C#? What class would I call it on?
Yep, pretty much - here is the snippet from my code:
using System.Runtime.InteropServices;
using UnityEngine;

public static class NativeMethods
{
    #region AudioSources
#if UNITY_IOS
    [DllImport("__Internal")]
    private static extern void _MuteOtherSources(bool mute);
#endif

    /// <summary>
    /// Define whether other audio sources should be muted (supported on iOS)
    /// </summary>
    /// <param name="mute">Mute other sources</param>
    public static void MuteOtherSources(bool mute)
    {
#if UNITY_IOS && !UNITY_EDITOR
        _MuteOtherSources(mute);
#else
        Debug.Log("Muting audio sources is not supported on this platform");
#endif
    }
    #endregion
}
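Since it is a static class, you can call it from any of your scripts - e.g. right before you trigger your sound (assuming audioSource is the AudioSource you play your clip from):

// switch the audio session category, then play
NativeMethods.MuteOtherSources(true);
audioSource.Play();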
For posterity: I wound up having to combine both of the methods you wrote above, plus using this video tutorial on creating Xcode plugins;
Setting the category on startup didn't seem to work, so I wound up using the 'mute other sources' bit you have and pasted that code in there. Then I used the plugin I created from that tutorial to call the function whenever I wanted to play a sound (i.e. whenever our app detects the Augmented Reality target).
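In case it helps anyone, this is roughly what my call site looks like - a sketch only, since the class and the two callback names are placeholders to be wired up to whatever "target found/lost" events your AR SDK exposes:

using UnityEngine;

public class TargetAudio : MonoBehaviour
{
    [SerializeField] private AudioSource audioSource;

    // Hook this up to the AR SDK's "target found" event
    public void OnTargetFound()
    {
        NativeMethods.MuteOtherSources(true); // SoloAmbient: our audio takes over
        audioSource.Play();
    }

    // Hook this up to the AR SDK's "target lost" event
    public void OnTargetLost()
    {
        audioSource.Stop();
        NativeMethods.MuteOtherSources(false); // back to Ambient
    }
}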
Also, I found out that you can stick .mm files in the Plugins/iOS/ folder and they will be copied into the Xcode project automatically.
Hm, putting the code in the AppController should actually work. For us it was necessary because when you disable Unity Audio, the checkbox Mute Other Audio Sources is ignored completely and other audio sources are always muted. This means that even if we set the correct category in an Awake method, the user has to go back to their podcast app (or whatever they were listening to) and resume playback, since iOS will have paused the other audio.
Anyway, congrats on getting it working - I remember my own hustle getting my first plugin working.
Hmm… I haven't run into interactions with other audio sources (such as a podcast) when using our app. I'll keep that in mind when we get to it.
Our app is largely event-based - as in, the user is physically at an event (like a convention or a street fair) looking at art through the camera, which then animates on screen. I'm not sure that conflict is something that will come up much in our use case.
Yep, good thing to keep in mind, but as long as you use normal Unity audio, the Mute Other Audio Sources checkbox works as expected. Only when you use an external FMOD or Wwise and want to get rid of the normal audio (since it is useless overhead) might you need to go down that path.