I’m not quite sure how voice allocation is supposed to work on the iPhone. I’ve used WAV files for my SFX in order to allow overlapping, simultaneous audio, but I don’t seem to be getting that. If a second audio event is triggered on the same instance of the AudioSource component, the first is terminated but the second doesn’t play. If it’s on a different instance, the first is terminated and the second takes over. This is exactly the behavior you’d expect if the audio were encoded as AAC or something similar. I must be missing something.
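To illustrate, the kind of trigger code I mean is roughly this (simplified; the clip names are just placeholders):

// Two SFX clips assigned in the Inspector (placeholder names)
var explosionClip : AudioClip;
var impactClip : AudioClip;

function TriggerExplosion () {
	// Both events go through the single AudioSource on this GameObject
	audio.clip = explosionClip;
	audio.Play();
}

function TriggerImpact () {
	// If this fires while the explosion is still playing, the explosion stops,
	// but on the iPhone the impact never seems to start
	audio.clip = impactClip;
	audio.Play();
}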
I don’t experience anything like that; are you using multiple distinct AudioSource components? Each one can only play one sound at a time, but you can add any number to a single GameObject.
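Also, for short SFX that just need to overlap, PlayOneShot layers a clip on top of whatever the AudioSource is already playing instead of cutting it off; a minimal sketch (the clip name is just an example):

var hitClip : AudioClip;

function PlayHit () {
	// PlayOneShot doesn't stop the currently playing clip, so repeated calls overlap
	audio.PlayOneShot(hitClip);
}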
–Eric
It only does it when it gets to the iPhone; it works fine in the Unity editor.
No idea then… I have several sounds playing simultaneously on the device as well as in Unity.
–Eric
Eric, I want to try a second AudioSource component on the same GameObject. How do I determine which AudioSource is used when I do ‘audio.clip =’, or do I need to create a separate child object to attach it to, with its own script?
Creating a separate child object is probably easier in some cases, but what I’m doing is basically this, with three AudioSource components on the one GameObject:
// Named slots for the AudioSource components, in the order they appear on this GameObject
enum Audio {Background1, Background2, FootSteps}

private var myAudio : AudioSource[];
var backgroundLoop : AudioClip;

function Start () {
	// GetComponents returns Component[], so convert it to a builtin AudioSource array
	myAudio = Array(GetComponents(AudioSource)).ToBuiltin(AudioSource);
	myAudio[Audio.Background1].loop = true;
	myAudio[Audio.Background2].loop = true;
	myAudio[Audio.Background1].clip = backgroundLoop;
	// etc.
}
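Then elsewhere in the script you pick the source by its enum index, so one-off effects don’t interrupt the background loops; a quick sketch, assuming a footstepClip variable:

var footstepClip : AudioClip;

function PlayFootstep () {
	// Use the third AudioSource for one-off effects; the background sources keep playing
	myAudio[Audio.FootSteps].clip = footstepClip;
	myAudio[Audio.FootSteps].Play();
}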
–Eric