Just wondering if anyone has found some guidelines for making audio formats/etc behave with Unity?
I’m primarily using Audacity to convert MP3s and WAVs to Ogg format to take advantage of compression, but sometimes I’m crashing Unity all over the place, and sometimes I have no problems at all. It’s frustrating because I can’t tell whether the issue is my approach to architecting the audio system in my game, something broken in Audacity’s Ogg encoder, Unity’s Ogg decoder, or somewhere else entirely.
Sometimes Oggs crash Unity hard on Windows, but not all of them, and not always.
I’m careful to export Ogg files as mono in Audacity, since I’m only working with sound FX at the moment. Their sample rates vary pretty wildly right now, but I try to keep them all 16-bit.
In Unity, the typical thing I’m doing is giving a script one or more public AudioClip variables and assigning a default clip to the AudioSource. Typically the AudioSource holds a repetitive, looping sound (like a vehicle engine), while the AudioClips are all one-shot sounds (usually invoked via PlayOneShot).
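For what it’s worth, here’s roughly what that setup looks like in one of my scripts (the class name, field name, and key binding are just placeholders for illustration, and `audio` is the built-in shorthand for the attached AudioSource):

```csharp
using UnityEngine;

// Sketch of the pattern described above: the AudioSource component
// holds the looping engine sound (assigned in the Inspector), while
// public AudioClip fields hold the one-shot effects.
[RequireComponent(typeof(AudioSource))]
public class VehicleAudio : MonoBehaviour
{
    public AudioClip hornClip; // one-shot Ogg, assigned in the Inspector

    void Update()
    {
        // One-shots go through PlayOneShot so they can overlap the
        // engine loop without interrupting it.
        if (Input.GetKeyDown(KeyCode.H)) // placeholder key
        {
            audio.PlayOneShot(hornClip);
        }
    }
}
```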
For repetitive sounds that are triggered by a key being held down, I’ll usually do something like:
if (Input.GetKey(KeyCode.W)) // whatever key drives the sound
{
    if (!audio.isPlaying)
    {
        audio.Play();
    }
}
else
{
    if (audio.isPlaying)
    {
        audio.Stop();
    }
}
Is there anything obviously wrong with my approach? Or, if it looks OK, any wisdom from those who’ve already “been there, done that” would be much appreciated.