So I can think of some ways to do this: create “Hearable Sound” components that play audio sources and tell some manager or entity that a sound was made at a certain position with a certain range, or use Physics.OverlapSphere…
But what I would really like is a way to do this that can be easily implemented in an already-started game. I do not want to go through all of my audio sources and change them, or add some component to each one.
Unity must already keep track of audio, because the AudioListener is aware of 3D spatial audio, so it seems there should be a way to just hook into that, or access that data. Unity only allows one AudioListener; is there a way to create virtual listeners?
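For reference, here is a minimal sketch of the Physics.OverlapSphere idea mentioned above. The component, its fields, and the `OnHeardSound` message name are all hypothetical, just one way to wire it up:

```csharp
using UnityEngine;

// Hypothetical "hearable sound" component: when its AudioSource plays,
// it queries for nearby listeners instead of routing through real audio.
public class HearableSound : MonoBehaviour
{
    public float audibleRange = 10f;
    public LayerMask listenerMask; // the layer your AI "ears" live on

    // Call this whenever the AudioSource on this object plays a clip.
    public void NotifyListeners()
    {
        Collider[] ears = Physics.OverlapSphere(transform.position, audibleRange, listenerMask);
        foreach (Collider ear in ears)
        {
            // SendMessage keeps this decoupled; "OnHeardSound" is a made-up method
            // name that any listening component could choose to implement.
            ear.SendMessage("OnHeardSound", transform.position, SendMessageOptions.DontRequireReceiver);
        }
    }
}
```

The downside is exactly what the question says: you would still need to touch every sound-making object to call `NotifyListeners()`.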
Tossing a SphereCollider on an “Audible” layer onto your audible effect prefabs is cheap, and detecting when it triggers any number of other SphereColliders you choose is cheap (Pythagoras would be proud). It also lets bees and owls and fish make noises all over your forest scene without alerting the assassins to irrelevant noises. And it still works if the player clicks “Mute” on the game options screen because their partner is sleeping nearby.
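A rough sketch of that setup, assuming you have created an “Audible” layer and configured the physics collision matrix so only the AI “ears” receive these triggers (all class and field names here are illustrative):

```csharp
using UnityEngine;

// Goes on the noisy effect prefab: a trigger sphere sized to the noise's audible range.
[RequireComponent(typeof(SphereCollider))]
public class AudibleNoise : MonoBehaviour
{
    void Reset()
    {
        SphereCollider range = GetComponent<SphereCollider>();
        range.isTrigger = true;
        range.radius = 8f; // tune per prefab: a gunshot carries farther than a footstep
        gameObject.layer = LayerMask.NameToLayer("Audible"); // a layer you create yourself
    }
}

// Goes on the assassin (or any listener): reacts when an audible trigger overlaps it.
public class AssassinEars : MonoBehaviour
{
    void OnTriggerEnter(Collider other)
    {
        // The layer collision matrix ensures only "Audible" colliders reach this.
        Debug.Log($"Heard a noise at {other.transform.position}");
    }
}
```

Because this is plain trigger physics, it runs whether or not any real audio is playing.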
In game design you don’t want reality, you want verisimilitude: the appearance of reality. Find the simplest way to get things done, both in design time spent and in run time spent.
Spend a little more time deciding some design rules and use prefabs for maximum effect. Few things in a Scene should be truly unique and outside a prefab; once you build things with prefabs and prefab variants, it’s pretty easy to identify and update the prefabs with all the necessary stuff, and that change propagates to everything else. Instead of adding an AudioSource to each enemy prefab, add an AngelDevelopmentSuperAudio prefab. Now that prefab can include code and properties like “is this important to characters?”, “is this an environmental noise?”, and “is this played on the sfx AudioMixer group?” in a central, manageable way.
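To make that concrete, here is a sketch of what such a central audio prefab component could look like. The class name echoes the joke above, and every field is an assumption about what your game might care about:

```csharp
using UnityEngine;

// Hypothetical central audio component, nested as a prefab inside each enemy prefab.
// Editing this one prefab propagates the change to every enemy that uses it.
public class AngelDevelopmentSuperAudio : MonoBehaviour
{
    [Header("Playback")]
    public AudioSource source;           // assigned once, on the prefab

    [Header("Gameplay semantics")]
    public bool importantToCharacters;   // should AI react to this sound?
    public bool environmentalNoise;      // bees, owls, fish...
    public float audibleRange = 10f;     // gameplay range, independent of audio falloff

    public void Play(AudioClip clip)
    {
        source.PlayOneShot(clip);
        if (importantToCharacters)
        {
            // Hook your chosen detection system here
            // (trigger collider, OverlapSphere, a noise manager, etc.).
        }
    }
}
```

The payoff is exactly the point made above: adding a new property later means editing one prefab, not every sound-making object in the project.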