Fabric in Unity

First off, I’m a relative newbie to Unity and a solo indie. I’ve been learning game development while working full-time overseas, and I’ve gotten to the stage where I want to do more with audio in my games. I’m working through the book Game Audio Development with Unity 5.X (though I’m actually using Unity 2017.3). The book is very good and I’m learning a lot. I haven’t gotten that far yet, but there are chapters discussing FMOD and Reaper, with integration into Unity.

I’ve also done some research online about game audio (with a focus on Unity) and discovered a Unity package called Fabric. I checked out the website and downloaded the Unity package, though I haven’t installed it yet as I want to finish the book first. I also found a couple of articles discussing game audio that recommended using an external DAW to create/mix audio files and then import them into Unity.

My question is: has anyone been using Fabric? If so, what has been your experience? From what I have learned so far, Fabric enhances the built-in Audio Mixer. As using external tools seems a bit overwhelming at the moment, I am wondering if I might be getting in over my head. If I use Fabric within my Unity projects, will I achieve results similar to using external tools? I have no aspirations of becoming an audio engineer or anything close; I just want to improve the music and sounds within my games. Any advice/comments will be greatly appreciated as I move forward in my learning. Thank you in advance.

Hi there,

There are actually plenty of audio enhancements available to bypass the shortcomings of the default Unity audio toolset, which is quite basic. This is by no means a poke at Unity; one tool can’t be all things. Unity’s audio system is basic in the sense that it is not easy to author the complex behavior required by more demanding games. Don’t confuse this with the Unity Mixer - that is actually quite well implemented. In fact, Fabric can be routed through the Unity Mixer or its own internal mixing system.

This is where a lot of developers get confused… what do I mean by behavior authoring as opposed to audio implementation?

Implementation is tying audio into game cues. Behavior authoring is designing the audio systems which respond to those cues. How do you get a random choice of sounds playing for footsteps? Can we use pitch, volume, and filter variation, or blend two sounds together, to create more variation with fewer assets? These things are the building blocks of more complex audio behavior. This is where middleware comes in. Middleware provides an interface to model audio through common building blocks of audio behavior - randomization, sequencing, priorities, exposed variables that change audio behavior, mixing rules, asset management, debugging tools, language localization, and more.
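To make that concrete, here’s a small sketch (in Python, purely illustrative - this is not Fabric’s or any middleware’s real API) of the footstep example: pick a random clip from a pool, avoiding an immediate repeat, and apply slight pitch/volume variation. All the names here are made up.

```python
import random

def choose_footstep(pool, last=None):
    """Pick a random clip from the pool, avoiding an immediate repeat."""
    candidates = [clip for clip in pool if clip != last] or pool
    return random.choice(candidates)

def varied_playback(clip):
    """Return playback settings with slight random pitch/volume offsets."""
    return {
        "clip": clip,
        "pitch": 1.0 + random.uniform(-0.1, 0.1),   # +/-10% pitch variation
        "volume": 1.0 - random.uniform(0.0, 0.2),   # up to -20% quieter
    }

footsteps = ["step_grass_01", "step_grass_02", "step_grass_03"]
last = None
for _ in range(4):
    last = choose_footstep(footsteps, last)
    settings = varied_playback(last)
```

Middleware gives you exactly this kind of behavior as drag-and-drop building blocks, so nobody has to write or maintain that code per sound.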

So whilst Fabric doesn’t have the authoring interface of, say, Wwise, FMOD, or CRI tools, the behavior you can model with it is quite similar to Wwise’s, but you will need to understand how it works.

Fabric, Wwise, and FMOD all work on an event-driven system. Rather than your game cue calling a sound directly, or calling a function you have written to randomize a choice of sounds, the audio event is like a mini script with a set of actions. Say, for example, you fire your weapon in your game. Traditionally you’d play Gunshot.wav… in these middleware systems you instead call an event - say Play_player_gunshot - and within that event it may do this:

  1. Choose a random gunshot from a pool of five.
  2. Layer in a random shell casing dropping to the ground, out of a pool of three, and randomize its volume and pitch by a parameter that you have defined.
  3. Set the audio state of the game to combat.
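Here’s a toy sketch of that “event as a mini script of actions” idea (Python, with invented names like `Play_player_gunshot` and `AudioEngine` - not Fabric’s, Wwise’s, or FMOD’s actual API). The game posts one event; the event itself carries the randomization, layering, and state change:

```python
import random

class AudioEngine:
    """Minimal stand-in for a middleware runtime."""
    def __init__(self):
        self.events = {}
        self.state = "explore"
        self.log = []          # stands in for actual sound playback

    def define_event(self, name, actions):
        self.events[name] = actions

    def post(self, name):
        for action in self.events[name]:
            action(self)

def play_random(pool, volume=1.0, pitch_var=0.0):
    """Action: play a random clip from a pool with optional pitch variation."""
    def action(engine):
        clip = random.choice(pool)
        pitch = 1.0 + random.uniform(-pitch_var, pitch_var)
        engine.log.append((clip, volume, round(pitch, 2)))
    return action

def set_state(state):
    """Action: switch the global audio state."""
    def action(engine):
        engine.state = state
    return action

engine = AudioEngine()
engine.define_event("Play_player_gunshot", [
    play_random([f"gunshot_{i}" for i in range(5)]),      # 1. random gunshot
    play_random([f"shell_{i}" for i in range(3)],
                volume=0.6, pitch_var=0.15),              # 2. layered shell casing
    set_state("combat"),                                  # 3. audio state change
])
engine.post("Play_player_gunshot")
```

The game code only knows about `engine.post("Play_player_gunshot")`; everything inside the event is authored and retuned without touching game code.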

This audio state of combat may turn down the ambient sound in the game, may change the music, and may also start a heartbeat sound… the combat music might be fed by a parameter counting the number of enemies around your player within a radius, so that with more enemies nearby it plays at a higher volume and with more instruments than it would with fewer. The health of the player is fed into another parameter, which determines how fast, loud, and high-pitched the heartbeat plays.

All this is happening because you fired your gun, while you’ve been feeding in the other parameters all along regardless.
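To illustrate the parameter-driven side (these are called RTPCs in Wwise terms), here’s a sketch of the health/heartbeat and enemy-count/music examples above. The mapping curves and thresholds are made up for illustration; in middleware you would draw these as curves rather than write code:

```python
def heartbeat_settings(health):
    """health in [0, 100]; lower health -> faster, louder heartbeat."""
    danger = 1.0 - max(0, min(health, 100)) / 100.0
    return {
        "rate_bpm": 60 + danger * 80,    # 60 bpm healthy, 140 bpm near death
        "volume": 0.2 + danger * 0.8,    # barely audible -> full volume
    }

def music_layers(enemies_nearby):
    """More enemies within the radius -> more instrument layers in the mix."""
    layers = ["pad"]                     # base layer always plays
    if enemies_nearby >= 1:
        layers.append("percussion")
    if enemies_nearby >= 3:
        layers.append("brass")
    return layers
```

The game just publishes `health` and `enemies_nearby` every frame; the audio system decides what to do with them.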

Most of the audio behavior is decoupled from the engineer needing to do anything. Imagine how much time it would take an engineer to code all of that behavior. And what happens if something isn’t needed, like the heartbeat? The engineer would have to disable it; if the music system changes in some way, the engineer would need to rework it. With middleware, this behavior is extracted and packaged so that a sound designer can author it without code.

If you don’t need this much complexity and you have a 2D side-scroller that really only needs randomization of volume/pitch of sounds played from a random pool, Wwise may be overkill, and something like MasterAudio (available on the Unity Asset Store) will give you that behavior. MasterAudio sits somewhere between the Unity audio system and something like Fabric / Wwise / FMOD.

Why I’d choose Fabric is that eventually you are going to be making more complex games, and what better way to learn than on a simple game now?

One advantage Fabric and MasterAudio give you is that, as they are native to Unity - i.e. written inside Unity - they will run on any platform Unity runs on without additional build settings.

So to answer your question: yes, Fabric will enhance and streamline your audio behavior and implementation, and it gives you results similar to external tools like Wwise / FMOD.

Yes, I have used Fabric for a major title: www.atlasreactor.com

We also have a Fabric community on Facebook.

There are some good online tutorials - perhaps a little dated, but the concepts of Fabric are still the same.

Yannis.
www.yannisbrown.com


Hi Yannis,
Thank you very much for your informative reply. It has given me a better understanding of things. As I said in my post, I’m pretty much a newbie to Unity. My primary focus at the moment is on 2D development, so my audio needs are simple. However, when going through the exercises in the book, I got to thinking about how the principles and techniques could be applied to 2D games. That’s what got me looking more into game audio. As you correctly pointed out, a stand-alone DAW is probably overkill for what I currently want to learn and do. When I finish the book, I think I’ll go ahead and install Fabric and work through some of the same exercises to see how it works and whether it will help me improve the audio for my simple games. I’ve been using music and sounds I’ve downloaded, but they just don’t seem to be the right fit. They work well, but I guess I just want to make my games more “Me”. Thanks again for your advice.