[RELEASED] AutoMusic : Procedural Music Generation

AutoMusic is a tool for procedurally generating music directly in Unity. Generates music in real-time, on device, on demand. Easy to set up & fully customisable to the musical needs of your project.

Store Link : AutoMusic - Procedural Music Generation | Audio | Unity Asset Store

Need your game to have endless hours of music, without taking a lifetime to create? Got you covered

Want the music in your game to continually evolve within the vibe you set, without having gigabytes of your build dedicated to it? Not a problem

Want to be able to combine pre-made loops with procedurally generated content? Yep, here you go

Want the music in your game played back as separate instruments to allow for dynamic gameplay interaction? Easy

Want to dynamically modify core attributes such as musical key, or BPM? Peasy

Want a music system with a straightforward framework, allowing you to add your own unique instruments? Lemon Squeezy

Whether you're a seasoned music pro, or a game developer who would rather just press play, this system contains tools & instrumentation for procedurally generating music directly in Unity in real-time, on device, on demand.

The package is fully documented & source code is provided. All included instruments are written in a straightforward & readable manner that makes it possible for you to further expand the system for your own purposes.

This is the most fully featured music generation system available on the asset store, and is still actively developed with more features to come.

Store Link : AutoMusic - Procedural Music Generation | Audio | Unity Asset Store

This tool is designed with long-form music creation in mind, so here's a 4 hour and 46 minute long demonstration.
This is what the tool sounds like with these input samples/settings : you can make it sound completely different.

Really cool! Just purchased to give it a shot. I sent an email on this as well, but: is there an easy way to raise an event when a specific sound module gets played per the procedural music settings? And is there an option to free-play a sound module and have it fit into the procedural music based on the settings in the harmonic hub?

Hi,
Just found your email, here's my response pasted in case it's of interest to others :

I'm not quite sure what you mean by 'free play a sound module'? Happy to help on that if you can clarify. Perhaps you're aiming to 'inject' additional notes into the stream, in which case you could either do that through the relevant PlayScheduled method (the correct one depends on your instrument's outputMode) to play the sound asap / in free time, or you could manually call the AddNoteToBeatInstance() method to get a note into the queue in musical time.
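To illustrate the injection route, here's a minimal sketch. AddNoteToBeatInstance() is the real method name, but the parameters shown (and which object the method lives on) are placeholders I've used for the example, so check the actual signature in the source before copying this:

```csharp
using UnityEngine;

// Minimal sketch only : the AddNoteToBeatInstance() parameters below are
// placeholders, not the real signature - check the method in the source.
public class NoteInjector : MonoBehaviour
{
    public SoundModule targetModule; // the instrument that should receive the note

    // Queue an extra note so it lands half a beat after the next beat boundary,
    // i.e. in musical time rather than free time.
    public void InjectNote(int pitch)
    {
        // Hypothetical parameters : pitch + a noteOffset as a fraction of a beat.
        targetModule.AddNoteToBeatInstance(pitch, 0.5f);
    }
}
```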

For raising an event when an instrument is played : I think I can see why your timing would be off.
BeatCore gets run at the start of each musical beat, but it processes notes in 'chunked' time : a whole beat's worth of notes gets processed at once, so you also need to assess the 'noteOffset' to get the true timing.
The 'noteOffset' is recorded as a fraction of a beat, so if the noteOffset is 0.5, that means the note should be played at masterClock.nextBeatStartTime + (masterClock.beatLength * 0.5).
If you look in MasterClock.cs, there's a method called PlayScheduledDirectSound() : this is the logic the audio system uses to trigger sounds in musical time, and you should be able to copy the same logic to spawn events/objects synced to the same clock. It's only really the startTime logic in that method that is relevant here; the rest of the parameters are more sound-generation specific.
You might need to account for the global latency value (or other temporal offsets in your game / rendering) in your scene master clock to get an absolute sync.
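To make that concrete, here's a rough sketch of spawning a gameplay event at a note's true time. nextBeatStartTime and beatLength are the fields described above; the globalLatency field here is just my shorthand for the global latency value, so match it to whatever your MasterClock actually exposes:

```csharp
using UnityEngine;

// Rough sketch : fire a gameplay event at a note's true musical time.
// Assumes MasterClock exposes nextBeatStartTime / beatLength (as above),
// both measured on the audio clock (AudioSettings.dspTime).
public class BeatEventSpawner : MonoBehaviour
{
    public MasterClock masterClock;
    public double globalLatency;   // placeholder for the clock's global latency value
    double pendingEventTime = -1.0;

    // Call this when a note is processed, passing its noteOffset (fraction of a beat).
    public void ScheduleEvent(float noteOffset)
    {
        pendingEventTime = masterClock.nextBeatStartTime
                         + (masterClock.beatLength * noteOffset)
                         + globalLatency;
    }

    void Update()
    {
        // Compare against the audio clock, not Time.time, to stay in sync.
        if (pendingEventTime > 0 && AudioSettings.dspTime >= pendingEventTime)
        {
            pendingEventTime = -1.0;
            SpawnSyncedEffect();
        }
    }

    void SpawnSyncedEffect()
    {
        Debug.Log("Beat-synced event at dspTime " + AudioSettings.dspTime);
    }
}
```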

What you're trying to do sounds super cool and I hope AutoMusic can help you get there! On the schanuzercorp YouTube channel I have some visualised videos where objects/fx are being spawned in time with beats, which is done with essentially the same process as described above but without the gameplay impact.

Here's a short demo video highlighting a feature recently added to AutoMusic : Realtime synths !
The synths receive procedurally generated note data (as with all instruments in AutoMusic), but they can have all of their parameters modulated in real-time by your game/player.
The synths are written in straightforward C#, so you can create your own to add to the system, either from scratch or by repurposing components (oscillators/filters/envelopes/etc) that are already included.
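As a quick illustration of what runtime modulation can look like (the AutoMusicSynth type and filterCutoff field here are placeholder names, not the actual API), this is the general shape of driving a synth parameter from gameplay:

```csharp
using UnityEngine;

// Illustration only : AutoMusicSynth / filterCutoff are placeholder names.
// The point is that synth parameters are plain fields you can drive per-frame.
public class SynthModulator : MonoBehaviour
{
    public AutoMusicSynth synth;   // placeholder type for a realtime synth module
    public Transform player;
    public float maxDistance = 20f;

    void Update()
    {
        // Example mapping : open the filter as the player gets closer.
        float distance = Vector3.Distance(player.position, transform.position);
        float t = 1f - Mathf.Clamp01(distance / maxDistance);
        synth.filterCutoff = Mathf.Lerp(200f, 8000f, t); // cutoff in Hz
    }
}
```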

New update coming shortly : it includes bug fixes, new features, and usability improvements

The latest update, version 1.5, is now live with new features, fixes, and improvements. This is a big release & also includes some work in the architecture that will make some other cool new features possible.

Changes:

1.5.0
-NEW FEATURE : Sidechain Compression
-NEW FEATURE : 'Filter Curve' option for Sound Modules : Filter Curves are customisable LFO-like automations that can create tempo-synced filter effects
-NEW FEATURE : play varying audio clips based on random chance or velocity (implemented for kick, snare, & ghost SoundModules)
-NEW FEATURE : DirectSound modules now have much more complete 3D sound properties (Enable 'DirectSoundSpatialisation' in FXHub)
-Added new mode to ModuleModifier randomisation : PooledSequence.
-Added component to force CompositionLayouts to a target layout index after a given time : CompositionLayoutSafetyNet
-Made FXHub compulsory (auto-adds by CompositionHub if not found)
-Fixed bug in FXHub that stopped it from responding to GUI changes while playing
-Fixed bug with looped sounds not correctly triggering when they are set to play at the start of a section
-Fixed bug with stereo audioclips loading the data for the left channel into both sides (much better sound quality)
-Tweaked velocity application for better volume response
-Tooltips added to many many UI controls
-Made it much harder to accidentally mute your whole system in a way that's hard to debug, by reworking how the CompositionHub & FXHub interact
-Fixed bug with loop player not tempo-syncing in DirectSound mode (and made loopPlayer code cleaner)

AutoMusic is now capable of playing MIDI files alongside the existing procedural generation & synced audio loops !
MIDI VIDI
Asset Store

Has it been tested on Meta Quest?

Just wanted to leave one more piece of verbal praise here -
the tracks/background soundscapes it can generate are really next level.
I just built one of the sample scenes as-is (no camera, blank screen, only the audio mixer running, no user controls like volume, play/stop etc…) as a desktop windowed app and left it running with its background music. It's really awesome :+1:

I haven't personally tested it on Quest but I know other users have.
'DirectSound' mode on the modules (instruments), plus enabling 'DirectSound spatialisation' in the FXHub, should give the best VR experience

Thank you! It does continue to surprise me with what it can create, and the more you work into the moduleModifiers / compositionHub settings, the more variation can arise

I can confirm that it works on Quest.

Thanks, looking forward to exploring whatā€™s possible, love how youā€™ve made it easy to do that!

I'm very excited to give this a go! But before I do, can you clarify the 3D spatial capabilities of this tech? On the presumption that its output can be heard from a 3D audio source (like a speaker in the scene), can the output be split into multiple 3D "speakers" in a scene - perhaps by instrument?

Hi, the final audio gets output through the Unity Mixer, so you should have access to the same spatialisation controls (make sure to enable 'DirectSoundSpatialisation' in the FXHub object):
Set the position of your AudioListener, and individual instruments will be spatialised in relation to that object : so if you wanted (for example) a particular synth sound to play as if it's coming from a model of an amplifier in a corner of the room, place the SoundModule object for that synth instrument at the same position as the amplifier model.
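In practice that can be as simple as pinning the SoundModule to the prop; a minimal sketch (assuming DirectSoundSpatialisation is enabled as above):

```csharp
using UnityEngine;

// Minimal sketch : pin an instrument's SoundModule to a scene prop so its
// spatialised audio appears to come from that prop. Assumes DirectSound
// spatialisation is enabled in the FXHub as described above.
public class PinModuleToProp : MonoBehaviour
{
    public Transform soundModule;   // the SoundModule for the instrument
    public Transform amplifierProp; // the model the sound should come from

    void LateUpdate()
    {
        soundModule.position = amplifierProp.position; // follows even if the prop moves
    }
}
```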

I'm really loving the asset but having a hard time getting Injectables to work - particularly the MIDI to Beat functionality. Is there a video tutorial out there to help those of us who are just starting out? Thanks!

Hi juriggs, sorry to hear that MIDI to Beat is giving you issues. Have you taken a look at the 'MIDI_Demo' scene? The bass and lead modules in there are set up with MIDI to Beat.
There are a few inputs required on the MIDI component itself, like linking in the MasterClock and adding a MIDI file. Then on your instrument module, make sure you have 'use injectable source' enabled, and the MIDI module linked into the slot directly below that button.

The debug ref section in the MIDI to Beat component will show you how many notes have been found in the MIDI file channels being read, so if you are seeing a zero in there, perhaps there's an issue with the MIDI file or channel setup - test using one of the included MIDI files to see if you can get notes to be read in from those.