RhythmTool - Music Analysis for Unity


Demo | Video | Documentation | Asset Store

RhythmTool is a straightforward scripting package for Unity, with all the basic functionality for creating games that react to music.

RhythmTool analyzes a song without needing to play it at the same time. It can analyze an entire song before playing it, or while it is being played.

It provides several types of data:
• Beats
• Pitch
• Onsets
• Changes in overall intensity
• Volume

This data can be used in various ways and is provided through an easy-to-use asset and event system.
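A typical setup looks roughly like the sketch below. RhythmTool and RhythmEventProvider are the actual component names, but the audioClip property and the Onset event shown here are assumptions based on this thread and may differ between versions.

using UnityEngine;

// Rough sketch of a typical setup. The audioClip property and the Onset
// event (and its signature) are placeholders; check the documentation for
// the exact API of your version.
public class SongReactor : MonoBehaviour
{
    public RhythmTool rhythmTool;               // analyzes the song
    public RhythmEventProvider eventProvider;   // forwards results as events
    public AudioClip song;

    void Start()
    {
        rhythmTool.audioClip = song;            // load the song for analysis
        eventProvider.Onset += OnOnset;         // placeholder event name
    }

    void OnOnset(float strength)                // placeholder signature
    {
        // React to the music here: spawn an object, flash a light, etc.
        Debug.Log("Onset with strength " + strength);
    }
}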

RhythmTool is designed to analyze and sync songs with a known length. Unfortunately, it is not possible to analyze a continuous stream of audio, such as a web stream or microphone input.

Questions and feedback
Any questions, feedback or features you would like to see? Post them here or send me an email at tim@hellomeow.net


Good news everyone,

In the last few weeks I have been able to make some big improvements to the tempo detection. Both the tempo detection and the synchronization with the song are much more accurate, and it will also be more straightforward to use the tempo detection results. This really needed improving.

The next update will bring a lot of changes and improvements:

  • Improved tempo detection
  • Improved the way tempo detection data is stored and how it can be used
  • Added BPM calculation
  • Analysis results can now be saved and loaded
  • RhythmTool is now derived from MonoBehaviour
  • Added OnReadyToPlay and OnEndOfSong messages
  • Renamed variables and methods to make more sense

I’m currently cleaning up the code and rewriting the documentation. ETA is 2-3 weeks.


Version 1.6 has been released!

Hi,

Sorry, I already found the original thread and the documentation :)

  • deleted stupid questions

Thanks!

Just a heads up,

RhythmTool appears to work perfectly fine with Unity 5, but the Examples have a small issue where the music is too quiet or inaudible. This is because Unity 5 doesn’t convert Unity 4’s 2D sounds correctly, essentially turning them into 3D sounds.

To solve this, just turn the “Spatial Blend” slider all the way to 0 for the AudioSources in the affected example scenes.
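If you’d rather not adjust the scenes by hand, a small script like this should also do it. It’s just a sketch that resets every AudioSource in the scene; AudioSource.spatialBlend is the standard Unity 5 property behind the “Spatial Blend” slider (0 = fully 2D, 1 = fully 3D).

using UnityEngine;

// Forces every AudioSource in the scene back to 2D sound.
public class ForceAudio2D : MonoBehaviour
{
    void Awake()
    {
        foreach (AudioSource source in FindObjectsOfType<AudioSource>())
            source.spatialBlend = 0f;
    }
}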

Hi HelloMeow, this may be what I’m looking for. I’m effectively trying to create a metronome that has a looped beat, play samples on top of that at set times, and judge whether the person playing the game hit those times accurately or not.

Would this be able to help with that workflow? It doesn’t need to automatically calculate the BPM, as I would do that, but I would need it to ‘start’ detection after an offset (e.g. if the song has an intro that should not be counted).

I don’t think it will. This is mainly for analyzing songs on the fly.

Hi @HelloMeow !

I just discovered your package and am reeeeally interested in its BPM feature for a future project, so I’m gonna bother you with a couple of questions, if I may.

  • If I understood correctly, BPM is calculated in real time while the song plays. Is it possible for RhythmTool to parse bigger chunks of the song at startup (even if it takes some time) and return a more accurate BPM? This would also allow using RhythmTool in a “lighter” mode, without having to analyze anything beyond the initial pass.
  • Out of curiosity, why no event system (or maybe I just missed it)? It would be nice to have an “OnBeat” (and so on) event that one could listen to, which would also prevent the “current beat for multiple frames” issue you mention in the docs.

Cheers,
Daniele

Hi,

It’s possible to analyze the entire song at startup by just checking an option in the inspector. RhythmTool doesn’t just calculate the BPM, it also tracks beats.

Technically it would be possible to detect the BPM by analyzing a small part of a song, but not out of the box and not without making some changes. And if you do, you won’t have the data for when the beats actually occur.

It doesn’t use events, mostly because I haven’t needed them for my own games. It’s on my list though.

Thank you!

Hi @HelloMeow !

I’ve been looking for this for a looong time now. This is well worth the price, thank you for this!

I just want to know if you provide tutorials that break down how to use RhythmTool, such as matching beats against the player’s input (e.g. Guitar Hero).

If you do, that would be so cool, because I’ve been studying this tool for a while now and I think the documentation doesn’t cover how to use the beats before, during, and after they arrive. Or if it does, I’m having a hard time understanding it. :(
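To give a concrete idea of what I mean, this is roughly what I’m trying to build. It’s just a sketch, not actual RhythmTool API; I’m assuming I can get the beat times in seconds from the analysis somehow.

using System.Collections.Generic;
using UnityEngine;

// Sketch of the kind of hit judgement I have in mind (Guitar Hero style).
// "beatTimes" is assumed to come from the analysis; the names here are
// placeholders, not RhythmTool's actual API.
public class HitJudge : MonoBehaviour
{
    public List<float> beatTimes = new List<float>(); // beat positions in seconds
    public float hitWindow = 0.1f;                    // allowed error in seconds

    public bool JudgeHit(float inputTime)
    {
        foreach (float beatTime in beatTimes)
        {
            // Compare the player's input time against each beat.
            if (Mathf.Abs(inputTime - beatTime) <= hitWindow)
                return true; // hit
        }
        return false; // miss
    }
}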

Best Regards,
Josh

@HelloMeow
Fantastic asset. I’ve downloaded it and imported it into my Unity project.
I’m wondering if there is a way to take the data the script picks up and apply it to the transforms of a set of objects, kind of like an audio visualizer but without instantiating objects.
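Conceptually I mean something like this. It’s only a sketch; how the intensity value would actually be read from RhythmTool’s data is the part I’m unsure about.

using UnityEngine;

// Conceptual sketch: drive an existing object's scale from a per-frame
// intensity value instead of instantiating new objects. Where that value
// comes from in RhythmTool is left open here.
public class ScaleWithMusic : MonoBehaviour
{
    public Transform target;
    public float smoothing = 8f;

    // Call this each frame with whatever intensity/magnitude value you have.
    public void SetIntensity(float intensity)
    {
        Vector3 targetScale = Vector3.one * (1f + intensity);
        target.localScale = Vector3.Lerp(target.localScale, targetScale, Time.deltaTime * smoothing);
    }
}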

Hi @HelloMeow !
Thanks for a good asset

I have a question.
If a song’s tempo is over 180 BPM, the detected BPM ends up halved at 90. How can I solve this?

Hi,

Unfortunately that’s a limitation of the method used to find the most likely beat length. It’s limited to a certain range, which is between 80 and 160 BPM. I’m still trying to improve this.
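As a workaround in the meantime, if you already know roughly which range the real tempo should fall in, you can fold the detected value back into that range yourself. This is a minimal sketch that only corrects the number; it doesn’t change the tracked beats.

// Fold a detected BPM into an expected range by doubling or halving it.
// The tracked beat positions stay the same; only the reported value changes.
float FoldBpm(float detectedBpm, float minBpm, float maxBpm)
{
    float bpm = detectedBpm;
    while (bpm < minBpm)
        bpm *= 2f;
    while (bpm > maxBpm)
        bpm *= 0.5f;
    return bpm;
}

For example, FoldBpm(90, 120, 240) returns 180.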

Thanks for the reply.
I’m looking forward to the update!

@HelloMeow I tried out the demo and it was pretty awesome; I was getting solid onsets and beats for the several songs that I tried. My only question before purchasing is about the number of bands. Am I limited to just the low, mid, high and beats, or is that simply the setup that gives the best detection?

I need to use between 12 and 24 bands, with 16 probably being the average, so at 16 bands would I still be able to get solid detections?

Hi!

There are 4 default analyses with different frequency ranges for onset detection. These ranges are somewhat arbitrary, but they appeared to give the most useful results. You can change these ranges, or add more analyses with different ranges, although I’m not sure how that will affect the results.

As a heads up to anyone else, it’s pretty easy to set up analyses for bands besides the defaults. The onsets I get line up pretty well with what I expect and hear. I haven’t tested using more bands yet, but I doubt the performance would drop very much. Solid purchase.

// Converts a frequency in Hz to an index into the spectrum data.
// Assumes a spectrum of 1024 bins covering 0 to half the output sample rate.
int FrequencyToSpectrumIndex (float f) {
    var i = Mathf.FloorToInt (f / AudioSettings.outputSampleRate * 2.0f * 1024);
    return Mathf.Clamp (i, 0, 1023); // clamp to the last valid bin index
}

// Adds one analysis per band, each spanning half an octave below and above
// its center frequency (bandwidth factor of sqrt(2) ≈ 1.414).
void SetAnalyses() {
    float[] frequencies = new float[]{ 31.5f, 63, 125, 250, 500, 1000, 2000, 4000, 8000, 12500, 16000, 20000 };
    float bandwidth = 1.414f;

    for (var i = 0; i < frequencies.Length; i++) {
        int a = FrequencyToSpectrumIndex (frequencies[i] / bandwidth);
        int b = FrequencyToSpectrumIndex (frequencies[i] * bandwidth);
        rhythmTool.AddAnalysis (a, b, i.ToString());
    }
}

RhythmTool 2.0 has been released. This new version brings a number of improvements and big changes:

  • RhythmTool, RhythmEventProvider and the documentation have been rewritten from scratch
  • Removed RhythmTool.IsBeat and RhythmTool.IsChange. Use ContainsKey or TryGetValue on the beats and changes collections instead
  • Replaced RhythmTool.NewSong() with RhythmTool.audioClip
  • Renamed RhythmTool.calculateTempo to RhythmTool.trackBeat
  • Renamed RhythmTool.preCalculate to RhythmTool.preAnalyze
  • Renamed RhythmTool.storeAnalyses to RhythmTool.cacheAnalysis
  • Added RhythmTool.Reset and RhythmEventProvider.Reset, events that occur when a song has restarted or a new song has been loaded
  • RhythmEventProvider now needs to be given a RhythmTool Component instead of using all RhythmTool components in the scene
  • RhythmEventProvider no longer uses UnityEvents. Instead it uses C# events, for a more consistent API. In previous versions, C# events and UnityEvents were used in different cases. This means the events are no longer available in the editor

If you’re using a previous version and plan to use version 2.0, you will need to keep an eye on the following:

  • Use RhythmTool.audioClip instead of RhythmTool.NewSong()
  • Use the new C# events in RhythmEventProvider instead of the old UnityEvents
  • Give every RhythmEventProvider a RhythmTool component to use for its events

All the examples and the optional AudioImporter have been updated as well.
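To give a rough idea of what migrating looks like in code, here’s a sketch. The Reset event signature and the comments about the beats and changes collections are illustrative; check the updated documentation and examples for the exact API.

using UnityEngine;

public class MigrationExample : MonoBehaviour
{
    public RhythmTool rhythmTool;             // the RhythmTool component to use
    public RhythmEventProvider eventProvider; // must be given a RhythmTool for its events
    public AudioClip myClip;

    void Start()
    {
        // 1.x: rhythmTool.NewSong(...). 2.0: assign the clip directly.
        rhythmTool.audioClip = myClip;

        // 1.x: UnityEvents. 2.0: plain C# events, e.g. the Reset event
        // (shown here without parameters; the actual signature may differ).
        eventProvider.Reset += OnReset;
    }

    void OnReset()
    {
        // Occurs when the song has restarted or a new song has been loaded.
    }

    // Instead of the removed IsBeat/IsChange, use ContainsKey or TryGetValue
    // on the beats and changes collections.
}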

Hi @HelloMeow ,

I’ve bought RhythmTool and integrated it into my game, but I’m running into a couple of issues. The game uses a short song that loops, as opposed to a single longer song that starts and ends. I’m detecting the end of the song and then simply calling Play again to loop.

One issue is that the beat events from the RhythmEventProvider only seem to get sent until the song loops; after that, they are never sent again.

Another thing is that I’m relying on BeatTime() to calculate a normalized beat time (to synchronize GameObjects), and that seems to work fine even as the song loops. However, there’s a delay right when it loops, where I’m assuming the song is being re-analyzed by RhythmTool, despite my having the “Cache Analysis” option turned on. Eventually the tracking “catches up” to the playing song and works fine, until it loops again. When it loops, shouldn’t it just reset its internal tracking indices to 0 and reuse the existing analysis if the song hasn’t changed?

Thanks!