3D Wizard (realtime MIDI visualization)

I would like to show you all my first ever completed Unity project, the 3D Wizard (although I’ve been working on other projects for several months, so I’m not entirely new to Unity).

This program was made for keyboardist Jordan Rudess of progressive metal band Dream Theater, for use in live shows. It was originally coded in DarkBASIC for Dream Theater’s 2009 world tour, but when I needed to update it for the upcoming 2011 tour, I decided to port it over to my current engine of choice, Unity (Indie). The port ultimately proved easier than I expected, and took about a month in total.

The program is used live as a realtime visualization of MIDI data. It takes in MIDI from the keyboard, and a 3D wizard character mimics the playing of the piano. The video output is then sent to the video director for use on the large projection screen behind the band at selected points during the show. So basically, the keyboardist plays throughout the show, and the program’s character plays along on the big screen.
All camera angles are automatic and driven by the performance: the program chooses an angle based on what is being played and where on the keyboard it is being played, and adjusts the shot length dynamically based on the playing speed.
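
To give a rough idea of the shot-length part of the camera logic, here is a simplified sketch. This is not the actual project code; the note counting, the "10 notes per second counts as fast" threshold and the preset camera list are just illustrative assumptions.

```csharp
using UnityEngine;

// Simplified sketch only -- not the actual project code.
// Idea: hold shots longer during slow passages, cut faster during fast playing.
public class AutoCamera : MonoBehaviour
{
    public Camera[] cameras;            // preset angles around the wizard (assumed setup)
    public float minShotLength = 2f;    // seconds, used when playing is very fast
    public float maxShotLength = 8f;    // seconds, used when playing is slow

    private int notesThisShot;
    private float shotTimer;
    private float currentShotLength = 4f;
    private int currentCam;

    void Start()
    {
        // Enable only the starting camera.
        for (int i = 0; i < cameras.Length; i++)
            cameras[i].enabled = (i == currentCam);
    }

    // Called by the MIDI input code whenever a note-on arrives.
    public void NoteOn(int note, int velocity)
    {
        notesThisShot++;
    }

    void Update()
    {
        shotTimer += Time.deltaTime;
        if (shotTimer < currentShotLength)
            return;

        // More notes per second => shorter next shot.
        // Treating ~10 notes/sec as "fast" is an assumption for this example.
        float notesPerSecond = notesThisShot / currentShotLength;
        float speed = Mathf.Clamp01(notesPerSecond / 10f);
        currentShotLength = Mathf.Lerp(maxShotLength, minShotLength, speed);

        // Cut to a different preset angle.
        cameras[currentCam].enabled = false;
        currentCam = (currentCam + Random.Range(1, cameras.Length)) % cameras.Length;
        cameras[currentCam].enabled = true;

        shotTimer = 0f;
        notesThisShot = 0;
    }
}
```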

There are no pre-made animations. All animation is procedurally generated on the fly from the MIDI input. This required coding some basic Inverse Kinematics (IK) so the program can place the hands at whichever piano keys it needs to. The code for the character is about 3000 lines long, most of which is dedicated to deciding which hand and fingers are used for each note. Notes are assigned to a hand based on the current location of each hand on the keyboard, and to fingers based on the notes currently pressed and the intervals between them. The program also responds to velocity, i.e. how hard the notes are being played. The facial expressions, body movements and head movement are all based on the current playing style.
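
As a rough illustration of the hand assignment described above, here is a stripped-down sketch: each incoming note goes to whichever hand is currently closer, unless that hand can’t reach. The thresholds and the "drift the hand toward the note" behaviour are assumptions for the example, not the project’s actual code (which also has to handle fingering, chords and velocity).

```csharp
using UnityEngine;

// Stripped-down sketch of hand assignment -- thresholds and behaviour are illustrative only.
public class HandAssigner
{
    public float leftHandCenter = 48f;   // MIDI note the left hand currently hovers over
    public float rightHandCenter = 72f;  // MIDI note the right hand currently hovers over
    public float maxReach = 9f;          // assumed comfortable span in semitones

    // Returns true if the note should go to the right hand, false for the left.
    public bool AssignToRightHand(int note)
    {
        float distLeft = Mathf.Abs(note - leftHandCenter);
        float distRight = Mathf.Abs(note - rightHandCenter);

        // Prefer the closer hand.
        bool useRight = distRight <= distLeft;

        // If the preferred hand can't reach but the other one can, swap.
        if (useRight && distRight > maxReach && distLeft <= maxReach) useRight = false;
        if (!useRight && distLeft > maxReach && distRight <= maxReach) useRight = true;

        // Drift the chosen hand toward the note so nearby follow-up notes prefer it too.
        if (useRight) rightHandCenter = Mathf.Lerp(rightHandCenter, note, 0.5f);
        else leftHandCenter = Mathf.Lerp(leftHandCenter, note, 0.5f);

        return useRight;
    }
}
```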

The graphics mostly mix and match Unity’s built-in shaders. The main wizard character uses the built-in lit toon shader, although the outline is pre-made in 3DS Max rather than done via shader, as that achieved a better quality extrusion. Most of the rest of the scene uses a combination of lightmapping (pre-rendered in 3DS Max, not inside Unity) and cube-mapped reflections. The scene is topped off with some smoke particle effects and lens flares to add some ambience and contrast.

The program will be used throughout Dream Theater’s tour, which started in Rome on July 4th. If anyone really wants, I could post some blurry cell phone quality videos of the program in use live, but I think the above video does a better job. :wink:

Here are a couple of screen grabs from the videos, although these don’t show a lot of the details. It looks better in motion. :slight_smile:

I hope some of the people using Unity for MIDI/musical purposes find this interesting.

very nice

haha cool

Really really nice job!

Thanks. It’s in HD, so be sure to check it out at 720p. Especially after it took me an hour to upload it. :smile:

Looks sweet!

hehehe Jordan Rudess is great :stuck_out_tongue:

Very awesome! I like the wizard :slight_smile:

Very cool indeed. BlobVanDam, I’ve sent you a PM. Would love to hear from you.

Thomas P.

Holy balls! This looks awesome.

Mind sharing how you got the MIDI data from the keyboard into Unity?

That is amazing!

I adapted the Wiimote OSC code from this thread-
http://forum.unity3d.com/threads/21273-OSCuMote-Wiimote-support-for-the-free-version
I use GlovePIE to convert the MIDI messages to OSC messages that the script can read.
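
For anyone wondering what receiving those OSC messages in Unity can look like, here is a minimal sketch of a UDP listener that parses a simple OSC note message. The port and the "/midi/note" address pattern are assumptions that would have to match whatever the GlovePIE script actually sends, and OSC bundles aren’t handled; it’s only meant to show the general approach, not the project’s actual code.

```csharp
using System.Net;
using System.Net.Sockets;
using System.Text;
using UnityEngine;

// Minimal sketch of listening for OSC note messages (e.g. sent by GlovePIE) over UDP.
// The address pattern "/midi/note" and the port number are assumptions.
public class OscNoteReceiver : MonoBehaviour
{
    public int port = 8000;   // must match the port the OSC sender uses
    private UdpClient udp;

    void Start()
    {
        udp = new UdpClient(port);
    }

    void Update()
    {
        // Drain any packets that arrived since the last frame.
        while (udp.Available > 0)
        {
            IPEndPoint remote = new IPEndPoint(IPAddress.Any, 0);
            byte[] packet = udp.Receive(ref remote);
            ParseOscMessage(packet); // note: OSC "#bundle" packets are not handled here
        }
    }

    void ParseOscMessage(byte[] data)
    {
        int pos = 0;
        string address = ReadOscString(data, ref pos);   // e.g. "/midi/note"
        string typeTags = ReadOscString(data, ref pos);  // e.g. ",ii" for note + velocity

        if (address == "/midi/note" && typeTags == ",ii")
        {
            int note = ReadOscInt(data, ref pos);
            int velocity = ReadOscInt(data, ref pos);
            Debug.Log("Note " + note + " velocity " + velocity);
            // Feed note/velocity into the animation code from here.
        }
    }

    // OSC strings are null-terminated and padded to a multiple of 4 bytes.
    static string ReadOscString(byte[] data, ref int pos)
    {
        int start = pos;
        while (data[pos] != 0) pos++;
        string s = Encoding.ASCII.GetString(data, start, pos - start);
        pos += 4 - (pos % 4); // skip the null terminator and padding
        return s;
    }

    // OSC ints are 32-bit big-endian.
    static int ReadOscInt(byte[] data, ref int pos)
    {
        int value = (data[pos] << 24) | (data[pos + 1] << 16) | (data[pos + 2] << 8) | data[pos + 3];
        pos += 4;
        return value;
    }

    void OnDestroy()
    {
        udp.Close();
    }
}
```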

If you’re actually interested in programming some MIDI inside Unity yourself, I could share a stripped down project of my basic MIDI input.

Yes! That would be most awesome! :slight_smile: Although I’m not trying to get it to work with a keyboard. Instead it should use the data from a MIDI file. But I figure this will only be a small change from your code.

Thank you very much! :slight_smile:

I actually recorded my video with a MIDI file. It didn’t require any modification to the Unity code, but it did require minor changes to the GlovePIE script, and I believe I also needed to install MIDI-OX to manage it. This method still relies on the MIDI file being played back separately in another program, and doesn’t give Unity any control over the MIDI playback, so it might not be what you’re looking for.

Someone else has managed some form of MIDI file reading within Unity here-
http://forum.unity3d.com/threads/71170-Realtime-Animusic-like-project-2-scenes-still-in-early-stages?highlight=animusic

If you still want a look at my code, let me know and I’ll post it. I just didn’t want to type up all of the instructions for it if the above link was a better alternative for you. :slight_smile:

Ah awesome! Thank you very much :slight_smile:

I’ll check out the code from that other thread, and maybe it’s just what I need, as I want to do everything in Unity without any external stuff. If possible, of course. There is also this MIDI plugin for $40 on the Asset Store: MIDI Unified 5.0b1 | Audio | Unity Asset Store. Not sure though how useful it will be.

I’ll try working with the other code and let you know how it goes. :slight_smile:

Thanks for posting this and providing the videos!!

Very clever!

Great stuff, really well done. I’ve been interested in this sort of thing for a decade but I never got very far, and in the last year I have switched to using Unity and am starting to get some results.

I’ve used both the OSC approach (using some Max4Live stuff in Ableton to turn MIDI into OSC), and in recent days I bought that Asset Store MIDI plugin to do it directly with MIDI. The plugin isn’t too bad, and the MIDI file player part (that they call the sequencer) is helpful during testing. The PlayMaker stuff they have included with the plugin needs a bit of work, e.g. I ended up modifying one of the actions so that I can actually turn note velocity into a PlayMaker variable, but this only required a couple of lines of code.

I’ve some way to go before I can nail character animation, so I’ve been keeping it simple for a start. So far I’ve been having fun by creating a bunch of softbody ‘airbags’ with the pressure of the bags affected by note velocity. Hope to have some demo video online this week.

I am amazed…

Would you mind sharing the modified code for note velocity?

I’ve been using the MIDI plugin extensively over the last couple of weeks, and I found the sequencer not to be optimal for very fast tracks, like drum’n’bass for example. Did you also run into something like this, where you felt as if the MIDI playback was stuttering/jittering?