Native audio plugins in Unity 5.0

Obviously, native plugins need to be compiled for their respective platforms. However, are there still security restrictions on native audio plugins in web player builds etc.? What platforms can I develop audio plugins for besides Mac/PC desktop builds, or are those the only ones?

What other gotchas might I encounter? The docs are a little light on these aspects with regard to Unity 5.

No one?

I have similar questions. Just been through the docs and demo SDK, but a lot of things are unclear atm.

Potential sounds good though! See what I did there?


I originally posted this to Answers, but was told to post it in the forums:

I know the SDK is new and probably still a work in progress, but I need some answers to a few questions. The docs are 'basic' to say the least, but enough to get up and running and get into trouble :wink:

I also have not found anything pertaining to this on Answers or the audio forums.

I have a granulation component that I have working in Unity 4.6 via the OnAudioFilterRead callback, and I used that callback to push data back to the AudioSource. I handle all of the positioning, spatialization (with audio occlusion planned in the near future), additional filters, etc. For this, I need the entire audio buffer, both for the calculations and for drawing. My main issue was that the callback wasn't realtime (~7 ms latency) the way I'm assuming this new version is, which I think was causing some additional weird phasing artifacts.

My thinking is that I could basically break things up a bit and have a 'granulation' filter that would handle the base audio, with multiple side-chained 'granulators'.

  • For this to work, however, I'd need to be able to set/get the playback position on the audio for each granulator, but I'm not sure how I would go about that if they were filters. Would it just be an exposed property on a mixer, or could I get directly at a filter?

  • I also have the ability to fire back events when a user-defined 'region' is entered/exited. Am I able to fire events from a filter? Because I'm assuming the filter GUI is only used in the editor, right?

  • I'm assuming "GetFloatBufferCallback" is capable of returning the entire audio buffer as well? How would I go about that, and who calls it (the associated GUI, correct)? Do I need to 'feed' the filter the buffer, or is there an internal call/callback etc.? (See the sketch after this list.)

  • On the drawing side, to be able to show a waveform, would I use the 'AudioCurveRendering' stuff, or is there something more appropriate?

  • Additionally, in the filter itself, is there a way to determine where my plugin is running from, i.e. the editor or a compiled build? Some sort of native callback? (This helps in determining licensing, functionality, etc.)

  • What about web player builds? That was a non-starter before due to security issues. Has that changed?

  • Is the buffer a constant size now in the processing callback? I noticed some weird issues with the buffer sizes sometimes changing during processing.

  • Also, there is not a lot about the set-position callback, nor anything that uses it in the sample filters. How does this work exactly?
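For reference, here's roughly how I imagine the set-position and GetFloatBufferCallback hooks get wired up, going by what's in AudioPluginInterface.h in the demo SDK. The GranulatorData struct and the "Waveform" buffer name are just mine for illustration, so treat this as a sketch, not gospel:

```cpp
// Sketch only -- callback shapes taken from AudioPluginInterface.h in the
// demo SDK; GranulatorData and the "Waveform" buffer name are made up.
#include <string.h>
#include "AudioPluginInterface.h"

struct GranulatorData
{
    float position;       // playhead the host can move via setposition
    float waveform[1024]; // last processed block, kept around for drawing
};

// Hooked into UnityAudioEffectDefinition::setposition. Presumably called by
// the engine when the playing source seeks, letting the effect stay in sync.
UNITY_AUDIODSP_RESULT UNITY_AUDIODSP_CALLBACK SetPositionCallback(
    UnityAudioEffectState* state, unsigned int pos)
{
    GranulatorData* data = state->GetEffectData<GranulatorData>();
    data->position = (float)pos;
    return UNITY_AUDIODSP_OK;
}

// Hooked into UnityAudioEffectDefinition::getfloatbuffer. The custom GUI on
// the managed side calls this (editor only, as far as I can tell) to pull a
// named buffer out of the running instance -- this looks like how the demo
// plugins feed their scopes.
UNITY_AUDIODSP_RESULT UNITY_AUDIODSP_CALLBACK GetFloatBufferCallback(
    UnityAudioEffectState* state, const char* name, float* buffer, int numsamples)
{
    GranulatorData* data = state->GetEffectData<GranulatorData>();
    if (strcmp(name, "Waveform") == 0)
    {
        for (int i = 0; i < numsamples && i < 1024; i++)
            buffer[i] = data->waveform[i];
    }
    return UNITY_AUDIODSP_OK;
}
```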

Lastly, I'm not even sure if this all would work the way I've laid it out, but it sure would be cool if it could.

If some kind soul from UT could take pity on us and explain in detail how to go about actually using all the cool new stuff in the new audio for 5, it would go a long way for those of us trying to make assets etc. :wink:

I took a few minutes and built the SDK demo for the web player, trying different build settings and DLL combos, and it didn't work (it built and ran, but no plugins are audible). The option is there, but I'd be surprised if native DLLs were allowed now, especially considering the impending demise of the web player.

So far I've managed to create my own plugins in a fresh project only by dragging and dropping the SDK demos, first into Xcode and then Unity, and tearing them apart. Trying to build my own bundle in Xcode based on, but not copied from, the SDK demos has so far been fruitless. Just reusing the Unity-provided code has worked fine, but I would love some direction on compiling properly. I know the documentation touched on this, but I'm struggling to understand the proper workflow. Do I have to recompile my plugins and then restart Unity to test them?

Exactly. What I feel is needed is a base template with only the headers needed for a clean filter build. I'm not sure which utility methods are supposed to be there, and thus I'm potentially adding a lot of crap I might never use. I can make guesses, but tbh if someone already has this knowledge it'd help and speed things along instead of fighting unresolved references etc. Something like the sketch below is what I'm picturing.
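This is my best guess at that bare-bones template, using only AudioPluginInterface.h from the SDK zip and none of the AudioPluginUtil stuff. The field and macro names are from my reading of the header, so treat it as a sketch:

```cpp
// My guess at the bare minimum for one pass-through filter, using only
// AudioPluginInterface.h from the SDK zip (no AudioPluginUtil at all).
#include <string.h>
#include "AudioPluginInterface.h"

static UNITY_AUDIODSP_RESULT UNITY_AUDIODSP_CALLBACK CreateCallback(UnityAudioEffectState* state)
{
    return UNITY_AUDIODSP_OK; // allocate your per-instance state here
}

static UNITY_AUDIODSP_RESULT UNITY_AUDIODSP_CALLBACK ReleaseCallback(UnityAudioEffectState* state)
{
    return UNITY_AUDIODSP_OK; // and free it here
}

static UNITY_AUDIODSP_RESULT UNITY_AUDIODSP_CALLBACK ProcessCallback(
    UnityAudioEffectState* state, float* inbuffer, float* outbuffer,
    unsigned int length, int inchannels, int outchannels)
{
    // 'length' is in frames; the buffers are interleaved float.
    memcpy(outbuffer, inbuffer, sizeof(float) * length * outchannels);
    return UNITY_AUDIODSP_OK;
}

static UnityAudioEffectDefinition definition;
static UnityAudioEffectDefinition* definitionptr[] = { &definition };

// The one entry point Unity scans the bundle/dll for.
extern "C" UNITY_AUDIODSP_EXPORT_API int UNITY_AUDIODSP_CALLBACK
UnityGetAudioEffectDefinitions(UnityAudioEffectDefinition*** descptr)
{
    memset(&definition, 0, sizeof(definition));
    strcpy(definition.name, "Demo Passthrough");
    definition.structsize = sizeof(UnityAudioEffectDefinition);
    definition.paramstructsize = sizeof(UnityAudioParameterDefinition);
    definition.apiversion = UNITY_AUDIO_PLUGIN_API_VERSION;
    definition.create = CreateCallback;
    definition.release = ReleaseCallback;
    definition.process = ProcessCallback;
    *descptr = definitionptr;
    return 1; // number of effects exported by this plugin
}
```

One gotcha I did pick up from the docs: the binary apparently has to be named with an "AudioPlugin" prefix (like AudioPluginDemo) so Unity loads it at startup for the mixer to enumerate it, which would also explain the recompile-then-restart dance.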

I personally would like to know who calls the callbacks and when they are fired. Is it constant like PortAudio, or is it something else?

Btw Jeff, do we make x64 builds for the new editor? Or is the output from the x64 editor also x64? I'm a bit confused on that point. I thought it was just the editor that was 64-bit now, no? So if that's the case, then what's the point of 64-bit plugin builds if the compiled output is going to be 32-bit?

Lol, sorry dude, but I'm the wrong person to ask.
I'm currently failing to get the SDK demo to run the plugins on my Android device. Same result as the web player. Building the project freshly downloaded from the docs, Unity throws an error saying the x86 and x86_64 plugins are "colliding with each other":

Found plugins with same names and architectures, Assets/Plugins/x86/AudioPluginDemo.dll () and Assets/Plugins/x86_64/AudioPluginDemo.dll (). Assign different architectures or delete the duplicate.
UnityEditor.AndroidPluginImporterExtension:CheckFileCollisions(String)
UnityEditorInternal.PluginsHelper:CheckFileCollisions(BuildTarget) (at /Users/builduser/buildslave/unity/build/Editor/Mono/Plugins/PluginsHelper.cs:25)
UnityEditor.BuildPlayerWindow:BuildPlayerAndRun()

Removing one or the other gets the project to build, but it still fails to produce plugin sounds. Messing with the different checkboxes (Android, Editor, all platforms) made no difference.

Yeah, that was kind of my experience as well ;(

Btw, aren't Unity's own filters in native format as well? And if so, do they work in a web player build? (I haven't tested that yet.) But if they do work, then theoretically ours could too, right? I mean, if UT would allow it.

Both the 4.x Pro filters and the "new" 5 filters work in the web player (and on my device). I added flange and echo to the SDK demo project's audio clip and can hear the results in the web player, while the custom demo plugins are inaudible.

Hmmm…

So unless we're able to use the SDK for as many platforms as possible (dunno about Xbox etc. either), that means PC/Mac builds only. So while we might gain some lower-level access (remains to be seen), we have to give up additional platforms, which limits us hugely if we're going to make filters for the Asset Store, which was my whole intent. So now my only choices are to try this route, or to make .NET DLLs with my code and use OnAudioFilterRead and all that that entails…

Ideally, what we need is lower-level access to the DSP/callbacks that doesn't limit us at the same time. I've said it before, but I mean the same kind of access as I have in PortAudio, for example: right down to being able to select my ASIO device without having to change my default OS sound settings, plus a reliable callback with a constant buffer size and as close to realtime as possible. The default latency on my ASIO device is ~7 ms.
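To illustrate what I mean, this is the kind of contract PortAudio gives you (plain PortAudio code, nothing Unity-specific): you ask for a fixed buffer size up front, and the callback then always receives exactly that many frames.

```cpp
// PortAudio-style access: a fixed buffer size requested up front, and the
// callback always gets exactly that many frames.
#include <portaudio.h>

static int AudioCallback(const void* input, void* output,
                         unsigned long frameCount, // always 256 here
                         const PaStreamCallbackTimeInfo* timeInfo,
                         PaStreamCallbackFlags statusFlags, void* userData)
{
    const float* in = (const float*)input;
    float* out = (float*)output;
    for (unsigned long i = 0; i < frameCount * 2; i++)
        out[i] = in[i]; // interleaved stereo passthrough
    return paContinue;
}

int main()
{
    Pa_Initialize();
    PaStream* stream;
    // 2 in, 2 out, float32, 48 kHz, fixed 256-frame buffers (~5.3 ms).
    // (Picking a specific ASIO device would need Pa_OpenStream with explicit
    // PaStreamParameters instead of this default-device call.)
    Pa_OpenDefaultStream(&stream, 2, 2, paFloat32, 48000, 256, AudioCallback, NULL);
    Pa_StartStream(stream);
    Pa_Sleep(5000); // run for 5 seconds
    Pa_StopStream(stream);
    Pa_Terminate();
    return 0;
}
```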

While I think the mixers and groups etc. are a huge upgrade, I would've liked the OnAudioFilterRead function to be addressed, or something new like it. Shrug.

Yeah, I was doing cartwheels when I first heard about this functionality, but so far I'm pretty underwhelmed, especially given the lack of documentation. Custom native plugins must work on mobile (it'd be insane if that's not the case), but I'm at a loss as to how to make it happen.

I was so ready to say goodbye to OnAudioFilterRead…

http://docs.unity3d.com/Manual/UpgradeGuide5-Plugins.html
Just stumbled across this; it might provide some clarity?

Just chiming in with my 2 cents:

  • As I understand it, audio plugins can be developed for all platforms which support native plugins. That should include mobile, but not the web player, for the very same security reasons that unsafe code isn't allowed to run in the web player. Yes, Unity does it; no, we can't, because that would open up Pandora's box (potential exploits via buffer overflows and the like).

  • It is a pain to have to compile for every platform we wish to support, but that's the price to pay for lower-level access.

  • Processing callbacks happen on the audio thread. Even if no GUI is implemented, nothing forbids us from creating bindings to our filters/synths/granulators that can be accessed by scripts. As long as we're thread-safety aware and avoid locks on the audio thread (compare-and-swap atomic operations should be preferred, imo), we should be fine. There's a minimal sketch after this list.

  • No comments regarding the AudioMixer, Group, and FX APIs; there's so little to comment upon…
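Here's what I mean by the no-locks approach (all names made up): the script thread publishes a parameter change through atomics, and the audio thread claims it at the top of the process callback without ever blocking.

```cpp
// Lock-free parameter hand-off between the script thread and audio thread.
#include <atomic>

struct FilterParams
{
    std::atomic<float> cutoff { 1000.0f };
    std::atomic<bool>  dirty  { false };
};

// Script-side binding, called from the main thread:
void SetCutoff(FilterParams& p, float hz)
{
    p.cutoff.store(hz, std::memory_order_relaxed);
    p.dirty.store(true, std::memory_order_release);
}

// Audio thread, at the top of the process callback:
void ApplyPendingChanges(FilterParams& p, float& activeCutoff)
{
    bool expected = true;
    // Compare-and-swap: claim the dirty flag without blocking. A failed
    // exchange just means nothing changed this block.
    if (p.dirty.compare_exchange_strong(expected, false, std::memory_order_acquire))
        activeCutoff = p.cutoff.load(std::memory_order_relaxed);
}
```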

Cheers,

Gregzo

Thanks Jeff, Greg :wink:

"32-bit native plugins will not work in the 64-bit editor. Attempting to load them will result in errors being logged to the console and exceptions being thrown when trying to call a function from the native plugin."

From this, am I to infer that the 64-bit editor now creates 64-bit executables? Or must we use a 64-bit plugin for the editor and a 32-bit one for compiled builds? Or (ugh) revert to a 32-bit editor? Still a bit confusing…

"…but not the web player, for the very same security reasons that unsafe code isn't allowed to run in the web player. Yes, Unity does it; no, we can't, because that would open up Pandora's box (potential exploits via buffer overflows and the like)."

Agreed. However, they could institute some sort of certification program if this is the way they want to go with plugins in general. I would have no problem doing that to potentially gain additional platforms, shrug. Btw Greg, is the web player going bye-bye permanently, or just giving way to the new WebGL version?

It'd be nice if Wayne could peruse this thread and throw a few answers our way, or maybe point to where all this information is contained. I completely missed the upgrade page that Jeff posted. It helps, but it also brings up more questions lol.

Amen :wink:

I was referring to the callbacks in the native SDK. Sorry ;(

Who calls the callbacks there? The audio engine first, then scripts, or…? I was hoping to move all of my 'script' DLL code from .NET back to C++ and use filters instead. I'm still trying to figure out the best way to do it, or if I can do it at all, without limiting myself to just two platforms.

I mean, I agree, it all sounds great etc., especially the blurbs here:

excerpt:
"But what if you want more DSP control that just the inbuilt effects of Unity? Previously this was handled exclusively with the OnAudioFilterRead script callback, which allowed you to process audio samples directly in your scripts.
This is great for lightweight effects or prototyping your fancy filter ideas. Sometimes though, you want the ability to write native compiled effects for the best performance. Allowing you to write more heavy weight ideas, perhaps like your custom convolution reverb or multi band EQ.

Unity now also supports custom DSP plugin effects, with users having the ability to write their own native DSP for their game, or perhaps distributing their amazing effect ideas on the Asset Store for others to use. This opens up a whole world of possibilities, from writing your own synth engine to interfacing other audio applications like Pure Data. These custom DSP plugins can also request sidechain support and will be supplied sidechain data from anywhere else in the mix! Hawtness!"

LOL, I agree, it is hawtness! Now how do we do it, and what do we have to give up to do it? :wink:
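That sidechain bit in particular, here's how I'm reading it. This is very much a guess, going off the sidechainbuffer field in UnityAudioEffectState and the sidechain-target flag I spotted in the SDK header; a crude ducker as an example:

```cpp
// Guesswork sketch: read the key signal from the sidechain buffer that the
// mixer supposedly feeds to effects flagged as sidechain targets.
#include "AudioPluginInterface.h"

UNITY_AUDIODSP_RESULT UNITY_AUDIODSP_CALLBACK ProcessCallback(
    UnityAudioEffectState* state, float* inbuffer, float* outbuffer,
    unsigned int length, int inchannels, int outchannels)
{
    const float* key = state->sidechainbuffer; // whatever feeds the sidechain
    for (unsigned int n = 0; n < length; n++)
    {
        // Duck the main signal whenever the key is hot (crude gate).
        float k = (key != NULL) ? key[n * inchannels] : 0.0f;
        float gain = (k > 0.1f || k < -0.1f) ? 0.2f : 1.0f;
        for (int c = 0; c < outchannels; c++) // assuming inchannels == outchannels
            outbuffer[n * outchannels + c] = inbuffer[n * inchannels + c] * gain;
    }
    return UNITY_AUDIODSP_OK;
}
```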

Hi Marionette,

Just tried building the Xcode demo project, targeting 64-bit architecture only. The bundle works fine in my editor.
Targeting iOS is just a matter of building a static lib instead.

I just realised one major issue: in the create callback, we have no reference to the AudioGroup or Mixer the filter is being added to, so it'll be hard to identify which instance of the filter we're tweaking from our own scripts. One really tacky workaround springs to mind (I hope it's not the only one): use a float ID that's set in the editor and can be queried via the GetFloat API to identify filters. Something like this:
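(Native side, sketched from memory against the demo SDK's AudioPluginUtil; the RegisterParameter argument order may be slightly off, so check the header.)

```cpp
// The float-ID workaround, roughly, using the RegisterParameter helper from
// the demo SDK's AudioPluginUtil.
#include "AudioPluginUtil.h"

enum Param { P_INSTANCE_ID, P_NUM };

int InternalRegisterEffectDefinition(UnityAudioEffectDefinition& definition)
{
    definition.paramdefs = new UnityAudioParameterDefinition[P_NUM];
    // A plain float the user sets per instance in the mixer; scripts then
    // find "their" filter by reading it back and matching the value.
    RegisterParameter(definition, "Instance ID", "", 0.0f, 1000.0f, 0.0f,
                      1.0f, 1.0f, P_INSTANCE_ID);
    return P_NUM;
}
```

On the script side you'd expose that parameter on the mixer and read it back with AudioMixer.GetFloat to find the instance you want. Still tacky, but it should work.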

Yuck ;(

I mean, it sounds like it'll work, but we're back to having to look something up that we should already have a reference to…

That's excellent about the Xcode build though; Jeff will like that too :wink:
Btw, did you only run it in the editor, or in a 64-bit compiled build too?

Btw, and this just occurs to me, but the end point of the audio filter chain is spatialized, right? OnAudioFilterRead wasn't, and thus I had to implement my own hybrid VBAP/ambisonics thingy…