Obviously, native plugins need to be compiled for their respective platforms. However, are there still security issues/enforcements for native audio plugins with web player builds etc.? What platforms can I develop audio plugins for besides Mac/PC desktop builds? Or are those the only ones?
What other gotchas might I encounter? The docs are a little light on those aspects with regard to Unity 5.
I originally posted this to Answers, but was told to post it in the forums:
I know the SDK is new and probably still a work in progress, but I need answers to a few questions. The docs are "basic" to say the least, but enough to get up and running and get into trouble.
I also have not found anything pertaining to this on Answers or the audio forums.
I have a granulation component that I have working in Unity 4.6 via the OnAudioFilterRead callback, and I used that callback to push data back to the AudioSource. I handle all of the positioning, spatialization (along with audio occlusion planned in the near future), additional filters, etc. For this, I need the entire audio buffer, both for the calculations and for drawing. My main issue was that the callback wasn't realtime (~7ms latency), as I'm assuming this new version is, which I think was causing some additional weird phasing artifacts.
My thinking is that I could basically break things up a bit and have a "granulation" filter that would handle the base audio, with multiple side-chained "granulators".
For this to work, however, I'd need to be able to set/get position on the audio for each granulator, but I'm not sure how I would go about that if they were filters. Would it just be an exposed property on a mixer? Or could I get directly to a filter?
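My best guess at what the native side of that would look like, going by AudioPluginInterface.h (the enum/struct names are mine, and the signatures are from my reading of the header, so take this with salt):

```cpp
#include "AudioPluginInterface.h" // from the native audio plugin SDK

// Hypothetical per-granulator state; P_POSITION is my own name.
enum { P_POSITION, P_NUM };

struct EffectData
{
    float p[P_NUM]; // parameter values, written by the host
};

UNITY_AUDIODSP_RESULT UNITY_AUDIODSP_CALLBACK SetFloatParameterCallback(
    UnityAudioEffectState* state, int index, float value)
{
    EffectData* data = state->GetEffectData<EffectData>();
    if (index >= P_NUM)
        return UNITY_AUDIODSP_ERR_UNSUPPORTED;
    data->p[index] = value; // e.g. grain read position, normalized 0..1
    return UNITY_AUDIODSP_OK;
}

UNITY_AUDIODSP_RESULT UNITY_AUDIODSP_CALLBACK GetFloatParameterCallback(
    UnityAudioEffectState* state, int index, float* value, char* valuestr)
{
    EffectData* data = state->GetEffectData<EffectData>();
    if (index >= P_NUM)
        return UNITY_AUDIODSP_ERR_UNSUPPORTED;
    if (value != NULL)
        *value = data->p[index];
    if (valuestr != NULL)
        valuestr[0] = 0;
    return UNITY_AUDIODSP_OK;
}
```

If I'm reading things right, the script side could then drive this through an exposed mixer parameter via AudioMixer.SetFloat/GetFloat, which would sort of answer my own question, assuming it works that way.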
I also have the ability to fire back events when a user-defined "region" is entered/exited. Am I able to fire events from a filter? Because I'm assuming the filter GUI is only used in the editor, right?
I'm assuming "GetFloatBufferCallback" is capable of returning the entire audio buffer as well? How would I go about that, and who calls it (the associated GUI, correct)? Do I need to "feed" the filter the buffer, or is there an internal call/callback?
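From poking at the demo source, my impression (could well be wrong) is that the GUI pulls rather than the filter pushing: ProcessCallback keeps a history buffer filled, and the editor-side GUI requests it by name through the getfloatbuffer callback. Something like this, where "Waveform", history, and writepos are all names I've made up:

```cpp
#include <string.h>
#include "AudioPluginInterface.h"

const int HISTORYSIZE = 8192;

struct EffectData
{
    float history[HISTORYSIZE]; // ring buffer filled in ProcessCallback
    int writepos;               // next write index into the ring
};

UNITY_AUDIODSP_RESULT UNITY_AUDIODSP_CALLBACK GetFloatBufferCallback(
    UnityAudioEffectState* state, const char* name, float* buffer, int numsamples)
{
    EffectData* data = state->GetEffectData<EffectData>();
    if (strcmp(name, "Waveform") == 0) // hypothetical buffer name the GUI asks for
    {
        // Copy out the most recent numsamples from the ring buffer.
        // (Ignoring cross-thread access issues for brevity in this sketch.)
        int start = (data->writepos - numsamples + HISTORYSIZE) % HISTORYSIZE;
        for (int i = 0; i < numsamples; i++)
            buffer[i] = data->history[(start + i) % HISTORYSIZE];
    }
    return UNITY_AUDIODSP_OK;
}
```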
On the drawing side, to be able to show a waveform, would I use the "AudioCurveRendering" stuff, or is there something more appropriate?
Additionally, in the filter itself, is there a way to determine where my plugin is running from, i.e. the editor or a compiled build? Some sort of native callback? (This helps in determining licensing, functionality, etc.)
What about web player builds? It was a non-starter before due to security issues. Has that changed?
In the processing callback, is the buffer a constant size now? I noticed some weird issues with the buffer sizes sometimes changing during processing.
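For reference, the process callback as I read it in AudioPluginInterface.h receives the block length per call, so nothing in the signature seems to promise a constant size; I've been treating it as variable:

```cpp
#include "AudioPluginInterface.h"

UNITY_AUDIODSP_RESULT UNITY_AUDIODSP_CALLBACK ProcessCallback(
    UnityAudioEffectState* state, float* inbuffer, float* outbuffer,
    unsigned int length, int inchannels, int outchannels)
{
    // length = number of sample frames in this block; it arrives per call,
    // so size any internal scratch buffers for the worst case rather than
    // assuming it never changes.
    // Plain pass-through here; assumes inchannels == outchannels.
    for (unsigned int n = 0; n < length * outchannels; n++)
        outbuffer[n] = inbuffer[n];
    return UNITY_AUDIODSP_OK;
}
```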
Also, there is not a lot about the set-position callback, nor anything that uses it in the sample filters. How does this work exactly?
Lastly, I'm not even sure this would all work the way I've laid it out, but it sure would be cool if it could.
If some kind soul from UT could take pity on us and explain in detail how to go about actually using all the cool new stuff in the new audio for Unity 5, it would go a long way for those of us trying to make assets etc.
I took a few minutes and built the SDK demo for web player, trying different build settings and dll combos, and it didn't work (it built and ran, but no plugins are audible). The option is there, but I'd be surprised if native dlls were allowed now, especially considering the impending demise of the web player.
So far I've managed to create my own plugins in a fresh project only by dragging and dropping the SDK demos, first into Xcode and then Unity, and tearing them apart. Trying to build my own bundle in Xcode based on, but not copied from, the SDK demos has so far been fruitless. Just reusing the Unity-provided code has worked fine, but I would love some direction on properly compiling. I know the documentation touched on this, but I'm struggling to understand the proper workflow. Do I have to recompile my plugins and then restart Unity to test them?
Exactly. What I feel is needed is a base template with only the headers needed for a clean filter build. I'm not sure what utility methods are supposed to be there or not, and thus I'm potentially adding a lot of crap I might never use. I can make guesses etc., but tbh if someone already has this knowledge it'd help and speed things along instead of fighting unresolved references etc…
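For anyone else fighting this, here's roughly the bare-bones skeleton I've pieced together from AudioPluginInterface.h alone, no AudioPluginUtil; the field names are from my reading of the header, so verify against your copy before trusting it:

```cpp
#include <string.h>
#include "AudioPluginInterface.h" // the only SDK header this should need

static UNITY_AUDIODSP_RESULT UNITY_AUDIODSP_CALLBACK CreateCallback(UnityAudioEffectState* state)
{
    state->effectdata = NULL; // allocate per-instance data here if needed
    return UNITY_AUDIODSP_OK;
}

static UNITY_AUDIODSP_RESULT UNITY_AUDIODSP_CALLBACK ReleaseCallback(UnityAudioEffectState* state)
{
    return UNITY_AUDIODSP_OK; // free per-instance data here
}

static UNITY_AUDIODSP_RESULT UNITY_AUDIODSP_CALLBACK ProcessCallback(
    UnityAudioEffectState* state, float* inbuffer, float* outbuffer,
    unsigned int length, int inchannels, int outchannels)
{
    memcpy(outbuffer, inbuffer, sizeof(float) * length * outchannels); // pass-through
    return UNITY_AUDIODSP_OK;
}

static UnityAudioEffectDefinition definition;
static UnityAudioEffectDefinition* definitionptr[] = { &definition };

// Unity scans audio plugins for this export to discover the effects inside.
extern "C" UNITY_AUDIODSP_EXPORT_API int UnityGetAudioEffectDefinitions(
    UnityAudioEffectDefinition*** descptr)
{
    memset(&definition, 0, sizeof(definition));
    strcpy(definition.name, "Minimal Filter"); // name shown in the mixer's Add Effect menu
    definition.structsize = sizeof(UnityAudioEffectDefinition);
    definition.paramstructsize = sizeof(UnityAudioParameterDefinition);
    definition.apiversion = UNITY_AUDIO_PLUGIN_API_VERSION;
    definition.pluginversion = 0x010000;
    definition.channels = 2;
    definition.create = CreateCallback;
    definition.release = ReleaseCallback;
    definition.process = ProcessCallback;
    *descptr = definitionptr;
    return 1; // number of effects defined in this plugin
}
```

Also, iirc the bundle/dll name has to start with "AudioPlugin" for Unity to pick it up, and it only seems to scan at startup, which would explain the restarts being necessary.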
I personally would like to know who calls the callbacks and when they are fired. Is it constant like PortAudio, or is it something else?
Btw Jeff, do we make x64 builds for the new editor? Or is the output from the x64 editor also x64? I'm a bit confused on that point. I thought it was just the editor that was 64-bit now, no? So if that's the case, then what's the point of x64 builds if the compiled output is going to be 32-bit?
Lol, sorry dude, but I'm the wrong person to ask.
I'm currently failing to get the SDK demo to run the plugins on my Android device. Same result as the web player. Building the project freshly downloaded from the docs, Unity throws an error saying the x86 and x86_64 plugins are "colliding with each other":
Found plugins with same names and architectures, Assets/Plugins/x86/AudioPluginDemo.dll () and Assets/Plugins/x86_64/AudioPluginDemo.dll (). Assign different architectures or delete the duplicate.
UnityEditor.AndroidPluginImporterExtension:CheckFileCollisions(String)
UnityEditorInternal.PluginsHelper:CheckFileCollisions(BuildTarget) (at /Users/builduser/buildslave/unity/build/Editor/Mono/Plugins/PluginsHelper.cs:25)
UnityEditor.BuildPlayerWindow:BuildPlayerAndRun()
Removing one or the other gets the project to build, but it still fails to produce plugin sounds. Messing with the different checkboxes (Android, Editor, all platforms) made no difference.
Btw, aren't Unity's own filters in native format as well? And if so, do they work in a web player build? (Haven't tested that yet.) But if they do work, then theoretically ours could too, right? I mean, if UT would allow it.
Both the 4.x Pro filters and the "new" 5 filters work in the web player (and on my device). I added flange and echo to the SDK demo project audio clip and can hear the results in the web player, while the custom demo plugins are inaudible.
So unless we're able to use the SDK for as many platforms as we can (dunno about Xbox etc. either), that means PC/Mac builds only. So while we might gain some lower-level access (remains to be seen), we have to give up additional platforms, which limits us hugely if we're going to make filters for the Asset Store, which was my whole intent. So now my only choices are to try to go this route, or make .NET dlls with my code and use OnAudioFilterRead and all that that entails…
Ideally, what we need is a way to get lower-level access to the DSP/callbacks that doesn't limit us at the same time. I've said it before, but the same kind of access as I have in PortAudio, for example. Right down to being able to select my ASIO device without having to change my default OS sound settings. A reliable callback with a constant size, as close to realtime as possible; the default latency on my ASIO device is ~7ms.
While I think the mixers and groups etc. are a huge upgrade, I would've liked the OnAudioFilterRead function to be addressed, or something new like it. Shrug.
Yeah, I was doing cartwheels when I first heard about this functionality, but so far I'm pretty underwhelmed, especially given the lack of documentation. Custom native plugins must work on mobile; it'd be insane if that's not the case, but I'm at a loss as to how to make it happen.
I was so ready to say goodbye to OnAudioFilterRead…
As I understand it, audio plugins can be developed for all platforms that support native plugins. That should include mobile, but not the web player, for the very same security reasons that unsafe code isn't allowed to run in the web player. Yes, Unity does it; no, we can't, because that would open up Pandora's box (potential exploits by buffer overflow and the like).
It is a pain to have to compile for every platform we wish to support, but that's the price to pay for lower-level access.
Processing callbacks happen on the audio thread. Even if no GUI is implemented, nothing forbids us from creating bindings to our filters/synths/granulators that can be accessed by scripts. As long as we're thread-safety aware and avoid locks on the audio thread (compare-and-swap atomic operations should be preferred, imo), we should be fine.
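For example, the kind of thing I have in mind for firing region-enter events back to scripts without locking the audio thread, a minimal sketch with all names mine:

```cpp
#include <atomic>

// The audio thread only stores; a script binding polls from the main
// thread and atomically swaps the flag back to "no event" (-1).
static std::atomic<int> regionEvent{ -1 };

// Audio thread (e.g. inside ProcessCallback): publish the region index.
void NotifyRegionEntered(int region)
{
    regionEvent.store(region, std::memory_order_release);
}

// Main thread (called from a script binding each frame): consume the
// pending event, or get -1 if nothing happened since the last poll.
int PollRegionEvent()
{
    return regionEvent.exchange(-1, std::memory_order_acq_rel);
}
```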
No comments regarding the AudioMixer, Group, and FX APIs; there's so little to comment upon…
"32-bit native plugins will not work in the 64-bit editor. Attempting to load them will result in errors being logged to the console and exceptions being thrown when trying to call a function from the native plugin."
From this, am I to infer that the 64-bit editor now creates 64-bit executables? Or must we use a 64-bit plugin for the editor and a 32-bit one for compiled builds? Or (ugh) revert to a 32-bit editor? Still a bit confusing…
"but not the web player, for the very same security reasons that unsafe code isn't allowed to run in the web player. Yes, Unity does it; no, we can't, because that would open up Pandora's box (potential exploits by buffer overflow and the like)."
Agreed. However, they could institute some sort of certification program if this is the way they want to go with plugins in general. I would have no problem doing that to potentially gain additional platforms, shrug. Btw Greg, is the web player going bye-bye permanently, or just moving to the new GL version?
It'd be nice if Wayne could peruse this thread and throw a few answers our way, or maybe tell us where all this information is contained. I completely missed the upgrade part that Jeff posted. It helps, but it also brings up more questions, lol.
I was referring to the callbacks in the native SDK. Sorry ;(
The callbacks there are called by whom? The audio engine first, then scripts, or…? I was hoping to move all of my "script" dll code from .NET back to C++ and use filters instead. I'm still trying to figure out the best way to do it, or if I can do it at all, without limiting myself to just two platforms.
I mean, I agree, it all sounds great etc., especially the blurbs here:
Excerpt:
"But what if you want more DSP control that just the inbuilt effects of Unity? Previously this was handled exclusively with the OnAudioFilterRead script callback, which allowed you to process audio samples directly in your scripts. This is great for lightweight effects or prototyping your fancy filter ideas. Sometimes though, you want the ability to write native compiled effects for the best performance. Allowing you to write more heavy weight ideas, perhaps like your custom convolution reverb or multi band EQ.
Unity now also supports custom DSP plugin effects, with users having the ability to write their own native DSP for their game, or perhaps distributing their amazing effect ideas on the Asset Store for others to use. This opens up a whole world of possibilities, from writing your own synth engine to interfacing other audio applications like Pure Data. These custom DSP plugins can also request sidechain support and will be supplied sidechain data from anywhere else in the mix! Hawtness!"
LOL, I agree, it is hawtness! Now how do we do it, and what do we have to give up to do it?
Just tried building the Xcode demo project, targeting 64-bit architecture only. The bundle works fine in my editor.
Targeting iOS is just a matter of building a static lib instead.
I just realised one major issue: in the create callback, we have no reference to the AudioGroup or mixer the filter is being added to, so it'll be hard to identify which instance of the filter we're tweaking from our own scripts. One really tacky workaround springs to mind; I hope it's not the only one: use a float ID that's set in the editor and can be queried via the GetFloat API to identify filters. Hmm…
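A minimal sketch of that float-ID idea, all names mine: a hidden "InstanceID" parameter files each instance away in a global registry that exported script bindings can consult.

```cpp
#include "AudioPluginInterface.h"

// Tacky workaround sketch: each filter instance is given a unique
// InstanceID parameter value in the mixer, and the parameter setter
// registers the instance under that ID.
enum { P_INSTANCEID, P_POSITION, P_NUM };

struct EffectData { float p[P_NUM]; };

static const int MAXINSTANCES = 64;         // arbitrary cap for the sketch
static EffectData* instances[MAXINSTANCES]; // registry, indexed by ID

UNITY_AUDIODSP_RESULT UNITY_AUDIODSP_CALLBACK SetFloatParameterCallback(
    UnityAudioEffectState* state, int index, float value)
{
    EffectData* data = state->GetEffectData<EffectData>();
    if (index >= P_NUM)
        return UNITY_AUDIODSP_ERR_UNSUPPORTED;
    data->p[index] = value;
    if (index == P_INSTANCEID)
    {
        int id = (int)value;
        if (id >= 0 && id < MAXINSTANCES)
            instances[id] = data; // register this instance under its ID
    }
    return UNITY_AUDIODSP_OK;
}

// A plain export that scripts could P/Invoke to reach a specific instance
// by the same ID (name and return value entirely hypothetical).
extern "C" UNITY_AUDIODSP_EXPORT_API float GetPositionForInstance(int id)
{
    EffectData* data = (id >= 0 && id < MAXINSTANCES) ? instances[id] : NULL;
    return data ? data->p[P_POSITION] : -1.0f;
}
```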
Btw, and this just occurs to me, but the endpoint of the audio filter chain is spatialized, right? OnAudioFilterRead wasn't, and thus I had to implement my own hybrid VBAP/ambisonics thingy…