We are working on a WebXR Export package using the Unity XR SDK, and while implementing it we noticed that we need a way for Unity to call “UnityPluginLoad” and “UnityPluginUnload” on the native side.
We found a blog post https://blogs.unity3d.com/2017/01/19/low-level-plugins-in-unity-webgl/
which talks about “UnityRegisterRenderingPlugin”.
We tried it, but it only provides the “IUnityGraphics” interface, and we need a way to register an XR plugin (“IUnityXRInputInterface” and “IUnityXRDisplayInterface”).
As far as I understand, “UnityRegisterRenderingPlugin” is the only hook.
Are there other ways to initiate a call to “UnityPluginLoad” with the needed interfaces?
Is this a bug or a feature request? Who should we contact about adding a hook for the XR interfaces?
Even though the function is named UnityRegisterRenderingPlugin, it is actually used to register any type of plugin in WebGL builds. That is, something like this:
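Here is a rough sketch of the pattern from the blog post above; the XR header file names and the RegisterPlugin entry point name are illustrative, not taken from any shipped sample:

```cpp
// Rough sketch only. IUnityInterface.h and IUnityGraphics.h ship with the editor's
// plugin API headers; the XR headers come from the XR SDK download, and their exact
// file names may differ between SDK versions.
#include "IUnityInterface.h"
#include "IUnityGraphics.h"
#include "IUnityXRDisplay.h"   // assumed header name
#include "IUnityXRInput.h"     // assumed header name

static IUnityGraphics*           s_Graphics = nullptr;
static IUnityXRDisplayInterface* s_Display  = nullptr;
static IUnityXRInputInterface*   s_Input    = nullptr;

// The usual entry points that Unity resolves via dlsym() on desktop platforms.
extern "C" void UNITY_INTERFACE_EXPORT UNITY_INTERFACE_API
UnityPluginLoad(IUnityInterfaces* unityInterfaces)
{
    s_Graphics = unityInterfaces->Get<IUnityGraphics>();
    // The XR providers would be fetched the same way; this is the part that
    // currently fails on WebGL:
    s_Display = unityInterfaces->Get<IUnityXRDisplayInterface>();
    s_Input   = unityInterfaces->Get<IUnityXRInputInterface>();
}

extern "C" void UNITY_INTERFACE_EXPORT UNITY_INTERFACE_API
UnityPluginUnload()
{
    s_Graphics = nullptr;
    s_Display  = nullptr;
    s_Input    = nullptr;
}

#ifdef __EMSCRIPTEN__
// On the web there is no dlsym(), so the plugin registers itself instead.
// Declared here as in the blog post; newer plugin API headers may declare it already.
typedef void (UNITY_INTERFACE_API * PluginLoadFunc)(IUnityInterfaces* unityInterfaces);
typedef void (UNITY_INTERFACE_API * PluginUnloadFunc)();
extern "C" void UnityRegisterRenderingPlugin(PluginLoadFunc loadPlugin, PluginUnloadFunc unloadPlugin);

// "RegisterPlugin" is an arbitrary name; call it once from C# so that
// UnityPluginLoad runs with the interfaces.
extern "C" void UNITY_INTERFACE_EXPORT UNITY_INTERFACE_API RegisterPlugin()
{
    UnityRegisterRenderingPlugin(UnityPluginLoad, UnityPluginUnload);
}
#endif
```

From C#, you would call the exported RegisterPlugin() once via [DllImport("__Internal")] before the plugin is first used, the same way the blog post does for its rendering plugin.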
Basically, the function UnityRegisterRenderingPlugin() is a replacement for the inability to do dlsym() on the web, since everything is statically linked.
That does look odd. My understanding is that the code should be able to obtain the interfaces (although I have not done any work with the XR API, so I can't say for sure).
Does the same code work to obtain XR APIs when you build for native Windows? If so, that does suggest a WebGL bug. If not, that might be a general XR issue.
Oh hmm, now thinking back I realize the whole XR API might (currently) not be available on the web. But maybe it should be for the exact purposes of implementing XR plugins like this.
Maybe it will also help to mention that there’s a file, “UnitySubsystemsManifest.json”, that asks for a “libraryName”.
So I set it to “__Internal” instead of a library file name, like when using DllImport on WebGL.
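For reference, the manifest is shaped roughly like the XR SDK sample manifests (the name, version and ids below are placeholders of mine; the only unusual part is the libraryName value):

```json
{
    "name": "WebXR Plugin",
    "version": "0.1.0",
    "libraryName": "__Internal",
    "displays": [
        { "id": "WebXR Display" }
    ],
    "inputs": [
        { "id": "WebXR HMD" }
    ]
}
```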
The problem is that the XR module subsystems are not available on the WebGL platform. Subsystem requirements are usually declared with UnitySubsystemsManifest.json, but this is not available on WebGL due to its file system limitations. There will be a lot of work required to get these things working within the platform limitations WebGL has. Having a WebXR driver for the XR subsystem would be great, and I would like to use it, but it will likely be a longer-term fix.
Thanks!
Also, I just noticed that I got a reply in the bug tracker with a sample package.
The sample package didn’t work, and the C/C++ code from my bug project now produces errors when it compiles with Unity.
But I managed to test the basic use case of getting the XRDisplay and XRInput, and it works!
I guess that I’ll need to compile a wasm bitcode file/lib and use it in the project, instead of raw C/C++ code.
For now I'm focusing on LTS 2020.3.11f1 and LTS 2022.3.10f1.
What works:
Headset/device tracking - works great!
Display rendering - Setting the views and render texture.
What doesn’t work on both versions:
Single-Pass and Multiview - I wasn't expecting Multiview to work, but I thought Single-Pass might.
The RenderTexture of the display is updated only on the first frame, or if there’s another enabled camera in the scene with Target Eye set to None (Main Display). In all other cases, the RenderTexture stays the same.
What fails on 2022.3.10f1:
WebGL warnings in XR mode - “INVALID_VALUE: bufferSubData: buffer overflow”, “INVALID_OPERATION: drawBuffers: BACK or NONE”, “INVALID_ENUM: invalidateFramebuffer: invalid attachment”.
Once in XR mode, enabling one more camera for spectator mode (Target Eye set to None) makes performance drop a lot: there are 2 calls to “bufferSubData” that take 15 ms each on PC. Even after the extra camera is disabled, the performance drop remains.
I've done some tests with URP.
There are no issues with the display not refreshing/updating.
Frame rate is ok.
But there are some artifacts in some cases. In AR it randomly turns objects into triangles, and in VR one eye shows a single color when looking at an area with only the skybox.
Also, linear color space looks dark when entering XR mode.