Hi,
Can someone please tell me how Unity manages display distortion correction for common AR/VR hardware such as the HoloLens and other headsets? Am I right in thinking that a Unity application targeted for one of these platforms performs the display distortion correction (warping) of the left/right images, or is this typically done by the hardware itself? Assuming the former, how is Unity configured to provide the correct distortion characteristics for the headset type?
I'm under the impression that Unity does not deal with it directly; instead it forwards the frames to whatever underlying SDK is being used, and that SDK handles the distortion. Meaning if you're using Oculus, the runtime knows its own distortion coefficients. What's more, the distortion is performed not by Unity but by the Oculus runtime.
You cannot directly alter the distortion coefficients. The most you can adjust is the projection and view matrices for each eye, and even that has many limitations (trying to set view/projection matrices in VR can easily cause Unity to render nothing, or the change may simply have no effect).
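For reference, the per-eye adjustment Unity does expose is `Camera.SetStereoProjectionMatrix` / `Camera.SetStereoViewMatrix`. A minimal sketch, with the caveats above (on some XR runtimes this can break rendering or be ignored; the 90-degree FOV here is just an arbitrary example value):

```csharp
using UnityEngine;

// Hedged sketch: Unity lets you override per-eye view/projection matrices,
// but NOT the lens distortion coefficients, which stay inside the vendor
// runtime. Treat this as experimental.
[RequireComponent(typeof(Camera))]
public class StereoMatrixOverride : MonoBehaviour
{
    void OnEnable()
    {
        var cam = GetComponent<Camera>();

        // Example: force a symmetric 90-degree FOV projection for the left eye.
        Matrix4x4 leftProj = Matrix4x4.Perspective(
            90f, cam.aspect, cam.nearClipPlane, cam.farClipPlane);
        cam.SetStereoProjectionMatrix(Camera.StereoscopicEye.Left, leftProj);

        // View matrices can be overridden the same way:
        // cam.SetStereoViewMatrix(Camera.StereoscopicEye.Left, someViewMatrix);
    }

    void OnDisable()
    {
        // Hand control back to the runtime-provided matrices.
        var cam = GetComponent<Camera>();
        cam.ResetStereoProjectionMatrices();
        cam.ResetStereoViewMatrices();
    }
}
```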
For a VR application, rendering generally works like it does in flat-screen games: each eye gets a normal projection matrix for a specific FOV. However, both eyes can be rendered at the same time.
With foveated rendering the situation can be different, but I have no access to such a headset.
The exception is Google Cardboard (which I think is deprecated, by the way). In that case, device parameters are set in a device profile, which is provided in a very awkward and roundabout way through a link to a Google protocol buffer or something like that.
Distortion is done by the headset driver or hardware. Otherwise Unity would carry the burden of supporting each device's hardware specifics, which wouldn't be a good approach for either Unity or the hardware makers. It just makes more sense to tie distortion correction to the hardware, so I assume that's how it is.
Thanks for your reply. I understand your reasoning. But surely the device would be using the host's GPU to perform that distortion correction? In which case, Unity must be passing each eye's image frame (via software) to some form of driver software. So where is this interface?
A device driver, and an API for interfacing with it. There's no point in making your own SDK unless you're Oculus. Which means: unless you're a well-known VR manufacturer.
The issue here is that a VR headset is not something like an LCD monitor. At least not at the moment.
I believe when people were creating custom headsets, they were using SteamVR to interface with them.
I'd like to elaborate on something. The point of making a custom headset is so that people can play stuff on it, including stuff made by other people. For that to happen, the headset has to work through the Oculus software or SteamVR.
You don't have to do it this way. You could absolutely make completely custom hardware that the operating system sees as a display, and do everything on it yourself from scratch, including lens distortion correction.
However, people won't be able to play existing VR games on this sort of device, because it can't talk to Oculus or SteamVR.
Correct me if I'm wrong: Niginfinity is basically saying that it is not doable to alter the distortion coefficients in an Oculus headset.
I've tried to get the GPU buffer as a Texture2D; from there I can derive the distortion map by comparing it to a screenshot of the display. So I believe we can pre-distort (counter-distort) the raw image to achieve the customized distortion result we want?
Even better, can we disable the built-in Oculus runtime distortion and display our own pre-distorted image?
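To illustrate the pre-distortion idea being asked about: counter-distortion is usually a UV remap with a radial (barrel) model, applied per pixel before the frame reaches the lens. A hedged sketch, where k1/k2 are hypothetical placeholder coefficients (Oculus's real values are not exposed through Unity; they would have to be measured, e.g. from the distortion map recovered above):

```csharp
using UnityEngine;

// Hedged sketch of counter-distortion: remap each output pixel's UV with a
// radial model so the lens's optical warp is (approximately) cancelled.
// k1/k2 are made-up example coefficients, not a real headset's values.
public static class PreDistort
{
    const float k1 = 0.22f; // hypothetical radial distortion coefficient
    const float k2 = 0.24f; // hypothetical radial distortion coefficient

    // Given an output pixel's UV in [0,1], return the source UV to sample.
    public static Vector2 DistortUV(Vector2 uv)
    {
        Vector2 c = uv - new Vector2(0.5f, 0.5f); // center the coordinates
        float r2 = c.sqrMagnitude;                 // squared radius
        float scale = 1f + k1 * r2 + k2 * r2 * r2; // radial scale factor
        return new Vector2(0.5f, 0.5f) + c * scale;
    }
}
```

In practice this remap would run per pixel in a blit shader over the eye texture; whether the runtime's own distortion pass can actually be disabled is a separate question that depends on the vendor SDK.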