I’ve been working on a multi-platform native plugin for a while, and right now I’m hitting a thick brick wall on Android.
For one setting of the native rendering I’m doing, I need access to the camera’s depth buffer/texture. Alas, either I’m dumb or none of the solutions provided in the documentation/replacement-shaders package work on Android (they do work with GL2/DX on Mac/Windows, though).
I’ve looked around in the forum, Answers and the docs, and all the info I could find was either confusing or simply didn’t work. So my question is simple: as of today, with the latest Unity Pro version, is there a way to natively access a camera’s depth texture in Unity on Android?
I don’t care whether it’s through a RenderTexture, a native GLES texture handle (I’d love that one, actually), an ugly workaround or whatever, I just need that thing.
Any help much appreciated, I’m starting to go quite mental over this problem.
Now, what do you mean by “natively access”?
If you’re fine with getting the GLES texture id, I’d advise doing something like this (yes, it’s a bit clunky and not awesome, but oh well):
The main problem is that currently a RenderTexture will return you the color texture id (well, not really, but stay with me for a sec). So what you can do is (quick sketch below):
1. create a color-only RT: RenderTexture rtColor
2. create a depth-only RT (a depth-only RT should return its depth surface id): RenderTexture rtDepth
3. tell your camera to render color and depth into different textures:
Camera.SetTargetBuffers(rtColor.colorBuffer, rtDepth.depthBuffer);
4. render your scene
5. now rtDepth.GetNativeTextureID() should be the id of your depth texture.
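Putting those steps together, a minimal sketch could look roughly like this (a hypothetical helper script attached to the camera in question; the class name, sizes and where you hand the id off are just for illustration):

using UnityEngine;

// hypothetical helper, attached to the camera whose depth we want
public class DepthTextureGrabber : MonoBehaviour
{
    RenderTexture rtColor, rtDepth;

    void Start()
    {
        // color-only RT (0 depth bits)
        rtColor = new RenderTexture(Screen.width, Screen.height, 0, RenderTextureFormat.ARGB32);
        // depth-only RT: its depth surface is what we are after
        rtDepth = new RenderTexture(Screen.width, Screen.height, 24, RenderTextureFormat.Depth);

        // make sure the GL textures actually exist before we start grabbing ids
        rtColor.Create();
        rtDepth.Create();

        // render color and depth into two different textures
        GetComponent<Camera>().SetTargetBuffers(rtColor.colorBuffer, rtDepth.depthBuffer);
    }

    void OnPostRender()
    {
        // after the camera has rendered, this should be the GLES id of the depth texture
        int depthTexId = rtDepth.GetNativeTextureID();
        // ...pass depthTexId to the native plugin from here
    }
}

Exactly where you grab the id (OnPostRender, a coroutine after WaitForEndOfFrame, etc.) depends on when your plugin needs it; the important bit is that the RTs exist and the camera is rendering into them.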
I remember there were some bugs around this, but you can try it and bug-report (with a repro) whatever issues you hit (drop the case number here).
EDIT: be warned, though: as long as you request 24-bit depth we actually try to create D24S8, so please check the OES_packed_depth_stencil extension for caveats.
I get the texture ID from the Start() method, before actually rendering with the camera. Interestingly enough, I consistently get a value of 15 for the texture id. So yeah… that doesn’t sound right.
I’ll get back to it on Monday and refactor my code so it’s painless to pass the id after rendering with the depth camera.
In the meantime, feel free to tell me if I’ve already fucked something up so far.
Hm, that’s actually a corner case here: we do manually call Create() if needed on Camera.targetTexture, but when you use the buffers directly I’d say you’re better off calling Create() yourself.
Also, you can use GetNativeTexturePtr, which returns the GLuint already cast to an IntPtr.
Why is the id constant? For a usual RenderTexture there’s no need for double-buffering magic or the like, so yes, it should actually be constant 8). If you mean it doesn’t change even before Create: again, GL “does not care” whether the texture has been created yet; the moment we call glGenTextures (which we do in the RT ctor, iirc) you have an id you can use. It gets fully initialized later, in Create.
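If the native side expects a pointer-sized handle, the calls look something like this (reusing the rtDepth from the sketch above):

// GLuint wrapped in an IntPtr, convenient to pass straight into a native plugin
System.IntPtr depthTexPtr = rtDepth.GetNativeTexturePtr();
// the same id as a plain int, if you prefer
int depthTexId = rtDepth.GetNativeTextureID();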
So, I tried several things and I’m still confused.
My depth RenderTexture’s format was set to RenderTextureFormat.Default, which resulted in it being full of garbage, so I set it to RenderTextureFormat.Depth. That worked in the editor but gave me “RenderTexture.Create failed: format unsupported” at runtime, regardless of the number of depth-buffer bits passed to the ctor. I reckon that’s related to my device lacking the OES_packed_depth_stencil extension?
For the record, this method most likely works, provided your device supports the GL_OES_depth_texture extension, which you can check with SystemInfo.SupportsRenderTextureFormat(RenderTextureFormat.Depth) (quick check sketched below).
That’s not the case on Tegra, at least up to Tegra 3.
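So if you go this route, it’s probably worth gating the whole setup on that capability check, something along these lines:

if (SystemInfo.SupportsRenderTextureFormat(RenderTextureFormat.Depth))
{
    // safe to create the depth-only RT and use Camera.SetTargetBuffers as above
}
else
{
    // e.g. Tegra 2/3: fall back to something else, such as a replacement-shader
    // pass that writes depth into a regular color RT
}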