Graphics.DrawProcedural from OnRenderObject

I’m trying to re-implement Microsoft’s “WpfD3DInterop” Kinect 1.8 sample project in Unity3d 4.5. This project uses DirectX11 to procedurally generate geometry based on the depth map returned by the Kinect sensor. As an added feature, I want the Kinect-generated geometry to be drawn relative to an empty GameObject in the world (representing the Kinect sensor’s translation/orientation).

As near as I can determine, the best way to go about re-implementing this is:

  1. Use a compute shader to transform the 1D short array returned by the Kinect into a RWTexture2D, and write that to a Unity RenderTexture.
  2. Rewrite the MS-supplied shader to use RWTexture2D instead of Texture2D (because I can bind a RWTexture2D using Graphics.SetRandomWriteTarget, but can’t bind a Texture2D).
  3. Rewrite the MS-supplied shader to use uniforms for the modelview-projection matrix and scale properties instead of a cbuffer, as Unity does not appear to have a built-in way to set cbuffer values.

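For reference, the dispatch side of step 1 might look something like the sketch below. This is only an illustration of the general pattern: the kernel name `CSMain` and the buffer/texture names `DepthData` and `Result` are placeholders, not names from the MS sample.

    // Hypothetical sketch of the step-1 dispatch; all names are illustrative.
    public ComputeShader depthCompute;
    private ComputeBuffer m_depthBuffer;
    private RenderTexture m_depthTexture;

    void Start()
    {
        // Structured buffer strides must be a multiple of 4 bytes, so store the
        // 16-bit depth samples as ints.
        m_depthBuffer = new ComputeBuffer(640 * 480, sizeof(int));
        m_depthTexture = new RenderTexture(640, 480, 0, RenderTextureFormat.RFloat);
        m_depthTexture.enableRandomWrite = true; // required for compute-shader writes
        m_depthTexture.Create();
    }

    void UpdateDepth(short[] depthData)
    {
        // Widen the shorts to ints to match the buffer stride.
        int[] widened = System.Array.ConvertAll(depthData, d => (int)d);
        m_depthBuffer.SetData(widened);

        int kernel = depthCompute.FindKernel("CSMain");
        depthCompute.SetBuffer(kernel, "DepthData", m_depthBuffer);
        depthCompute.SetTexture(kernel, "Result", m_depthTexture);
        depthCompute.Dispatch(kernel, 640 / 8, 480 / 8, 1); // assumes [numthreads(8,8,1)]
    }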
I have validated that step 1 is working by writing a much simpler shader that binds the RenderTexture as a sampler, converts the raw values to half values in [0…1], and renders the depth map to a quad.

I can’t validate steps 2 and 3, because I can’t seem to make anything draw procedurally. I’m including my OnRenderObject() code below – this GameObject also has a MeshFilter (containing a default Quad) and a MeshRenderer attached, and I’ve verified that OnRenderObject() is being called by inserting a Debug.Log into this snippet and observing the output.

    public Material computeMaterial;

    private RenderTexture m_depthTexture;

    void OnRenderObject()
    {
        Matrix4x4 mvp = Camera.current.projectionMatrix * Camera.current.worldToCameraMatrix * transform.localToWorldMatrix;
        computeMaterial.SetMatrix("ViewProjection", mvp);
        computeMaterial.SetVector("XYScale", new Vector4(1.2f, 1f));
        Graphics.SetRandomWriteTarget(1, m_depthTexture);
        Graphics.DrawProcedural(MeshTopology.Points, 640 * 480);
    }

and my shader contains

    RWTexture2D<int> txDepth : register(u1);

    // Constant buffer variables
    uniform matrix ViewProjection;
    uniform float4 XYScale;

I’m not getting any output, or any errors. Is anything obviously wrong with my setup? It’s entirely possible that I’m doing something ridiculously wrong with my drawing code, or that there are bugs in my shaders. Any suggestions appreciated!

I eventually resolved this set of bugs. There’s no single answer I can point to, since the behavior I saw was caused by several errors, but I’ll offer the following observations:

  1. It’s very easy to make typos when referencing compute shader kernels and named buffers from MonoBehaviours. These errors fail silently unless you look at the shaders’ debug output in the Unity inspector, and/or print debug output in your MonoBehaviour to make sure the values returned by calls like FindKernel() are sane.
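For example, a check like this (the kernel name `CSMain` is a placeholder) catches a mistyped kernel name early instead of letting a later Dispatch fail silently:

    // Hypothetical sanity check; "CSMain" is an illustrative kernel name.
    int kernel = depthCompute.FindKernel("CSMain");
    Debug.Log("FindKernel returned " + kernel); // verify this looks sane before dispatching
    if (kernel < 0)
        Debug.LogError("Kernel name mismatch - check for typos in the .compute file");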

  2. RGBAFloat colors are best bitmasked with 0xefff when you’re trying to convert them to unorm floats.

  3. Be very careful with indexing when iterating over 1D arrays that you want to treat as 2D.
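To illustrate the indexing pitfall: for a row-major 640×480 depth map, the 1D/2D mapping is the following (helper names are illustrative), and swapping x and y – e.g. writing `x * width + y` – silently transposes and corrupts the image rather than producing an error.

    // Row-major mapping between a 1D index and 2D coordinates for a 640x480 depth map.
    const int width = 640;

    int IndexOf(int x, int y) { return y * width + x; } // 2D -> 1D
    int XOf(int index) { return index % width; }        // 1D -> 2D
    int YOf(int index) { return index / width; }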