Blit to RenderTexture in DX11 Mode

[Solved] If you add ZTest Always to the shader used for the Blit, it will work like it does in DX9 mode. I assume the newly created fragments were treated as 'behind' the existing pixels, and that after rendering to the RenderTexture with a camera, the scene fragments were far enough away from the camera for the blit to pass the depth test.
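For reference, the fix is a single state change in the shader's Pass. A minimal sketch of a Unity 4-era blit shader with the fix applied (the shader name is a placeholder, not from this thread):

```
Shader "Hidden/BlitZTestAlways" {
    Properties {
        _MainTex ("Base (RGB)", 2D) = "white" {}
    }
    SubShader {
        Pass {
            // The fix: ignore the depth test so the fullscreen quad always writes,
            // regardless of what depth values the target's depth buffer holds.
            ZTest Always Cull Off ZWrite Off

            CGPROGRAM
            #pragma vertex vert_img
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;

            fixed4 frag (v2f_img i) : COLOR {
                return tex2D(_MainTex, i.uv);
            }
            ENDCG
        }
    }
}
```

ZTest Always (plus Cull Off / ZWrite Off) is the standard state for image-effect passes, which is presumably why camera image effects were unaffected.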

[Original Post]
In Unity 4's regular (DX9/OpenGL) mode, processing a texture with a shader is as simple as this:

RenderTexture.active = targetTexture; // Remind the engine which RenderTexture should be used. Blit only seems to set _MainTex and targetTexture if they were not yet set (feature/bug?)
testMaterial.SetTexture("_MainTex", sourceTexture);
Graphics.Blit(sourceTexture, targetTexture, testMaterial);

This renders a quad with testMaterial into targetTexture. Very useful when performing calculations on the GPU in DirectX 9 or OpenGL.

But when I restart Unity in DirectX 11 mode, targetTexture remains black after performing the blit, unless I render to it with a camera first.

Here’s code I used to test blitting differences in the regular Unity, and the Unity DX11 Renderer:

using UnityEngine;
using System.Collections;

public class BlitTest : MonoBehaviour {
    
    public Texture sourceTexture;
    public RenderTexture targetTexture;
    public Material testMaterial;
    
    public Camera someCamera;
    
    public void testBlit(){
        Debug.Log("Performing Test Blit");
        
        // All these properties remain the same after 'fixing' the renderTexture.
//        Debug.Log("Is Created: " + targetTexture.IsCreated());
//        Debug.Log("Power of 2: " + targetTexture.isPowerOfTwo);
//        Debug.Log("sRGB: " + targetTexture.sRGB);
//        Debug.Log("Depth: " + targetTexture.depth);
//        Debug.Log("AnisoLevel: " + targetTexture.anisoLevel);
//        Debug.Log("Filtermode: " + targetTexture.filterMode);
//        Debug.Log("Format: " + targetTexture.format);
//        Debug.Log("Color Buffer: " + targetTexture.colorBuffer);
//        Debug.Log("Depth Buffer: " + targetTexture.depthBuffer);
//        Debug.Log("Enable Random Write: " + targetTexture.enableRandomWrite);
//        Debug.Log("isVolume: " + targetTexture.isVolume);
//        Debug.Log("volume Depth: " + targetTexture.volumeDepth);
//        Debug.Log("wrapMode: " + targetTexture.wrapMode);
        
//        Debug.Log("Instance Id: " + targetTexture.GetInstanceID());
        
//        Debug.Log(targetTexture.width);// You'd say that at this point I'd learn to do proper profiling in Unity...
//        Debug.Log(targetTexture.height);
        
//        Debug.Log(targetTexture.GetNativeTextureID());
//        Debug.Log(targetTexture.GetNativeTexturePtr());
        
        RenderTexture.active = targetTexture; // Remind the engine which RenderTexture should be used. Blit only seems to set _MainTex and targetTexture if they were not yet set (feature/bug?)
        testMaterial.SetTexture("_MainTex", sourceTexture);
        Graphics.Blit(sourceTexture, targetTexture, testMaterial);
    }
    
    /**
     * In DX11 mode
     * For some reason it is not possible to render to RenderTextures with blit, unless they have been rendered to by a camera.
     **/
    public void prepareTexture(){
        someCamera.targetTexture = targetTexture;
        someCamera.Render();
        someCamera.targetTexture = null;        
    }
    
    /**
     * In DX11 mode
     * For some reason it is not possible to render to RenderTextures with blit, unless they have been rendered to by a camera. So to 'murder' the texture (put it back in the broken state), simply release and recreate it.
     **/
    public void murderRenderTexture(){
        targetTexture.Release();
        targetTexture.Create();
    }
    
}

(prepareTexture and murderRenderTexture are called from a button in an inspector. I’m incredibly proud of how consistent I was in naming them ;))
I tried to find a difference between a fresh RenderTexture and one that had been rendered to by a Camera, but I haven’t succeeded yet.

I guess the main way to use the GPU for calculations in DirectX 11 is compute shaders instead of regular shaders, but I was hoping to be able to reuse at least a portion of my shaders in both DX9 and DX11.
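For comparison, the DX11-only route would look something like this. A minimal sketch, assuming a compute shader with a kernel named CSMain writing to a RWTexture2D named Result (both names are assumptions, not from this thread):

```csharp
using UnityEngine;

public class ComputeBlitTest : MonoBehaviour {

    public ComputeShader computeShader;  // assumed to contain a kernel named "CSMain"
    public RenderTexture targetTexture;  // must be created with enableRandomWrite = true

    public void RunKernel() {
        int kernel = computeShader.FindKernel("CSMain");
        computeShader.SetTexture(kernel, "Result", targetTexture);
        // Dispatch one thread group per 8x8 pixel tile,
        // assuming [numthreads(8,8,1)] in the kernel.
        computeShader.Dispatch(kernel, targetTexture.width / 8, targetTexture.height / 8, 1);
    }
}
```

Note that this path requires enableRandomWrite on the RenderTexture and is DX11-only, so it doesn't help with sharing shaders between DX9 and DX11.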

What does a camera do to a RenderTexture that allows it to be written to, when Unity uses the DirectX 11 renderer?

I am seeing the same thing happening. Your post helped me work around the issue, which only happens in DX11.

I am also seeing my level run much slower when I set the editor to DX11 mode; in DX9 mode performance is several times better. Unfortunately, I don't want to run in DX9 mode because then I can't use DX11 features. (For these tests, the levels only use DX9 features in both modes.)

As scrawk points out in this thread, adding “ZTest Always” to your shader makes Blit work like it does in DX9.

Sounds like the same situation as on iOS: http://forum.unity3d.com/threads/169617-iOS-image-effect-trouble

Perhaps there is something more correct about setting ZTest Always, and the old APIs were just more tolerant.

It indeed seems like the same problem. I wonder if it's more correct or just different.
In any case, I think it would be useful to have this requirement mentioned in the documentation for Graphics.Blit.