I’m curious what the draw order is when using render targets, because I’m seeing some weird behavior in Unity 4.3. Specifically, when a camera renders transparent objects and also has a post-processing effect, the draw order gets messed up.
My setup: Camera 1 renders opaque and transparent objects, and an image filter on Camera 1 then combines its output with Camera 2 (which renders its own objects).
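For reference, here is a minimal sketch of that setup. This is not my actual code; the class name, the `combineMaterial` field, and the `_Camera2Tex` shader property are placeholders for whatever compositing shader is in use:

```csharp
using UnityEngine;

// Attached to Camera 1. Camera 2 is redirected into a RenderTexture,
// and OnRenderImage composites that texture over Camera 1's output.
public class CombineFilter : MonoBehaviour
{
    public Camera camera2;           // renders its own objects off-screen
    public Material combineMaterial; // placeholder compositing shader

    private RenderTexture camera2Target;

    void Start()
    {
        camera2Target = new RenderTexture(Screen.width, Screen.height, 24);
        camera2.targetTexture = camera2Target;
    }

    void OnRenderImage(RenderTexture src, RenderTexture dest)
    {
        // Hand Camera 2's output to the combine shader, then blit.
        combineMaterial.SetTexture("_Camera2Tex", camera2Target);
        Graphics.Blit(src, dest, combineMaterial);
    }
}
```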
When I disable the filter on Camera 1, everything looks fine; with the filter enabled, transparent objects are not rendered at all.
However, I found a small fix: have all transparent objects render in the Geometry+1 queue. If I add this to all transparent shaders, everything renders correctly! But that means modifying every transparent shader, including the built-in ones, to make everything work, which doesn’t sound feasible. It seems as though the render-target filter draw call happens BEFORE the transparent objects are drawn, which makes no sense to me. Is this a bug? Is it intended?
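For anyone hitting the same problem, the workaround is just the queue tag in the shader's SubShader block, something like this (a fragment, not a complete shader; the shader name is made up):

```shaderlab
Shader "Custom/TransparentQueueFix" {
    SubShader {
        // Transparent shaders normally use "Queue" = "Transparent" (3000).
        // Forcing them into Geometry+1 (2001) makes them draw before the
        // render-target filter kicks in.
        Tags { "Queue" = "Geometry+1" "RenderType" = "Transparent" }
        // ... rest of the transparent shader passes unchanged ...
    }
}
```

If I understand the API correctly, the same override can also be applied from script via `material.renderQueue = 2001;` without editing the shader source, which would at least avoid touching the built-in shaders, but I haven't verified that path on 4.3.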