URP Camera Post-Processing to Render Texture - scene editor blows up!

Hello World!

I’m having a problem setting up URP for my project. I’m upgrading my Fog Of War from built-in to URP and found that it no longer works because camera clear flags no longer support “Don’t Clear”, so I’m now resorting to creating a Render Feature and shader that will do what I want. I’m almost done and trying to test my render feature/shader, but when I set Project Settings\Graphics\Scriptable Render Pipeline Settings to my URP Asset and added my Render Feature to my URP Asset’s Renderer Data settings, nothing happened. However, when I went to Project Settings\Quality and set Render Pipeline Asset to my URP Asset, my scene editor went nuts and I could see the effect I was going for being painted onto the scene editor itself. What the heck?! I can see it doing that to my Game output, but why on earth would it mess with the rendering of my scene editor?

Please pardon my immense ignorance on the subject, I’m new to URP and have a lot to learn ahead of me, but here are a few questions:

  1. Why did it blow up my scene editor? That is, why did it affect the rendering of my scene editor? It treated my scene editor as if it were the output render texture. To me that’s akin to blowing up my Visual Studio editor’s rendering; it makes no sense to me. That editor is only viewed by me, the programmer, to edit the game, just like the scene editor; nothing I do in the Unity rendering pipeline should have any effect on the rendering of my tools, only on the things my tools produce. I’m twilight-zone confused, lol!

  2. Obviously I must be doing something wrong. I just want to render the output of my main camera to a render texture, but do some post-processing (via Render Feature & Shader) in between. How can I set this up so it affects only that one camera and nothing else?

  3. This whole thing depends on my being able to:
    A) initialize a render texture;
    B) assign this render texture as an input to my shader, the other texture input being whatever the camera sees;
    C) have the camera send my shader output to that very same render texture.

I’m using the same render texture in order to update the previous fog of war with new info. Is using the same render texture as both input and output feasible? I’m doing it this way in order to simulate the Don’t Clear flag.
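For what it’s worth, URP render features run for every camera that uses the renderer — including the Scene-view and preview cameras — which is one reason an effect can end up painted onto the editor itself. Below is a minimal sketch of a render feature that skips non-Game cameras and blits the camera colour through a material into a persistent render texture. The names (`FogOfWarFeature`, `FogOfWarPass`) are hypothetical, and the exact colour-target API varies between URP versions, so treat this as a sketch rather than a drop-in implementation:

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

// Hypothetical render feature: blits the camera colour through a
// fog-of-war material into a persistent RenderTexture, and skips
// Scene-view/preview cameras so the effect never touches the editor.
public class FogOfWarFeature : ScriptableRendererFeature
{
    class FogOfWarPass : ScriptableRenderPass
    {
        public Material fogMaterial;      // e.g. a URP Unlit Shader Graph material
        public RenderTexture fogTexture;  // persistent FOW accumulation target

        public override void Execute(ScriptableRenderContext context,
                                     ref RenderingData renderingData)
        {
            // Only run for the Game camera. Scene-view and preview cameras
            // also use this renderer, which is why the editor "blew up".
            if (renderingData.cameraData.cameraType != CameraType.Game)
                return;

            CommandBuffer cmd = CommandBufferPool.Get("Fog Of War");
            // Read the camera colour, apply the shader, write to the FOW texture.
            // (Property name differs in newer URP versions, e.g. cameraColorTargetHandle.)
            cmd.Blit(renderingData.cameraData.renderer.cameraColorTarget,
                     fogTexture, fogMaterial);
            context.ExecuteCommandBuffer(cmd);
            CommandBufferPool.Release(cmd);
        }
    }

    [SerializeField] Material fogMaterial;
    [SerializeField] RenderTexture fogTexture;
    FogOfWarPass pass;

    public override void Create()
    {
        pass = new FogOfWarPass
        {
            fogMaterial = fogMaterial,
            fogTexture = fogTexture,
            renderPassEvent = RenderPassEvent.AfterRenderingTransparents
        };
    }

    public override void AddRenderPasses(ScriptableRenderer renderer,
                                         ref RenderingData renderingData)
    {
        renderer.EnqueuePass(pass);
    }
}
```

The `cameraType` check is the key part here: without it, the Scene-view camera runs the pass too, and the effect gets drawn over the editor viewport.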

Many thanks in advance!

Anthony

URP Asset Settings:

URP Asset Renderer Settings:

Here, my URP Asset has been assigned to Project Settings\Graphics\Scriptable Render Pipeline Settings, but it doesn’t seem to be in effect. I tried changing my camera’s renderer to my URP Asset’s renderer, but it doesn’t give me the option and still uses the URP High Fidelity one instead:

That didn’t seem to work, so I assigned my URP Asset to Project Settings\Quality\Render Pipeline Asset (which has the effect of assigning it to my camera as well by default), and my editor blew up! It looks like my shader is being applied to the Scene Editor itself!:

My Shader:
FOW Shader.shadergraph.zip (5.5 KB)

Posting some screenshots of this blowup should help us understand what the issue is.

You’re right! Updated my post.

You may have a couple of simultaneous issues here.

First, you didn’t set up URP correctly; for example, I don’t see a Volume GameObject in the scene. Instead of figuring out what you forgot (you can do that later, when you know more about this topic), just create a URP project from the Hub; that works out of the box, with everything where it needs to be.

Second, you need URP-compatible shaders, otherwise it won’t work, so check whether your shaders are written for URP. There is a converter in Unity, but it may or may not work for custom shaders; usually it doesn’t.

I’ll do this. In the meantime, the shader you see is something I created in the ShaderGraph; I attached it to original post. It’s a URP Unlit Shader Graph.

Okay, thank you for your advice, you were right, and it seems that on top of what you already mentioned, I hadn’t added the forward renderer to the existing pipeline’s renderer list, which is why I was unable to choose it as a renderer on the camera. Now that I’ve added it and changed the camera renderer to the forward renderer, it’s working (albeit not exactly); there was no need to change anything in the Project Settings\Quality tab, which is what led to the blowup in the first place.

That being said, it’s working but not exactly, because the parts of the render texture that should be transparent are not. The shader is definitely outputting alpha, but somewhere in the post-processing it’s not respecting the alpha channel and the render texture ends up completely opaque.

Hmmm… after fiddling around a bit, I found that turning off post-processing on the camera did the trick; transparency is back and it’s all working as expected. It’s applying the shader in the render feature, but my camera’s post-processing is off… I don’t understand, but I’ll take it!

Thank you so much @altepTest!

Anthony

ps: Spoke too soon; it seemed to work, but as soon as I moved the FOV or ran the project, it stopped working. At least I think I now have URP set up right, so that’s out of the way.
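One thing that might be worth checking for question 3: sampling a render texture while also rendering into it in the same pass is undefined behaviour on most GPUs, so accumulating into a single texture can appear to work in one situation and silently break in another. The usual workaround is to ping-pong between two textures. Here is a hedged sketch (hypothetical names, assuming a MonoBehaviour drives the blit and `_PreviousFog` is a texture property exposed by the Shader Graph):

```csharp
using UnityEngine;

// Ping-pong sketch: accumulate fog of war across frames without
// sampling and rendering to the same texture in a single blit,
// which is undefined behaviour on most GPUs.
public class FogOfWarPingPong : MonoBehaviour
{
    [SerializeField] Material fogMaterial; // the FOW Shader Graph material
    RenderTexture fowA, fowB;

    void Awake()
    {
        fowA = new RenderTexture(512, 512, 0, RenderTextureFormat.ARGB32);
        fowB = new RenderTexture(512, 512, 0, RenderTextureFormat.ARGB32);
    }

    void LateUpdate()
    {
        // The previous frame's fog goes in as a shader input...
        fogMaterial.SetTexture("_PreviousFog", fowA);
        // ...and the updated fog is written to the *other* texture.
        Graphics.Blit(null, fowB, fogMaterial);
        // Swap so the next frame reads what was just written,
        // simulating the old "Don't Clear" accumulation.
        (fowA, fowB) = (fowB, fowA);
    }
}
```

The swap is what simulates Don’t Clear: each frame reads last frame’s result and writes a fresh texture, so no texture is ever read and written in the same operation.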

I’ve checked now, and the Unlit alpha works; the Lit one doesn’t want to work. I don’t remember how it actually needs to be set up. I enabled Alpha Clipping and that made it work, but I have no idea if that is the correct workflow.

Try using the Sprite Lit and Sprite Unlit shaders; transparency works for me with those two.