[Edit] Solution:
Go to the camera that is drawing to the render texture, enable Custom Frame Settings and turn “Postprocess” off under the Rendering foldout. This only seems to be a problem in HDRP because post-processing is enabled by default there, and I guess it doesn’t let negative values through.
I’m trying to render particles with positive and negative colors to a render texture for 2D displacement, but the render texture appears to only receive or store positive values. The shader used by the particle system should be outputting colors darker than black (I tried outputting a constant (-1, -1, -1) to the color output to be sure), and the render texture I’m using is R16G16_SFloat, so I’m not sure what could be preventing the camera from rendering negative values.
Any ideas? I’m new to using render textures so it may be something basic I’m missing.
Here’s the shader used by the particle system. It’s based on the shader used to do render-texture-based displacement in this writeup. I’m trying to do something very similar to the writeup, but I’m adding velocity to particles rather than moving vertices.
To be absolutely certain negative values weren’t working, I output a constant (-10, -10, -10, -10) to the color output in Shader Graph. I also tried (-10, -10, -10, 1) in case a negative alpha might be messing something up. Either way, the particle velocity remained untouched when the shader output negative colors.
Here’s the actual shader I’m trying to use. As you can see, particles are only thrown up and to the right, even though particles that land on the left should get thrown to the left.
OK, the solution is to go to the camera that is drawing to the render texture, enable Custom Frame Settings and turn “Postprocess” off under the Rendering foldout.
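If you need to do the same from a script (for example when the capture camera is created at runtime), here’s a minimal sketch assuming HDRP’s HDAdditionalCameraData scripting API; the exact member names can differ between HDRP versions, so treat it as a starting point rather than the exact fix:

```csharp
using UnityEngine;
using UnityEngine.Rendering.HighDefinition;

// Attach to the camera that renders into the render texture.
// Mirrors ticking "Custom Frame Settings" and unchecking "Postprocess"
// under the Rendering foldout in the Inspector.
[RequireComponent(typeof(HDAdditionalCameraData))]
public class DisablePostprocessOnCaptureCamera : MonoBehaviour
{
    void OnEnable()
    {
        var data = GetComponent<HDAdditionalCameraData>();

        // Use this camera's own frame settings instead of the defaults.
        data.customRenderingSettings = true;

        // Mark the Postprocess field as overridden, then disable it.
        data.renderingPathCustomFrameSettingsOverrideMask.mask[
            (uint)FrameSettingsField.Postprocess] = true;
        data.renderingPathCustomFrameSettings.SetEnabled(
            FrameSettingsField.Postprocess, false);
    }
}
```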
Hi @keenanwoodall!
I am in a similar situation to yours. I am trying to read the RT from Shader Graph, and it looks like there are no negative values there.
I did disable post-processing on the camera.
Have you tried sampling the RT from a Shader Graph? Did it work as expected?
Edit: No, after checking the content of the RT itself, it looks like it doesn’t contain any negative values, so it is the same issue you had. But for some reason, disabling post-processing on the camera doesn’t cut it for me.
Is the render texture set to a format that supports negative values? I just made a test project and was able to get it to work. I will note that I was getting a ‘graphicsFormat != kFormatNone’ error on some color formats, but R16G16B16A16_SFLOAT seems ok.
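For reference, here’s a minimal sketch of creating the RT from script with a signed float format (assuming you build it at runtime; the same formats are available in the render texture asset’s Color Format dropdown, and the class/method names below are only illustrative):

```csharp
using UnityEngine;
using UnityEngine.Experimental.Rendering;

// Creates a render texture with a signed float format so negative
// channel values are preserved. R16G16_SFloat is enough for a 2D
// displacement (x/y) map; R16G16B16A16_SFloat also works.
public static class DisplacementRT
{
    public static RenderTexture Create(int size)
    {
        var rt = new RenderTexture(size, size, 0, GraphicsFormat.R16G16_SFloat);
        rt.Create();
        return rt;
    }
}
```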
Thanks for answering! The RT format was correct, but I finally figured out what my issue was: the Color Buffer Format in our project’s HDRP settings was set to R11G11B10, which is unsigned.
I have a camera with every custom frame setting disabled (including post-processing), rendering into an R32_SFloat render texture as its color buffer, with no depth buffer (and no depth test). The material is transparent additive with no depth test, so every fragment should get added into the color buffer.
My material renders through the HDRP/Unlit shader’s Base Color input.
Why is it important to set up the Color Buffer Format in the HDRP rendering section? Shouldn’t the HDRP/Unlit shader automatically render as R32_SFloat, since I set that as the render target of my camera?
BTW, it seems no signed formats are available in that dropdown.
I have a similar issue and can’t fix it. (The camera’s post-processing is disabled, the render texture cube has a signed format, and I changed the color buffer format of HDRP; none of it helps.)
However, it’s easier as a workaround to just convert your negative values into positive values before using them.
For example, I need a cubemap to record the world normal of the scene from the view of a position.
I’ll draw the world normal (from -1 to 1) as a positive value (from 0 to 1), then convert it back when sampling the cubemap.
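As a quick sketch, that remap is just a multiply/add in each direction (in Shader Graph a Remap node does the same thing); the class and method names here are only illustrative:

```csharp
using UnityEngine;

// Pack a signed direction into the 0..1 range an unsigned buffer can
// store, then unpack it after sampling the cubemap.
public static class SignedEncode
{
    // [-1, 1] -> [0, 1], applied when writing to the render texture.
    public static Vector3 Encode(Vector3 n) => n * 0.5f + Vector3.one * 0.5f;

    // [0, 1] -> [-1, 1], applied to the sampled value when reading back.
    public static Vector3 Decode(Vector3 c) => c * 2f - Vector3.one;
}
```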