Get high precision output from shader

I want to render high precision float values to a texture, but the values I read back are clamped to [0, 1]. Is there any way to get unclamped values?

Here is the fragment portion of the shader code:

float4 frag (v2f i) : SV_TARGET
{
    return float4(0.098, 0.698, 200, 100000); // Obviously a contrived example.
}

Here is the code creating the texture to render to:

// Request a temporary 1x1 float render target and make it active so ReadPixels can read from it.
RenderTextureDescriptor rtd = new RenderTextureDescriptor(1, 1, RenderTextureFormat.ARGBFloat);
RenderTexture rt = RenderTexture.GetTemporary(rtd);
RenderTexture.active = rt;

// Render the scene with the replacement shader, then restore the camera's normal shaders.
mainCamera.SetReplacementShader(selectionShader, "");
mainCamera.Render(); // OnPostRender will be called
mainCamera.ResetReplacementShader();

Read code (in OnPostRender):

// Copy the active render texture into a CPU-readable Texture2D, then sample it.
Texture2D hitTexture = new Texture2D(1, 1, TextureFormat.RGBAFloat, false);
Rect rect = new Rect(0, 0, 1, 1);
hitTexture.ReadPixels(rect, 0, 0, false);

Color pixelSample = hitTexture.GetPixel(0, 0);
Debug.Log("Sample: " + pixelSample);

The output is “RGBA(0.098, 0.698, 1.000, 1.000)”, rather than “RGBA(0.098, 0.698, 200, 100000)” as hoped/expected.

Well, range and precision are not quite the same thing. ARGBFloat textures may store values in a 0–1 range, but at a precision of 32 bits per channel.

You can “normalize” values to their 0-1 range by dividing by a maximum value. Multiplying the result by the same maximum will restore the original range.

Normalize:
(0.098, 0.698, 200, 100000) / 100000 = (0.00000098, 0.00000698, 0.002, 1)

De-normalize:
(0.00000098, 0.00000698, 0.002, 1) * 100000 = (0.098, 0.698, 200, 100000)
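
To make that concrete, here's a minimal C# sketch of the round trip (RangeScaling and MaxValue are illustrative names of my own, not Unity API):

public static class RangeScaling
{
    // Arbitrary scale for this example; use the largest value you expect to store.
    const float MaxValue = 100000f;

    // Squash a raw value into the 0-1 range before writing it out.
    public static float Normalize(float raw)
    {
        return raw / MaxValue;
    }

    // Restore the original range after reading the pixel back.
    public static float Denormalize(float stored)
    {
        return stored * MaxValue;
    }
}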

No. ARGBFloat stores a signed 32-bit float per channel, with a range of roughly ±3.402 × 10^38.
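
A quick way to sanity-check that range in C# (float.MaxValue is the standard .NET constant):

// Prints 3.402823E+38, the top of the single-precision float range.
Debug.Log(float.MaxValue);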

I believe the problem is GetPixel itself, or maybe Color is clamping the value. I honestly do not know the solution. There's an issue on the Issue Tracker with this exact bug, but it was supposedly fixed in 5.0. What version of Unity are you using? There's an outside chance GetPixel is broken but GetPixels works.
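
If GetPixel is the culprit, the plural route is a one-line change. A sketch reusing the hitTexture from your read code above:

// Read the whole 1x1 texture as an array instead of sampling a single texel.
Color[] pixels = hitTexture.GetPixels();
Debug.Log("Sample via GetPixels: " + pixels[0]);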

Generally speaking though, if you’re reading back a texture on the CPU you’re “doing it wrong” for performance.

I’m using Unity 2017.1.0f3.

I'm aware of the performance cost of reading a texture back on the CPU. It's for raycasting in the absence of a collider.

You're right! It's the 8-bit flavour that has the 0–1 range. My apologies @AbleArcher, it's been a looong day :)

I did try GetPixels (plural) and GetRawTextureData. Both were clamped. I would be interested to see that issue in the Issue Tracker.
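
For reference, I decoded the raw bytes roughly like this (a sketch that assumes the texture really is TextureFormat.RGBAFloat, i.e. four 4-byte floats per pixel):

// RGBAFloat lays out each pixel as four 32-bit floats: R, G, B, A.
byte[] raw = hitTexture.GetRawTextureData();
float r = System.BitConverter.ToSingle(raw, 0);
float g = System.BitConverter.ToSingle(raw, 4);
float b = System.BitConverter.ToSingle(raw, 8);
float a = System.BitConverter.ToSingle(raw, 12);
Debug.Log(string.Format("Raw sample: ({0}, {1}, {2}, {3})", r, g, b, a));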

Anyways, thanks for your thoughts. For various reasons (including confusion about why I couldn’t find my question), I posted this question on answers.unity3d as well. I’m linking it here for those interested: