Very simple fragment shader produces different colors on different platforms

After rendering into a RenderTexture, reading it into a Texture2D with ReadPixels(), and querying the colors with GetPixels(), I found the result is platform dependent! With the same constant color (say 0.5,0,0,1) specified in the fragment shader, the results differ: desktop gives (0.498,0,0,1), Tegra2 gives (0.482,0,0,1) and (0.518,0,0,1) pixel-interlaced, and Mali400 gives (0.498,0,0,1) and (0.502,0,0,1) pixel-interlaced!

Desktop: (0.498,0,0,1)
[T2_desktop_256_capsule.png]

Tegra2: (0.482,0,0,1) and (0.518,0,0,1) pixel-interlaced!
[T2_tegra2_256_capsule.png]

Mali400: (0.498,0,0,1) and (0.502,0,0,1) pixel-interlaced!
[T2_mali400_256_capsule.png]

Tegra2 vs Desktop zoomed in

Source Code: T2.unitypackage (15.3 KB)

Shader "Custom/RenderT2" {
	Properties {
	}
	SubShader {
		Pass {
			CGPROGRAM
			#pragma vertex vert
			#pragma fragment frag
			#include "UnityCG.cginc"

			struct v2f {
				half4 pos : SV_POSITION;
			};

			v2f vert( appdata_full v )
			{
				v2f o;
				// use the object-space vertex xy directly as the clip-space position (z = 0, w = 1)
				o.pos = half4(float2(v.vertex.xy), 0, 1);
				return o;
			}

			fixed4 frag( v2f i ) : COLOR
			{
				// constant output color; the red channel should read back as exactly 0.5
				return fixed4(0.5f,0.0f,0.0f,1.0f);
			}

			ENDCG
		}
	}
}
using UnityEngine;
using System.Collections;
using System;

public class Script_T2 : MonoBehaviour {
	
	public bool debug_output_c = false; 
	public bool debug_output_png = false; 
	
	// Use this for initialization
	void Start () {
		camera.SetReplacementShader( Shader.Find( "Custom/RenderT2" ), "" );
	}
	
	// Update is called once per frame
	void Update () {
	
	}
	
	void OnRenderImage (RenderTexture source, RenderTexture destination)
	{
		Graphics.Blit(source, destination);
	
		RenderTexture output = camera.targetTexture;	//source
		RenderTexture.active = output;
				
		// dump render texture to a PNG file
		if(!debug_output_png)
		{
			Texture2D tex2D = new Texture2D(output.width, output.height, TextureFormat.RGB24, false);
			tex2D.ReadPixels(new Rect(0, 0, output.width, output.height), 0, 0, false);
			tex2D.Apply();
			
			byte[] byt = tex2D.EncodeToPNG();
			
			string path = Application.persistentDataPath;
			path = path.Remove(path.LastIndexOf('/')) + "/";
			System.IO.File.WriteAllBytes (path + DateTime.Now.ToString("yyyyMMddHHmmssfff") + ".png", byt);
			debug_output_png = true;
		}
		
		// dump render texture colors
		if(!debug_output_c)
		{
			Texture2D tex2D = new Texture2D(output.width, output.height, TextureFormat.RGB24, false);
			tex2D.ReadPixels(new Rect(0, 0, output.width, output.height), 0, 0, false);
			tex2D.Apply();
			
			Color[] ca = tex2D.GetPixels();
			
			// walk up the center column, one pixel per row (assumes a square render texture)
			for(int i=0; i<output.width; ++i)
			{
				int idx = output.width/2+i*output.width;
				print("ca[" + idx + "]: " + ca[idx]);
				//print("bilinear ca[" + idx + "]: " + tex2D.GetPixelBilinear(0.5f,((float)i+0.5f)/(float)output.width));
			}
			debug_output_c = true;
		}
		
		RenderTexture.active = null;
	}
}

I’m not sure whether this RenderTexture-to-Texture2D readback went wrong or something else is the problem. Could it be related to the half-pixel offset in D3D9? In any case, accurate color output is important for shader debugging, right?

More tests:

  1. Don’t render the camera into a texture; instead, give the capsule a material that uses the above shader, and call Application.CaptureScreenshot() in OnPostRender() on both desktop and Tegra2 (a minimal sketch of this test follows the list). The resulting PNG file on Tegra2 also suffers from the “pixel-interlaced” colors.

  2. Change the constant color in the shader from (0.5f,0.0f,0.0f,1.0f) to (0.2f,0.0f,0.0f,1.0f), so that the red channel maps to the exact integer 51 out of 255. The color on desktop is (51/255,0,0,1), while the color on Tegra2 is not pixel-interlaced but the inaccurate value (49/255,0,0,1).
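
For reference, here is a minimal sketch of test 1, assuming the script is attached to the camera and the capsule already has a material using Custom/RenderT2 (the class name and output file name are made up):

using UnityEngine;

// Attach to the camera. Captures one screenshot right after the scene has been
// rendered, so the PNG shows what the fragment shader wrote to the back buffer.
public class Script_T1_Screenshot : MonoBehaviour {

	private bool captured = false;

	void OnPostRender ()
	{
		if (captured)
			return;

		// On Android the file ends up under Application.persistentDataPath.
		Application.CaptureScreenshot("T2_screenshot.png");
		captured = true;
	}
}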

This may be down to a 16-bit back buffer:

Start with the 8-bit red channel:
51/255 = 0.2

Convert to the 5-bit red channel of a 16-bit (R5G6B5) back buffer:
0.2 * 31 = 6.2
This gets rounded to 6, giving you 6/31 = 0.1935

Convert back to an 8-bit red channel when dumping out the PNG:
0.1935 * 255 = 49.3548
This gets rounded to 49, giving you 49/255
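
The same round trip is easy to check in code. This is just a sketch of the arithmetic above (the class and method names are made up), not anything Unity exposes directly:

using UnityEngine;

public class Quantize16BitCheck : MonoBehaviour {

	// Quantize a shader output value to a 5-bit channel (16-bit back buffer)
	// and convert it back to the 8-bit value that ends up in the PNG.
	static int To8BitVia5Bit (float shaderValue)
	{
		int q5 = Mathf.RoundToInt(shaderValue * 31f);	// 0.2 * 31 = 6.2 -> 6
		float back = q5 / 31f;				// 6 / 31 = 0.1935
		return Mathf.RoundToInt(back * 255f);		// 0.1935 * 255 = 49.35 -> 49
	}

	void Start ()
	{
		Debug.Log(To8BitVia5Bit(0.2f));	// prints 49, matching the Tegra2 result from test 2
	}
}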

Edit: I imagine that the pixel interlacing/dithering works by biasing the threshold that determines if a value gets rounded up or down. Adjacent pixels will have different biases.
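
That idea can be mimicked with a per-pixel rounding bias. This is purely illustrative (the real hardware dither pattern is unknown, and the checkerboard bias below is made up), but it reproduces both observations: 0.5 alternates between 15/31 and 16/31, which read back as 123/255 (0.482) and 132/255 (0.518), while 0.2 lands on 6/31 from either side and reads back as a uniform 49/255.

using UnityEngine;

public class DitherSketch : MonoBehaviour {

	// Quantize to 5 bits with an alternating rounding bias, then convert back to 8 bits.
	static int To8BitVia5BitDithered (float shaderValue, int x, int y)
	{
		float bias = ((x + y) % 2 == 0) ? -0.25f : 0.25f;	// hypothetical 2x2 checker pattern
		int q5 = Mathf.Clamp(Mathf.RoundToInt(shaderValue * 31f + bias), 0, 31);
		return Mathf.RoundToInt(q5 / 31f * 255f);
	}

	void Start ()
	{
		Debug.Log(To8BitVia5BitDithered(0.5f, 0, 0));	// 123, i.e. 0.482
		Debug.Log(To8BitVia5BitDithered(0.5f, 1, 0));	// 132, i.e. 0.518
		Debug.Log(To8BitVia5BitDithered(0.2f, 0, 0));	// 49 either way
	}
}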