Hello,
I am loading a geometry with a material from a Unity asset at runtime. The material has a normal map assigned. I now want to read the normal map from the material and save it to a JPG file on the PC.
Setup: I am running Unity 2018.3 with HDRP. The assigned normal map is uncompressed, i.e. in the importer settings of the texture, compression is set to None. The texture is, of course, marked as a normal map.
When I simply read the texture from the material and save it with the EncodeToJPG function, the saved texture is always wrong. After some research I found out that Unity stores the normal map in the DXT5 (DXT5nm) format. That means the red channel is moved into the alpha channel, and the blue channel is reconstructed from the red and green channels in the shader.
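For clarity, this is the per-pixel unpacking I mean, sketched in plain Python (the function name is my own; channel values are in [0, 1]):

```python
import math

def unpack_dxt5nm(r, g, b, a):
    """Unpack one DXT5nm texel (channels in [0, 1]) to a tangent-space normal.

    DXT5nm stores X in the alpha channel and Y in the green channel;
    the red and blue channels are unused. Z is reconstructed from X and Y
    using the unit-length constraint.
    """
    x = a * 2.0 - 1.0  # remap [0, 1] to [-1, 1]
    y = g * 2.0 - 1.0
    z = math.sqrt(max(0.0, 1.0 - x * x - y * y))  # clamp guards against rounding
    return (x, y, z)

# A "flat" texel (alpha = green = 0.5) decodes to the up vector (0, 0, 1)
print(unpack_dxt5nm(0.0, 0.5, 0.0, 0.5))
```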
Therefore, I wrote a simple compute shader that reads the normal map from the material as an RGBA texture and writes a new, unpacked normal map. For reconstructing the blue channel I found a formula on the internet. Furthermore, I read somewhere that compute shaders read input textures in gamma color space, so afterwards the color is converted to linear color space.
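To be precise about the gamma step, this is the usual power-law approximation I am relying on (plain Python, helper names are mine): decoding a gamma-encoded value to linear raises it to 2.2, and encoding a linear value to gamma raises it to 1/2.2.

```python
GAMMA = 2.2

def gamma_to_linear(c):
    """Decode a gamma-encoded channel value in [0, 1] to linear."""
    return c ** GAMMA

def linear_to_gamma(c):
    """Encode a linear channel value in [0, 1] to gamma."""
    return c ** (1.0 / GAMMA)

# The two conversions are inverses of each other
v = 0.5
print(linear_to_gamma(gamma_to_linear(v)))
```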
However, the result does not look the same as the original normal map. It looks similar, but much stronger. Both maps can be found in the attached files (the size information is not correct because it is only a cutout).
Here is the code of the compute shader:
#pragma kernel NormalConverter

// Input: the packed (DXT5nm) normal map.
// Result: a RenderTexture created with the enableRandomWrite flag
// and bound via cs.SetTexture.
Texture2D<float4> Input;
RWTexture2D<float3> Result;

[numthreads(1,1,1)]
void NormalConverter(uint3 id : SV_DispatchThreadID)
{
    float4 packednormal = Input[id.xy];
    float3 normal;
    normal.xy = packednormal.wy * 2 - 1;            // red is stored in alpha (w); remap [0, 1] to [-1, 1]
    normal.z = sqrt(1 - dot(normal.xy, normal.xy)); // reconstruct the blue channel
    normal = pow(normal, 1 / 2.2);                  // get linear color from the gamma-space read
    Result[id.xy] = normal;
}
The compute shader is invoked as follows:
private Texture2D ConvertNormalMap(Texture2D normalMap)
{
    ComputeShader normalMapConverter = Resources.Load<ComputeShader>("Shader/NormalMapConverter");
    int kernel = normalMapConverter.FindKernel("NormalConverter");

    // No depth buffer is needed for a compute target
    RenderTexture normalMapRenderTex = new RenderTexture(normalMap.width, normalMap.height, 0);
    normalMapRenderTex.enableRandomWrite = true;
    normalMapRenderTex.Create();

    normalMapConverter.SetTexture(kernel, "Input", normalMap);
    normalMapConverter.SetTexture(kernel, "Result", normalMapRenderTex);
    // One thread group per pixel, matching [numthreads(1,1,1)] in the shader
    normalMapConverter.Dispatch(kernel, normalMap.width, normalMap.height, 1);

    // Read the result back to the CPU (linear = true to skip color conversion)
    RenderTexture.active = normalMapRenderTex;
    Texture2D convertedNormalMap = new Texture2D(normalMap.width, normalMap.height, TextureFormat.RGB24, false, true);
    convertedNormalMap.ReadPixels(new Rect(0, 0, normalMap.width, normalMap.height), 0, 0);
    convertedNormalMap.Apply();

    RenderTexture.active = null;
    normalMapRenderTex.Release();
    return convertedNormalMap;
}
I do not know what else I could try. Can anybody please tell me what I am doing wrong? Thanks for any input!

