Hi there! I’ve been pulling my hair out all day trying to get a bunch of objects in HDRP to bake their fragments’ world positions into a texture, or a float4[,] array. I believe I have most of this working, but I want to be able to bake large numbers - specifically in the range of -360 to +360, plus a fair number of decimals, for X, Y, and Z.
I can get the objects I want to render properly in UV space (shader below). However, when the values reach the texture and I read them back, it appears that Unity is clamping the output between 0 & 59. I’m not sure where this is happening - whether it’s my rendering process or a limit of the RenderTexture itself. I sorta suspect it’s my rendering process, honestly, because this fails even when I’m using an R32G32B32_SINT texture, and supporting that format only to truncate the values would make absolutely no sense whatsoever.
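For context, the readback side looks roughly like this (a simplified sketch rather than my exact code - the RGBAFloat format and the names are placeholders for whatever variant I happen to be testing):

using UnityEngine;

public class BakeReadback : MonoBehaviour
{
    // Sketch only: the format and sizes are placeholders; I've also tried integer formats.
    public Color[] ReadBack(RenderTexture bakeRT)
    {
        // Copy the GPU render target into a CPU-readable Texture2D.
        var readTex = new Texture2D(bakeRT.width, bakeRT.height, TextureFormat.RGBAFloat, false);

        var previous = RenderTexture.active;
        RenderTexture.active = bakeRT;
        readTex.ReadPixels(new Rect(0, 0, bakeRT.width, bakeRT.height), 0, 0);
        readTex.Apply();
        RenderTexture.active = previous;

        // These are the values that come back clamped.
        return readTex.GetPixels();
    }
}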
Any idea?
The shader I’m using to render objects to a texture is below. I’m basically putting each object, one by one, on an exclusive layer, making sure it’s visible inside the frustum of a camera that has a RenderTexture as its target, and then manually calling Render() on that camera. If there’s a better way, I’m all ears. As far as I can tell this is just about the only way, because I need to rely on the rasterizer to get at the fragment positions… it doesn’t seem possible to do this from within a compute shader…??
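On the C# side, the per-object loop looks roughly like this (again a simplified sketch - the "BakeLayer" name, the bake material, and the camera setup are stand-ins for my actual scene):

using UnityEngine;

public class WorldPosBaker : MonoBehaviour
{
    public Camera bakeCamera;           // camera with the RenderTexture as its target
    public RenderTexture bakeTarget;    // texture the shader below renders into
    public Material bakeMaterial;       // material using the vert/frag shader below

    public void Bake(Renderer target)
    {
        // "BakeLayer" is a placeholder - any otherwise-unused layer works.
        int bakeLayer = LayerMask.NameToLayer("BakeLayer");

        int originalLayer = target.gameObject.layer;
        Material originalMaterial = target.sharedMaterial;

        // Isolate the object on its own layer so only it shows up in the bake camera.
        target.gameObject.layer = bakeLayer;
        target.sharedMaterial = bakeMaterial;

        bakeCamera.cullingMask = 1 << bakeLayer;
        bakeCamera.targetTexture = bakeTarget;
        bakeCamera.Render();    // manual render into the RenderTexture

        // Put everything back.
        target.sharedMaterial = originalMaterial;
        target.gameObject.layer = originalLayer;
    }
}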
v2f vert(appdata v)
{
    v2f o;

    // World-space position of the vertex, passed through to the fragment stage.
    float3 worldPos = mul(unity_ObjectToWorld, float4(v.vertex.xyz, 1.0)).xyz;
    o.color = float4(worldPos, 1.0);

    // Remap the UVs with the lightmap scale/offset, then into clip space
    // so the object is rasterized in UV space instead of screen space.
    float2 realUVBroh = v.uv * unity_LightmapST.xy + unity_LightmapST.zw;
    o.pos = float4(realUVBroh * 2.0 - 1.0, 1.0, 1.0);

    return o;
}
float4 frag(v2f i) : SV_Target
{
    // Test code: output an absurd, known value instead of the real world position.
    // The most recent attempt renders into an SINT render texture, hence the
    // * 1000 below, which is there to preserve a few decimal places.
    float4 outputColor = float4(300.0, -40.523, 355.23, 1.0);
    return outputColor * 1000.0;
}