Correct way to initialize Texture3D for use as Texture3D<float> in Compute Shader?

I’m trying to find the right way to initialize a C# Texture3D object with float values between 0 and 1 so that it can be passed to a compute shader as a Texture3D<float> (specifically not a Texture3D<float4>) object. To be honest, I haven’t experimented much, because learning this kind of stuff by trial and error is nearly impossible. I know that what I really need is a single-channel texture, and I’ve found TextureFormat.Alpha8 and RFloat, but the documentation is so sparse it’s practically useless, so I can’t tell whether those actually make the texture single-channel. If I initialize with one of those and pass the texture to the shader, will it work as I expect?


Those are both single-channel formats, although with very different levels of precision. If you only want values from [0,1] with decent precision, then something like R16 is probably more appropriate.

Under the assumption that this is just a fixed input 3D texture, initialisation is also easy enough with SetPixels. If you actually want to write to the texture in the compute shader, then you’d want to use render-textures instead.

SetPixels()
https://docs.unity3d.com/ScriptReference/Texture3D.SetPixels.html

The documentation is a little iffy there, but generally if Unity supports the creation of the texture format, you can set it using SetPixels, regardless of what the documentation may say. The main caveat is you can’t use SetPixels on compressed texture formats.
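Something like this is roughly what I have in mind (a minimal sketch only; the texture size and source data are placeholders, and I’m assuming R16 here):

```csharp
using UnityEngine;

public static class VolumeTextureUtil
{
    // Builds a single-channel 3D texture from float values in [0, 1].
    // "size" and "values" are placeholders for whatever your data looks like.
    public static Texture3D Create(int size, float[] values)
    {
        // R16: one channel, 16 bits, normalized, so values are clamped to [0, 1].
        var tex = new Texture3D(size, size, size, TextureFormat.R16, false);
        tex.wrapMode = TextureWrapMode.Clamp;

        var colors = new Color[size * size * size];
        for (int i = 0; i < colors.Length; i++)
            colors[i] = new Color(values[i], 0f, 0f, 0f); // only the red channel is stored

        tex.SetPixels(colors);
        tex.Apply(updateMipmaps: false); // upload to the GPU
        return tex;
    }
}
```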

R16 is useful since it holds values between 0.0 and 1.0 with linear precision. Another alternative would be RHalf, which is also only 16 bits like R16, but is a proper floating-point format that can store negative values if needed.

R8, R16, RHalf, and RFloat should all work with Texture3D. I’m not sure about Alpha8.
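Binding it to the compute shader would then look something along these lines (again just a sketch; the shader asset, kernel name, property name, and thread group counts are all made up and need to match your own setup):

```csharp
using UnityEngine;

public class VolumeDispatchExample : MonoBehaviour
{
    // Placeholder references/names, purely for illustration.
    public ComputeShader shader;   // assumed to contain a kernel called "CSMain"
    public Texture3D volume;       // e.g. built with the snippet above

    void Start()
    {
        int kernel = shader.FindKernel("CSMain");

        // The HLSL side is assumed to declare a read-only
        //     Texture3D<float> _Volume;
        // and read it with _Volume[id.xyz] or _Volume.Load(int4(id.xyz, 0)).
        shader.SetTexture(kernel, "_Volume", volume);

        // Thread group counts are placeholders; match them to your
        // [numthreads(...)] and the volume dimensions.
        shader.Dispatch(kernel, 4, 4, 4);
    }
}
```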


I hope it’s not against etiquette to resurrect an old thread, but I seem to be having trouble reading any Texture<float> (2D or 3D) in a compute shader. I’ve tried to access R8, R16, RHalf, and Alpha8 textures with the [ ] operator. Previously, with Alpha8, I’d tried that operator as well as sampling and the Texture.Load() method (and stepping through the code with RenderDoc in case that shed any kind of light)… I just don’t seem to be able to extract any value other than zero from a single-channel texture in a compute shader (I should do more testing in non-compute shaders to see whether that is relevant).

edit: to be clear, equivalent code declared as Texture2D<float4> works, whereas Texture2D<float> doesn’t. I’m ultimately trying to get things working with Texture3D, and had previously assumed that my problems were to do with Texture3D, but it seems the <float> declaration is the problem (at least for me, on both Windows and Mac, across various Unity versions…).