Hi, I’ve been writing some code to update lightmap textures with the GPU.
There is a problem, though, in converting the RenderTexture that the GPU outputs into a Texture2D to use as a lightmap.
When using RenderTextures for other things, e.g. meshes, the texture in the mesh’s material is declared as a ‘Texture’, not a ‘Texture2D’, meaning you can just assign the RenderTexture to it and it works (RenderTexture is a subclass of Texture). But because the lightmaps are declared as ‘Texture2D’ rather than ‘Texture’, the only way to get the pixel data into them is by doing a ReadPixels on the RenderTexture… and that is so slow it completely nullifies any gain from using the GPU to update the textures.
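The two cases described above can be sketched like this (the component layout, field names, and the ARGB32 format are assumptions to match the post, not code from the thread):

```csharp
using UnityEngine;

public class LightmapReadbackSketch : MonoBehaviour
{
    public RenderTexture gpuLightmap;   // output of the GPU pass (assumed ARGB32, no depth)
    Texture2D lightmapCopy;

    void Start()
    {
        // This works: Material texture slots are typed as Texture, and
        // RenderTexture derives from Texture, so direct assignment is fine.
        GetComponent<Renderer>().material.mainTexture = gpuLightmap;

        // This is not possible for lightmaps: the lightmap array holds
        // Texture2D, so a RenderTexture cannot be assigned to it directly.
        lightmapCopy = new Texture2D(gpuLightmap.width, gpuLightmap.height,
                                     TextureFormat.ARGB32, false);
    }

    void Update()
    {
        // The slow workaround: read the pixels back from the GPU...
        RenderTexture.active = gpuLightmap;
        lightmapCopy.ReadPixels(new Rect(0, 0, gpuLightmap.width, gpuLightmap.height), 0, 0);
        lightmapCopy.Apply();   // ...and upload them to the GPU again
        RenderTexture.active = null;
    }
}
```

The cost comes from ReadPixels forcing a GPU-to-CPU sync every frame, which stalls the pipeline.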
Is there some other way of doing this? Can Unity make lightmap textures be ‘Texture’ instead of ‘Texture2D’ in the next release maybe? Or make some kind of constructor for Texture2D that takes a RenderTexture? Or is this something I can do somehow? The only other way I can think of is to write a new lightmap system + shaders to use Texture and not Texture2D… which seems a bit mad…
Or maybe there is some way to get the GPU to render directly into a Texture2D (it’s using an ARGB32 RenderTexture with no depth buffer at the moment, so in theory the same internal format)… Or is there some cast I can do to get the data into a Texture2D? Hmm…
Is there a way to hook into Unity before every mesh/batch is drawn, to alter the ‘unity_Lightmap’ (Texture) variable in the currently bound material/shader? At the moment, my app is spending 70-80% of its time in ReadPixels, just converting a RenderTexture to a Texture2D so it can go into the lightmaps array, only for the shaders to cast it back up to a Texture again.
I wonder if getting the address of the RenderTexture object through marshalling, then casting it from a RenderTexture to a Texture2D, might work… does anyone know? There must be some way of doing this; I just can’t figure it out.
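The “write a new lightmap system + shaders” fallback mentioned above can at least be sketched: bind the RenderTexture to a custom shader property instead of the built-in lightmap slot. The property name `_CustomLightmap` is hypothetical, and a matching custom shader that samples it in place of `unity_Lightmap` would still have to be written:

```csharp
using UnityEngine;

// Sketch of the custom-lightmap workaround: bypass Unity's built-in
// Texture2D lightmap array and feed the RenderTexture to the shaders
// through an ordinary texture property instead.
public class CustomLightmapBinder : MonoBehaviour
{
    public RenderTexture gpuLightmap;

    void Start()
    {
        foreach (var r in FindObjectsOfType<Renderer>())
        {
            // Material.SetTexture takes a Texture, so a RenderTexture
            // is accepted directly -- no ReadPixels needed.
            r.material.SetTexture("_CustomLightmap", gpuLightmap);
        }
    }
}
```

The obvious downside, as noted in the post, is having to duplicate the lightmapping shader code for every shader in the project.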
You seem to totally misunderstand what a RenderTexture is. A RenderTexture is a texture that exists only on the GPU; it does not exist in RAM at all, while a Texture2D exists in RAM and gets sent to the GPU for drawing.
That’s why casting from one to the other is not possible, and why ReadPixels is required: it’s what copies the pixels from the GPU back into RAM.
Hi… Thanks for taking the time to reply. I don’t need to get the pixels into RAM; I only want to use the RenderTexture as a lightmap when rendering. The reason I am having to do the ReadPixels/Apply is that I can’t just tell Unity to use the RenderTexture as a lightmap source texture, because the lightmap array of textures is declared as Texture2Ds instead of Textures, hence the conversion.
It’s possible to use a RenderTexture as a texture in materials just by setting it… this is because material properties are Textures (the parent class of both RenderTexture and Texture2D), not just Texture2D. But I can’t do that for the lightmaps, as they are declared as Texture2D and not Texture, even though the lightmapping shader ultimately sees it as a Texture (not a Texture2D)!
Well, now I need this as well. ChrisWalsh, did you ever find a solution?
And Dreamora, please read the question again. He didn’t ask why he can’t cast a RenderTexture to a Texture2D, but why the array of lightmaps is declared as Texture2D in the first place. Shaders are perfectly happy receiving Textures as texture input, so the lightmaps need not be Texture2Ds.
If the array were declared as Texture, we could just put RenderTextures in it, instead of having to update Texture2Ds via ReadPixels (slooow).
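For anyone hitting this later: newer Unity versions added `Graphics.CopyTexture`, which can copy a RenderTexture into a Texture2D entirely on the GPU, with no CPU readback. A minimal sketch, assuming an ARGB32 RenderTexture and a Unity version where `LightmapData` exposes `lightmapColor` (the field name has changed between versions):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

public class GpuLightmapCopy : MonoBehaviour
{
    public RenderTexture gpuLightmap;   // assumed ARGB32, same size as the target
    Texture2D lightmapTarget;

    void Start()
    {
        lightmapTarget = new Texture2D(gpuLightmap.width, gpuLightmap.height,
                                       TextureFormat.ARGB32, false);

        // Install the Texture2D as the scene's lightmap once;
        // afterwards only its GPU-side contents are updated.
        var data = new LightmapData { lightmapColor = lightmapTarget };
        LightmapSettings.lightmaps = new[] { data };
    }

    void Update()
    {
        // GPU-to-GPU copy: no ReadPixels, no pipeline stall.
        // Support depends on the platform, hence the capability check.
        if ((SystemInfo.copyTextureSupport & CopyTextureSupport.RTToTexture) != 0)
            Graphics.CopyTexture(gpuLightmap, lightmapTarget);
    }
}
```

Note that this only updates the GPU copy of the Texture2D; its CPU-side pixel data goes stale, which is fine here since the texture is only ever sampled by shaders.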