RenderTexture limitation with WebGL?

I have had a weird problem since we started with WebGL (January).

Is there a RenderTexture limitation with WebGL?

I have 2 render textures (assets, not temporary RenderTextures):

128x128 / ARGB32 / 24 bit depth buffer
1024x1024 / ARGB32 / 24 bit depth buffer

I render multiple meshes into them and then use them in a RawImage. On all other platforms the alpha works, but on WebGL there is no alpha on the 1024x1024 texture, while the alpha on the 128x128 texture is correct.

Is there a limitation on the size of render textures, or is this supposed to work?

There is no reason known to me why this should not work. I suggest filing a bug report with a repro case.

It’s not related to the size of the render texture. It was hard to reproduce from scratch: the problem happens with 2 cameras created at runtime. I filed a bug: 719438.
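
For reference, a minimal sketch of the kind of runtime setup that triggers it (names, sizes, and the transparent clear color are illustrative, not the exact repro project):

```csharp
using UnityEngine;
using UnityEngine.UI;

public class RuntimeCameraRepro : MonoBehaviour
{
    public RawImage smallImage;   // displays the 128x128 texture
    public RawImage largeImage;   // displays the 1024x1024 texture

    void Start()
    {
        // RenderTexture assets could be assigned instead; created here for brevity.
        var smallRT = new RenderTexture(128, 128, 24, RenderTextureFormat.ARGB32);
        var largeRT = new RenderTexture(1024, 1024, 24, RenderTextureFormat.ARGB32);

        // Two cameras created at runtime, each rendering into its own texture.
        CreateCamera("SmallCam", smallRT);
        CreateCamera("LargeCam", largeRT);

        smallImage.texture = smallRT;   // alpha correct on WebGL
        largeImage.texture = largeRT;   // alpha lost on WebGL
    }

    Camera CreateCamera(string name, RenderTexture target)
    {
        var cam = new GameObject(name).AddComponent<Camera>();
        cam.clearFlags = CameraClearFlags.SolidColor;
        cam.backgroundColor = new Color(0f, 0f, 0f, 0f); // transparent clear
        cam.targetTexture = target;
        return cam;
    }
}
```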

I’m stuck at that point. I have a second camera running which renders to Display 2, which is the target display of the render texture. Then when I export to WebGL, the browser just shows a black screen where the render texture should be.

Has anyone solved this problem?

There is no known issue related to render textures at this point. Does it work on GLES platforms?

This just happened to me as well: two cameras (but on the same display), and I get a black screen as the result. It’s not happening on standalone builds.
