Compute Buffer + Render Texture

Hey all!

First off, AMAZING work. Truly blown away by the amount of breadth & depth that's going into PolySpatial!

Second: I’m trying to get compute shaders + render textures working, but for the life of me am unable to…

Currently my process is: make a floating-point render texture, run a compute shader to fill it, mark it dirty, and do a basic sampling via Shader Graph. It works great in Unity, but when built, it either doesn’t show up or I get this:
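For context, the process described above might look something like the sketch below. The kernel name `CSMain`, the texture property `_MainTex`, and the specific sizes are illustrative assumptions; the dirty-marking call is, as far as I know, `PolySpatialObjectUtils.MarkDirty`, but check the PolySpatial docs for your package version.

```csharp
using UnityEngine;
using Unity.PolySpatial;

public class ComputeToRenderTexture : MonoBehaviour
{
    public ComputeShader fillShader;        // kernel "CSMain" writes into "Result"
    public Material shaderGraphMaterial;    // Shader Graph material sampling the texture

    RenderTexture rt;

    void Start()
    {
        // Floating-point render texture that the compute shader fills.
        rt = new RenderTexture(256, 256, 0, RenderTextureFormat.ARGBHalf);
        rt.enableRandomWrite = true;        // required for compute shader writes
        rt.Create();
        shaderGraphMaterial.SetTexture("_MainTex", rt);
    }

    void Update()
    {
        int kernel = fillShader.FindKernel("CSMain");
        fillShader.SetTexture(kernel, "Result", rt);
        fillShader.Dispatch(kernel, rt.width / 8, rt.height / 8, 1);

        // Tell PolySpatial the texture contents changed so they get re-transferred.
        PolySpatialObjectUtils.MarkDirty(rt);
    }
}
```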

Thread 1: Fatal error: Unable to init drawableQueue: unsupportedFormat

Does this mean compute buffers just aren’t supported? Or that floating-point textures don’t show up? Or is it something else I’m doing wrong…

Thanks a billion, y’all are the sweetest and the best :slight_smile:


Thanks!

The process you’re describing sounds correct, and should work. However, the API we use for supporting RenderTextures only supports a limited number of formats, and the only floating point format that we’ve tested is R16G16B16A16_SFLOAT (aka, RGBAHalf). We’ve recently had some apparent success with RFloat, so that might also work.
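To make the format suggestion concrete, creating the texture with Unity's `ARGBHalf` format (which corresponds to R16G16B16A16_SFloat) might look like this; `RFloat` is the other candidate mentioned above:

```csharp
using UnityEngine;

// RGBAHalf (R16G16B16A16_SFloat) is the tested floating-point format on
// visionOS; RFloat may also work. 32-bit RGBAFloat is not supported and
// produces the "unsupportedFormat" error at runtime.
var rt = new RenderTexture(512, 512, 0, RenderTextureFormat.ARGBHalf);
rt.enableRandomWrite = true; // needed if a compute shader writes into it
rt.Create();
```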

If the issue persists when using either of those formats, feel free to submit a bug report with a repro case and let us know the incident number (IN-#####), and we can take a look to see what else might be going wrong.


Sounds good! Seems like it’s probably the float texture, as I’m using R32G32B32A32_SFloat. Testing w/ 16-bit for a repro case :slight_smile:


Seems like RGBAHalf did it! The repro case for RGBAFloat is super simple: just make a RenderTexture with the RGBAFloat format, plop it on a quad, and it should break the build :stuck_out_tongue:
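A minimal script version of that repro might look like the following (the sizes and the use of `CreatePrimitive` are just illustrative; any quad with the texture assigned should do):

```csharp
using UnityEngine;

// Minimal repro: an RGBAFloat render texture displayed on a quad should
// crash the visionOS build with
// "Unable to init drawableQueue: unsupportedFormat".
public class RgbaFloatRepro : MonoBehaviour
{
    void Start()
    {
        var rt = new RenderTexture(256, 256, 0, RenderTextureFormat.ARGBFloat);
        rt.Create();

        var quad = GameObject.CreatePrimitive(PrimitiveType.Quad);
        quad.GetComponent<Renderer>().material.mainTexture = rt;
    }
}
```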


@kapolka Been cranking on this all day; thanks to you I’ve got the render texture passing through to the simulator!!! TYSM!

I am seeing that in the simulator vs. in Unity, I cannot seem to get the filtering to be point in the simulator (looks like it’s probably linear), whereas in Unity it’s point. Not ultra necessary for my use case, but I was wondering whether certain filtering modes are supported vs. not supported.

Since you mentioned that you’re using shader graphs, it’s worth pointing out that the sampler state (repeat/filter) has to be set through a sampler node in shader graphs. We don’t obtain the sampler state from the texture itself (basically, because there’s no good way to pass it as a parameter to a RealityKit ShaderGraphMaterial).
