ShaderGraph Sample3DTexture crashes?

It seems that the Sample Texture 3D node and/or Texture3D assets crash in both visionOS builds and Play to Device… Is there a way to use 3D textures in PolySpatial?
I want to show a slice of a 3D texture on a plane.

I can’t be sure without a repro case or stack trace, but a likely culprit is that your 3D textures exceed the size limit. RealityKit doesn’t actually support 3D textures, so we simulate them using 2D textures with the slices stacked vertically (that is, we take the 3D texture data and reinterpret it as a 2D texture with the same width and a height of the original height times the depth). Because of this, we’re limited to textures where height * depth is less than the visionOS texture height limit (8192 in the simulator, but I believe it’s more on device).
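In rough pseudocode (these helpers are just illustrative, not part of the PolySpatial API), the reinterpretation and the limit check look like this:

```python
# Illustrative sketch of the vertical-stacking reinterpretation described
# above; these helpers are hypothetical, not actual PolySpatial code.

SIMULATOR_HEIGHT_LIMIT = 8192  # visionOS simulator texture height limit

def stacked_2d_size(width, height, depth):
    """2D size of a 3D texture whose slices are stacked vertically."""
    return (width, height * depth)

def fits_simulator(width, height, depth):
    """True if the stacked texture stays within the simulator height limit."""
    return height * depth <= SIMULATOR_HEIGHT_LIMIT

print(stacked_2d_size(512, 512, 37))  # (512, 18944) -- too tall
print(fits_simulator(512, 512, 37))   # False
print(fits_simulator(128, 128, 32))   # True (128 * 32 = 4096)
```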

Ok thanks for the info!
So even if my Texture3D did go through, it’s not really usable as a 3D texture, e.g. no volume rendering and no custom angled slicing, I suppose? This is a bit of a problem for our visualization application, which involves volumetric data.
For now I’ve converted the Texture3D to Texture2D slices myself and let the user scroll through them. This is OK for now.
PS: any other ideas on how to achieve some sort of volumetric rendering are welcome :slight_smile:

If it’s small enough, it should act like a normal 3D texture via the Sample Texture 3D node (that is, we emit MaterialX nodes that sample the 2D texture multiple times to do blending between slices).
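As a hand-written sketch of that blending (in Python for clarity; the actual implementation is a graph of MaterialX nodes, so names here are illustrative), assuming `sample_2d` samples the stacked 2D texture:

```python
# Illustrative sketch: sampling a "3D" texture stored as vertically stacked
# 2D slices, linearly blending between the two nearest slices.

def sample_stacked_3d(sample_2d, u, v, w, depth):
    """sample_2d(u, v) samples the stacked 2D texture; u, v, w in [0, 1]."""
    slice_pos = w * depth - 0.5          # continuous slice coordinate
    lo = max(0, min(depth - 1, int(slice_pos)))
    hi = min(depth - 1, lo + 1)
    t = max(0.0, min(1.0, slice_pos - lo))
    # Each slice occupies a 1/depth band of the stacked texture's V range.
    v_lo = (lo + v) / depth
    v_hi = (hi + v) / depth
    a = sample_2d(u, v_lo)
    b = sample_2d(u, v_hi)
    return a + (b - a) * t               # blend between adjacent slices
```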


Thanks, I managed to reduce the Texture3D size and it works with a simple slicing shader in shader graph.
But in the Unity Inspector, you get a nice volume preview when selecting a Texture3D in the Project window with the Volume option. Is there any way to get something similar on visionOS?

You’re asking how one would reproduce the volume preview in visionOS? As far as I can tell, the volume preview basically just draws a stack of camera-aligned slices (the number of slices is roughly max(height, width, depth) * quality * 2) from back to front, with each slice’s alpha being proportional to 1 / number of slices. Each slice is transformed from camera space to object space before sampling the texture.

So, to reproduce this in PolySpatial/visionOS, you’d need a mesh with however many slices you want arranged along the z axis. Your shader graph would have a vertex component that rotates that stack of slices to face the camera, and the texture coordinates would need to have that rotation applied as well. Something like that, anyway.
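As a very rough sketch of the slice-count and alpha math (parameter names like `quality` are my guesses at the editor’s behavior, not actual Unity code):

```python
# Illustrative sketch of the slice-stack parameters described above.

def volume_preview_params(width, height, depth, quality=1.0):
    """Approximate slice count and per-slice alpha for the volume preview."""
    num_slices = int(max(width, height, depth) * quality * 2)
    alpha_per_slice = 1.0 / num_slices
    return num_slices, alpha_per_slice

def slice_depths(num_slices):
    """Slice centers along the view axis, ordered back to front (far to near)."""
    return [1.0 - (i + 0.5) / num_slices for i in range(num_slices)]

n, a = volume_preview_params(64, 64, 32, quality=0.5)
print(n, a)  # prints: 64 0.015625
```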

Thanks, I managed to create something similar (although it’s not adapting to the viewing angle).

Regarding the Texture3D size limitations, I noticed that a 512x512x37 texture, for example, gives the error in Xcode “Texture too large with dimensions 512 x 18944”,
which implies that the 512x512 slices are placed next to each other horizontally?

Would it be possible for Unity to arrange the texture more efficiently to pack more into a large atlas? An 8192x8192 texture (this is the limit, right?) should be able to house 256 slices that are 512x512.

Vertically. They’re stacked in Y.

It’s possible, but it would require more runtime processing. The reason we use a vertically stacked layout is that it allows us to use the 3D (or cube map) texture data directly, without rearranging it. Aside from alignment on compressed texture blocks, the contents of a 3D or cube map texture can be reinterpreted as a 2D texture with the faces/slices stacked vertically, so we can point to the texture data and load it as-is with a height of (original height * depth), then use custom functionality in shader graphs to sample one or more faces by adjusting the V coordinate.
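To illustrate (assuming uncompressed, tightly packed pixel data; these helpers are hypothetical), here’s why the vertically stacked layout can reuse the 3D texture’s bytes directly while a grid atlas couldn’t:

```python
# Sketch of texel byte offsets in the three layouts discussed above.

def offset_3d(x, y, z, width, height, bpp):
    """Byte offset of a texel in a 3D texture's slice-major layout."""
    return ((z * height + y) * width + x) * bpp

def offset_stacked_2d(x, y, z, width, height, bpp):
    """Same texel in a 2D texture of height (height * depth): row z*height+y."""
    return ((z * height + y) * width + x) * bpp  # identical -- no copy needed

def offset_grid_atlas(x, y, z, width, height, bpp, cols, atlas_width):
    """Same texel in a cols-wide grid atlas: rows interleave across slices."""
    col, row = z % cols, z // cols
    return ((row * height + y) * atlas_width + col * width + x) * bpp
```

For any texel, the stacked-2D offset matches the 3D layout exactly, which is why the data can be loaded as-is; the grid-atlas offsets differ, so every row of every slice would have to be relocated whenever the texture changes.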

Assuming Apple doesn’t add native support for cube maps and 3D textures, we could consider adding a different option or mode that would pack the texture more efficiently at the expense of more processing when the texture changes (or even distribute the cube map/3D texture over multiple 2D textures). We’ll add this to the list of things to investigate.


Hi, is there perhaps any update on supporting Texture3Ds with a higher resolution?

No; no updates so far.

I would love to see the code for this. I’m working with volumes on Quest, but I’m stuck trying to display them correctly.

I can’t share the code that Unity uses, although it is at least accessible through the editor API via Handles.DrawTexture3DVolume. You can find various resources online about volume rendering; someone linked a few in Discussions a while back, and those may provide a useful starting point.

As a general update to this thread, Apple added support for actual 3D textures in visionOS 2 (and we have included that support in PolySpatial 2), so as of PolySpatial 2, the limits on volume texture sizes no longer apply (and the issues with wrapping should be fixed).