Previously I’d inquired about using 3D textures, and I had them working; however, mipmaps and filtering are completely broken for my use case.
To explain the problem - I need more than 16 textures in a shader. In 90% of cases I will be using no more than 6 to 12, but in roughly 10% of edge cases my terrain will be blending many more textures (24 or 30 for certain meshes).
I’ve been trying to store sets of textures (diffuse, normal, height, etc.) as slices of a 3D texture and then read them in the shader. This works, except it requires point filtering and I cannot use mipmaps.
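For context, the reason mips break here is that each mip level of a 3D texture halves the depth as well, so adjacent slices get averaged together. The best I’ve managed is sampling at the slice centre with an explicit LOD, which is only safe at mip 0 — a rough sketch (the slice count is just a hypothetical parameter):

```hlsl
// Sample one "slice" of a 3D texture used as a texture array.
// Only mip 0 is slice-safe; higher mips blend neighbouring slices.
float4 SampleSlice(sampler3D tex, float2 uv, float slice, float sliceCount)
{
    float w = (slice + 0.5) / sliceCount;    // centre of the slice along W
    return tex3Dlod(tex, float4(uv, w, 0));  // explicit LOD 0
}
```

Even with this, linear filtering along W still bleeds between slices unless the sampler is set to point filtering on that axis, which Unity’s 3D texture samplers don’t let you control independently.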
Is Unity going to have any support for Array Textures?
Alternatively, is there any way to adequately use cubemaps as texture arrays and fetch from one single face of a cubemap per call? I explored converting UVs to a direction vector (taking the positive-X face as, say, (1, x, y), with x and y being the UV coordinates), but I’ve either done it wrong or it is not possible.
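In case it helps anyone spot my mistake: if I understand the standard cubemap conventions, the UVs first need remapping from [0, 1] to [-1, 1], and each face has its own axis orientation and sign flips, so a plain (1, x, y) would fetch the wrong texels. A sketch of what I believe the mapping should be (face orientations follow the D3D cubemap layout; if pixels come out mirrored, the signs on sc/tc presumably need adjusting):

```hlsl
// Convert a face index + 2D UV into a cubemap lookup direction.
float3 UVToCubeDir(float2 uv, int face)
{
    // Remap [0,1] UVs to [-1,1] face coordinates.
    float sc = uv.x * 2.0 - 1.0;
    float tc = uv.y * 2.0 - 1.0;

    if (face == 0) return float3( 1.0, -tc, -sc); // +X
    if (face == 1) return float3(-1.0, -tc,  sc); // -X
    if (face == 2) return float3( sc,  1.0,  tc); // +Y
    if (face == 3) return float3( sc, -1.0, -tc); // -Y
    if (face == 4) return float3( sc, -tc,  1.0); // +Z
    return            float3(-sc, -tc, -1.0);     // -Z
}

// Usage: fixed4 c = texCUBE(_MyCube, UVToCubeDir(i.uv, faceIndex));
```

Even if this works, I suspect filtering near face edges would sample across into neighbouring faces, which brings back a variant of the bleeding problem.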
Essentially, I want to use more than 16 textures, but texture atlases have mipmap, filtering, and wrapping issues; 3D textures have mipmap and filtering problems; and cubemaps take a direction rather than a UV coordinate, and I cannot seem to fetch a pixel value from them that is actually correct.
The use case is a voxel-based terrain that uses vertex data/UVs to blend multiple texture sets: grass flooring, rock walls, dirt overlays onto stone, alpha-blended vines on walls with tessellation, and more. Most meshes will use only a few sets, but some cases involve blending between themed areas that could collide.
My last-resort solution would be to create texture atlases with pixels copied from the opposite side of each texture, so that wrapping is seamless even on mips 1, 2, and 3 (past that, any minor bleeding into the padding would not be noticeable), and to handle wrapping in the shader within the valid sections (excluding the copied pixels) so filtering stays correct. I’d like something cleaner than this, however. Are there any ideas?
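To make the atlas idea concrete, the in-shader wrapping I have in mind looks roughly like this. The tile rect and padding values are hypothetical material properties; the key trick is taking derivatives from the unwrapped UVs and using tex2Dgrad, so the frac() seam doesn’t produce a line of the smallest mip:

```hlsl
// Sketch of atlas sampling with in-shader wrapping. Each tile occupies
// the atlas rect [tileOffset, tileOffset + tileScale] and has `padding`
// (in atlas UV units) of duplicated border texels on every side.
float4 SampleAtlasWrapped(sampler2D atlas, float2 uv,
                          float2 tileOffset, float2 tileScale, float padding)
{
    float2 valid = tileScale - 2.0 * padding; // non-padded region of the tile

    // Derivatives from the continuous (unwrapped) UVs, scaled into
    // atlas space, so mip selection ignores the frac() discontinuity.
    float2 dx = ddx(uv) * valid;
    float2 dy = ddy(uv) * valid;

    // Wrap inside the valid region only; the copied border texels
    // supply correct filtering at the seam.
    float2 atlasUV = tileOffset + padding + frac(uv) * valid;

    return tex2Dgrad(atlas, atlasUV, dx, dy);
}
```

This still caps how many mips are usable before neighbouring tiles bleed through the padding, which is why I’d prefer a cleaner approach.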