Greetings,
I would like some guidance on rendering transparent meshes. I am trying to display Gaussian Splatting data in Unity. The method I chose is to generate a quad at each splat position and then apply a Gaussian splat mask to it. I chose this approach because the end product has to render in WebGL, which does not support compute shaders and lacks a number of other features.
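Roughly, the mask evaluates a 2D Gaussian falloff from the quad's UVs. Written out in C# just for illustration (the actual evaluation happens per pixel in the shader graph, and 'sharpness' is a placeholder constant, not a value from my project):

using UnityEngine;

static class SplatMask
{
    // Per-pixel weight of the Gaussian mask; 'sharpness' is a placeholder
    // constant for this sketch.
    public static float GaussianWeight(Vector2 uv, float sharpness)
    {
        Vector2 d = (uv - new Vector2(0.5f, 0.5f)) * 2f;  // remap UV to [-1, 1]
        return Mathf.Exp(-sharpness * Vector2.Dot(d, d)); // ~1 at centre, ~0 at edges
    }
}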
Currently I can display the data from one side at almost acceptable quality:
However, when I rotate the camera I get this result:
The mesh is generated out of a lot of quads, and if I understand correctly, transparent meshes are rendered in the order the triangles were provided (alpha blending is order-dependent, so the result changes with draw order). If I flip the array of Gaussian splats, I can see the other side of the shoe instead.
What I want to achieve is rendering that behaves like opaque rendering, i.e. depends on position in world space:
This is what my simple shader graph looks like:
Enabling depth writing gives this result:
This is the code where I pass the data to the shader and create the mesh:
for (int i = 0; i < positions.Count; i++)
{
    Vector3 position = positions[i];
    Color color = colors[i];
    Vector3 scale = scales[i];
    float opacity = opacities[i];
    Quaternion rotation = rotations[i];

    Vector3[] quadVertices = new Vector3[]
    {
        new Vector3(-quadSize * scale.x, -quadSize * scale.y, -quadSize * scale.z), // Bottom-left
        new Vector3(-quadSize * scale.x,  quadSize * scale.y,  quadSize * scale.z), // Top-left
        new Vector3( quadSize * scale.x,  quadSize * scale.y,  quadSize * scale.z), // Top-right
        new Vector3( quadSize * scale.x, -quadSize * scale.y, -quadSize * scale.z)  // Bottom-right
    };

    // Apply rotation to each vertex, then translate to the splat position
    for (int j = 0; j < quadVertices.Length; j++)
    {
        Vector3 rotatedVertex = rotation * quadVertices[j]; // Rotate around origin
        vertices.Add(position + rotatedVertex);             // Translate to position
    }

    // UVs for the Gaussian texture
    uvs.Add(new Vector2(0, 0)); // Bottom-left
    uvs.Add(new Vector2(0, 1)); // Top-left
    uvs.Add(new Vector2(1, 1)); // Top-right
    uvs.Add(new Vector2(1, 0)); // Bottom-right

    // Add color (with opacity in the alpha channel) and scale for each vertex
    for (int j = 0; j < 4; j++)
    {
        meshColors.Add(new Color(color.r, color.g, color.b, opacity));
        meshScales.Add(scale);
    }

    // Indices for the two triangles of this quad
    int baseIndex = i * 4;
    indices.Add(baseIndex);
    indices.Add(baseIndex + 1);
    indices.Add(baseIndex + 2);
    indices.Add(baseIndex);
    indices.Add(baseIndex + 2);
    indices.Add(baseIndex + 3);
}

// Assign data to the mesh
mesh.indexFormat = UnityEngine.Rendering.IndexFormat.UInt32; // allow more than 65535 vertices
mesh.SetVertices(vertices);
mesh.SetUVs(0, uvs);
mesh.SetColors(meshColors);
mesh.SetIndices(indices, MeshTopology.Triangles, 0);

// Recalculating normals and tangents makes no difference
mesh.RecalculateNormals();
mesh.RecalculateTangents();

meshFilter.mesh = mesh;
MeshRenderer meshRenderer = GetComponent<MeshRenderer>();
meshRenderer.material = splatMaterial;
}
So in short, I want the transparent mesh to be rendered like an opaque one, depending on world-space position rather than on the order the vertices were provided in.
I tried rearranging the elements by distance to the camera every Update, but that causes a dramatic performance drop and the quality was still not great. I would like to know whether this is even possible to achieve.
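For clarity, the approach I tried looks roughly like the following minimal sketch (class and field names such as SplatDepthSort and splatPositions are just illustrative): compute each splat's view-space depth, sort the draw order back to front, and rewrite only the index buffer so the vertex data never moves.

using System;
using UnityEngine;

public class SplatDepthSort : MonoBehaviour
{
    public Camera targetCamera;   // camera to sort against
    Mesh mesh;
    Vector3[] splatPositions;     // one centre per quad, stored at build time
    float[] depths;               // view-space depth per splat, reused each frame
    int[] order;                  // splat draw order, reused each frame
    int[] sortedIndices;          // rebuilt triangle index buffer

    void Awake()
    {
        mesh = GetComponent<MeshFilter>().mesh;
        // splatPositions would be filled from the same 'positions' list
        // used when the mesh was generated.
    }

    void LateUpdate()
    {
        int n = splatPositions.Length;
        if (order == null)
        {
            depths = new float[n];
            order = new int[n];
            sortedIndices = new int[n * 6];
        }

        Matrix4x4 worldToCam = targetCamera.worldToCameraMatrix * transform.localToWorldMatrix;
        for (int i = 0; i < n; i++)
        {
            order[i] = i;
            // Unity's camera space looks down -Z, so farther splats have smaller z.
            depths[i] = worldToCam.MultiplyPoint3x4(splatPositions[i]).z;
        }

        // Ascending sort puts the most negative z (farthest) first: back to front.
        Array.Sort(depths, order);

        // Re-emit the two triangles of each quad in sorted order.
        for (int i = 0; i < n; i++)
        {
            int baseIndex = order[i] * 4; // 4 vertices per quad, as in the build code
            int o = i * 6;
            sortedIndices[o]     = baseIndex;
            sortedIndices[o + 1] = baseIndex + 1;
            sortedIndices[o + 2] = baseIndex + 2;
            sortedIndices[o + 3] = baseIndex;
            sortedIndices[o + 4] = baseIndex + 2;
            sortedIndices[o + 5] = baseIndex + 3;
        }
        mesh.SetIndices(sortedIndices, MeshTopology.Triangles, 0);
    }
}

Even in this form, the Array.Sort call and the index rewrite dominate the frame time once the splat count gets large, which is the performance drop I mentioned.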
I have already investigated similar projects, but while they succeed, I cannot find a way to apply their logic to mine, mostly because they rely on compute shaders.
I also have a version with a hand-written shader, whose quality is higher, but it still has the same problem:
BlazeBin Basic - rnetevgmwbjw <- code
2025/02/04 Edit:
I tested several approaches, but with no good results yet:
- I tried Dithered Transparency. While it renders both sides, the quality drop is too big to consider it a good approach (a sketch of the idea follows this list).
- I tried sorting by distance to the camera every Update, but that is too costly in performance because there are a lot of data points.
- I double-checked whether the normal vectors point in the correct direction. They looked correct as far as I can tell, so it is indeed a problem of transparent draw order.
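To show what I mean by Dithered Transparency above: instead of alpha blending, each pixel is kept or discarded by comparing its alpha against a screen-space threshold pattern, so depth writing works as it does for opaque geometry. A minimal C# sketch of the idea (in the actual shader this comparison feeds clip(); the 4x4 Bayer matrix is the standard one):

using UnityEngine;

static class DitheredAlpha
{
    // Standard 4x4 Bayer thresholds, normalised to [0, 1).
    static readonly float[,] bayer4 =
    {
        {  0/16f,  8/16f,  2/16f, 10/16f },
        { 12/16f,  4/16f, 14/16f,  6/16f },
        {  3/16f, 11/16f,  1/16f,  9/16f },
        { 15/16f,  7/16f, 13/16f,  5/16f },
    };

    // Returns true if the pixel survives; in the shader a failed test is discarded.
    public static bool Keep(int pixelX, int pixelY, float alpha)
    {
        return alpha > bayer4[pixelY & 3, pixelX & 3];
    }
}

Because every surviving pixel is fully opaque, draw order stops mattering, but the splats lose their smooth falloff, which is the quality drop I mentioned.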
Any opinions, suggestions, or tips are very welcome!
Best regards,
Aurimas