I think I have found a bug in DX11 in Unity and would like your advice on whether this is actually a bug and how to work around it.
What I have working so far is a point cloud renderer that estimates the size of each point based on its distance to the camera: points close to the camera are rendered larger, and points further away are rendered smaller. This works perfectly in DX9 and also in OpenGL (v2 & v4). As I move around, the PSIZE value of each vertex is set and DX/GL uses this value to draw each point.
However, when I use DX11 it ignores the PSIZE value of each vertex, and all points are rendered with a size of 1 (very small). Unfortunately, another component of the project requires DX11, so I can't fall back to DX9.
Can you suggest a workaround where each point will be rendered according to its PSIZE in DX11? I need it to run FAST so I can render all of the content. Thanks in advance for any help.
Mesh Renderer Setup Code:
// Build a point-topology mesh: one index per vertex, drawn as points
mesh.vertices = points;
mesh.colors = colours;
mesh.SetIndices(indices, MeshTopology.Points, 0);
// Dummy UVs and normals so the mesh has a complete vertex layout
mesh.uv = new Vector2[pointCount];
mesh.normals = new Vector3[pointCount];
Shader Code:
Shader "Custom/PointCloudEx"
{
Properties
{
_PointSize("Point Size", Float) = 10.0
}
SubShader
{
Tags {"Queue"="Transparent" "IgnoreProjector"="True" "RenderType"="Transparent"}
Pass
{
cull Off ZWrite On Blend SrcAlpha OneMinusSrcAlpha
LOD 200
CGPROGRAM
#pragma exclude_renderers flash
//tells the rederer which function to use to process the rentering of each point (vertex) in the mesh
#pragma vertex vert
//tells the renderer which function to use to process the renderering of each pixel (fragment) in the scene
#pragma fragment frag
#include "UnityCG.cginc"
float _PointSize;
struct VertexInput
{
float4 v : POSITION;
float4 color: COLOR;
};
struct VertexOutput
{
float4 pos : SV_POSITION;
float4 col : COLOR;
float size : PSIZE;
};
VertexOutput vert(VertexInput v)
{
VertexOutput o;
o.pos = mul(UNITY_MATRIX_MVP, v.v);
o.col = v.color;
//Set the point size to be relative to the vertex's distance from the camera
o.size = (1.0/ length(WorldSpaceViewDir(v.v))) * _PointSize;
return o;
}
float4 frag(VertexOutput o) : COLOR
{
return o.col;
}
ENDCG
}
}
}
DX11 doesn't support point mode rendering anymore according to the documentation, so I'm surprised it works at all. The idea is that it has been replaced by geometry shaders, so you'll have to generate a quad at every vertex yourself using a geometry shader.
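Roughly like this, slotted into your existing pass (just a sketch, untested; the v2g/g2f struct names, the geom function, and the pixels-to-clip-space conversion are my assumptions, not anything Unity-specific): the vertex shader computes the size as before, and the geometry shader expands each point into a screen-aligned quad.
// Geometry shaders need shader model 4.0+
#pragma target 4.0
#pragma vertex vert
#pragma geometry geom
#pragma fragment frag

struct v2g
{
    float4 pos : SV_POSITION;
    float4 col : COLOR;
    float size : TEXCOORD0; // point size in pixels, computed per vertex
};

struct g2f
{
    float4 pos : SV_POSITION;
    float4 col : COLOR;
};

v2g vert(VertexInput v)
{
    v2g o;
    o.pos = mul(UNITY_MATRIX_MVP, v.v);
    o.col = v.color;
    // Same size estimate as your PSIZE version
    o.size = (1.0 / length(WorldSpaceViewDir(v.v))) * _PointSize;
    return o;
}

[maxvertexcount(4)]
void geom(point v2g p[1], inout TriangleStream<g2f> stream)
{
    // Convert a pixel size to clip-space units: NDC spans 2 units across
    // the screen, and clip coordinates are NDC scaled by w, so a half-quad
    // extent of size/2 pixels becomes size * w / screen dimension.
    float2 halfSize = p[0].size * p[0].pos.w / _ScreenParams.xy;
    g2f o;
    o.col = p[0].col;
    // Emit the four corners as a two-triangle strip
    o.pos = p[0].pos + float4(-halfSize.x, -halfSize.y, 0, 0); stream.Append(o);
    o.pos = p[0].pos + float4(-halfSize.x,  halfSize.y, 0, 0); stream.Append(o);
    o.pos = p[0].pos + float4( halfSize.x, -halfSize.y, 0, 0); stream.Append(o);
    o.pos = p[0].pos + float4( halfSize.x,  halfSize.y, 0, 0); stream.Append(o);
    stream.RestartStrip();
}

float4 frag(g2f i) : COLOR
{
    return i.col;
}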
Thanks for the response jvo3dc. Ignoring my misgivings about DX dropping this really fast and useful feature (boo Microsoft), can you give more detail on a workaround? I've gone and looked into geometry shaders myself and my latest working prototype is below.
However, when I use this shader it makes Unity 5.3 run REALLY slowly (2 fps on my machine), and when I stop the running scenario Unity crashes (though it is solid until I press stop).
I seem to have found the issue: if I remove the line labelled "TROUBLE LINE" it runs really fast and doesn't crash, but I need that line to scale the points. Help?
I've also found lots of references to point rendering in the DX11 documentation, so I wouldn't say they've dropped it altogether. One of Microsoft's DX11 render modes in the D3D11_PRIMITIVE_TOPOLOGY enum is POINTLIST.
Yes, it is a bit confusing. A similar discussion is going on here. So, let's say that as far as I've heard, they dropped point mode rendering. But that does not seem 100% true. It seems they just dropped point sprites, which is probably exactly why PSIZE doesn't work anymore.
When it comes to your shader, you should probably try to just generate the quads in the geometry shader: compute the size in the vertex shader, pass it along to the geometry shader, and move the generated vertices to the right positions there.
I haven't tried to replicate it, but 2 fps does sound really slow. You could also try not using WorldSpaceViewDir and just directly take the distance between the world space position of the vertex and the camera. So something like:
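(A sketch; _Object2World is the built-in object-to-world matrix in Unity 5.3, renamed unity_ObjectToWorld in later versions.)
// Distance from the vertex (in world space) to the camera position
float3 worldPos = mul(_Object2World, v.v).xyz;
o.size = (1.0 / distance(worldPos, _WorldSpaceCameraPos)) * _PointSize;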