Hi, I’m trying to implement a VR finger-painting board: a TrailRenderer follows the user’s hand, and its trail gets rendered into a RenderTexture displayed on a plane.
It basically works, but the rendered trail comes out oddly distorted compared to the original.
Code here:
private void RenderMesh(TrailRenderer trail, Material material)
{
    // Bake the trail's current geometry into a standalone mesh.
    var mesh = new Mesh();
    trail.BakeMesh(mesh);

    var ttr = trail.transform;

    // "Camera" pose: sit at the baked mesh's centre, look along the trail
    // transform's forward axis, with -right as the up vector.
    var lookMatrix = Matrix4x4.LookAt(mesh.bounds.center, mesh.bounds.center + ttr.forward, -ttr.right);

    // Step the camera back one unit along -Z; no rotation or scale.
    var unRotate = Quaternion.identity;
    var offsetMatrix = Matrix4x4.TRS(Vector3.back, unRotate, Vector3.one);
    var viewMatrix = offsetMatrix * lookMatrix.inverse;

    // 120° vertical FOV, 0.5 aspect ratio, near plane 1, far plane 10.
    var projMatrix = Matrix4x4.Perspective(120f, 0.5f, 1f, 10f);

    // Model matrix: offset the mesh by its centre relative to the canvas plane.
    var transMatrix = Matrix4x4.TRS(mesh.bounds.center - this.transform.position, Quaternion.identity, Vector3.one);

    _com.SetRenderTarget(canvasTexture);
    _com.SetViewProjectionMatrices(viewMatrix, projMatrix);
    _com.DrawMesh(mesh, transMatrix, material, 0, 0);
}
public void OnWillRenderObject()
{
    // Clear the canvas once when requested, then flush the queued draw commands.
    if (wantClear)
    {
        wantClear = false;
        _com.SetRenderTarget(canvasTexture);
        _com.ClearRenderTarget(true, true, Color.clear);
    }
    Graphics.ExecuteCommandBuffer(_com);
    _com.Clear();
}
The trails are drawn in local space, parented to the plane that displays the RenderTexture.
The attached image shows the result: the large line is the user's input, and the small copy is the render.
What I've come up with so far has been mostly trial and error. I started from the Unity documentation, but I haven't found a clear explanation of what each of these matrices is supposed to represent. Any thoughts are much appreciated!
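For reference, here's roughly the kind of setup I'd have expected to work instead, given that the trail lives in the plane's local space: an orthographic "camera" looking straight down the plane's forward axis, so distance from the plane never scales the trail. This is an untested sketch — RenderMeshOrtho is my own name, the ±0.5 ortho bounds assume the canvas covers about 1×1 units in the plane's local XY, and I'm not sure it handles Unity's view-matrix Z convention correctly:

```csharp
private void RenderMeshOrtho(TrailRenderer trail, Material material)
{
    var mesh = new Mesh();
    trail.BakeMesh(mesh);

    // View matrix: inverse of a camera pose one unit in front of the canvas,
    // oriented the same way as the canvas itself.
    var camPose = Matrix4x4.TRS(
        transform.position - transform.forward, // 1 unit in front of the plane
        transform.rotation,
        Vector3.one);
    var viewMatrix = camPose.inverse;

    // Orthographic projection: left/right/bottom/top in view-space units,
    // so the mapping from the plane's XY onto the RenderTexture stays uniform.
    var projMatrix = Matrix4x4.Ortho(-0.5f, 0.5f, -0.5f, 0.5f, 0.1f, 10f);

    _com.SetRenderTarget(canvasTexture);
    _com.SetViewProjectionMatrices(viewMatrix, projMatrix);
    _com.DrawMesh(mesh, Matrix4x4.identity, material, 0, 0);
}
```

Does that direction make sense, or am I misunderstanding what the view matrix is supposed to represent here?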