Hello, I’m trying to unwrap a mesh according to its UV coordinates from a vertex shader and display it in screen space. I can’t quite figure out why what I’m doing isn’t working. When I apply it to a sphere I just see nothing in the viewport, and on occasion it will simply crash Unity. Any ideas?
I was hoping I could use this to render a projected texture (either a Unity Projector, or my own custom texture projection matrix) directly onto a texture map. I’ve still got a ways to go with that, though. The general idea is to render the model unwrapped into a RenderTexture, using this vertex shader as a replacement shader.
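For reference, here is a minimal sketch of the kind of unwrap shader I mean (the shader name and the D3D-style Y flip are my own additions, and I haven’t verified this is exactly what the linked solution does):

```hlsl
Shader "Custom/UVUnwrap"
{
    SubShader
    {
        Pass
        {
            Cull Off // unwrapped triangles can end up with either winding
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            struct v2f { float4 pos : SV_POSITION; };

            v2f vert (appdata_base v)
            {
                v2f o;
                // Map UVs from [0,1] into clip space [-1,1]. w must be 1 so
                // the position survives the perspective divide unchanged.
                o.pos = float4(v.texcoord.xy * 2.0 - 1.0, 0.0, 1.0);
                #if UNITY_UV_STARTS_AT_TOP
                o.pos.y = -o.pos.y; // flip when rendering into a RenderTexture on D3D-like APIs
                #endif
                return o;
            }

            fixed4 frag (v2f i) : SV_Target { return fixed4(1, 0, 0, 1); }
            ENDCG
        }
    }
}
```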
So why exactly does multiplying by UNITY_MATRIX_IT_MV make this work? I was under the impression that UVs are stored in the 0–1 range and that we need to return values in clip space, which is -1 to 1. Why do I need to multiply by this matrix?
And for the record, this only works when I have my camera rotated at a precise angle, which is fine, but is there a way to make it always face the camera? Just to satisfy my curiosity. The solution you posted should work fine for my needs.
It seems that if I take what I originally had and just make sure to set the w component to that of the vertex, then it fills my screen, which is what I was hoping for. So that makes a bit more sense to me now.
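Concretely, the change I mean is just this (a sketch; the struct and field names are the standard appdata_base ones, which may differ from what you have):

```hlsl
// Broken: w left at 0, so the perspective divide by w blows up
// and nothing sensible reaches the screen:
// o.pos = float4(v.texcoord.xy * 2.0 - 1.0, 0.0, 0.0);

// Fixed: carry over the vertex's w (1.0 for mesh positions), so after
// the divide the xy coordinates stay in [-1,1] and fill the viewport:
o.pos = float4(v.texcoord.xy * 2.0 - 1.0, 0.0, v.vertex.w);
```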
Hi, I was just looking into the same thing and found this:
When using the shader included in that link, it unwraps the mesh and shows it in screen space.
What I am missing is how I would project something (a shadow, for example) onto this mesh and have it displayed the same way,
so that I can grab the projection (flattened to the UV layout) and use it on the mesh as a texture.
Because right now my projection is shown around the invisible object whose mesh is displayed stretched across the screen.
The image shows the mesh in its unwrapped form (red) and an invisible object in the scene view with the shadow of a box projected onto it (via the projected character shadows script on the wiki).
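Not sure if this is the right approach, but the way I’d expect the bake to work is: keep the unwrapped UV position as output, but compute the texture lookup from the world-space vertex, so the projection lands where it would on the actual model. A rough, untested sketch — `_ProjTex` and `_ProjectorMatrix` are names I made up, with the matrix (projector view-projection, remapped to [0,1]) set from a script:

```hlsl
Shader "Custom/BakeProjection"
{
    Properties { _ProjTex ("Projected Texture", 2D) = "white" {} }
    SubShader
    {
        Pass
        {
            Cull Off
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _ProjTex;
            float4x4 _ProjectorMatrix; // world -> projector clip, remapped to [0,1]

            struct v2f
            {
                float4 pos    : SV_POSITION;
                float4 projUV : TEXCOORD0;
            };

            v2f vert (appdata_base v)
            {
                v2f o;
                // Output position comes from the UV layout, not the model...
                o.pos = float4(v.texcoord.xy * 2.0 - 1.0, 0.0, 1.0);
                // ...but the lookup coordinates come from the world-space
                // vertex, so the projected image follows the surface.
                float4 worldPos = mul(unity_ObjectToWorld, v.vertex);
                o.projUV = mul(_ProjectorMatrix, worldPos);
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                return tex2Dproj(_ProjTex, i.projUV);
            }
            ENDCG
        }
    }
}
```

Rendering the mesh with this shader into a RenderTexture (e.g. via a replacement shader) should then give a texture in the mesh’s own UV layout containing the projection, which you could apply back to the mesh as a regular texture.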