I’d like to be able to take a sprite and set the four corners of its bounds however I like — or at least, apply an arbitrary (perhaps non-affine, e.g. perspective) transformation matrix to it. For example:
If I could get at the 3D mesh that probably underlies the sprite, I could just move the vertices around. But I don’t think that’s possible (is it?).
So maybe I have to use MeshRenderer instead of SpriteRenderer. But then I’m not sure how to retain the features of SpriteRenderer that I need, particularly sortingLayerID and sortingOrder. I guess I can probably use a MaterialPropertyBlock to set the texture and color (tint) on a per-object basis at runtime, substituting for SpriteRenderer.sprite and .color.
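Roughly what I have in mind for that part, as a sketch (untested; "_MainTex" and "_Color" are just the usual property names in simple sprite/unlit shaders, so adjust to whatever the actual shader uses):

```csharp
using UnityEngine;

// Feed a sprite's texture and a tint to a MeshRenderer via MaterialPropertyBlock,
// so each object can look different without instantiating extra materials.
[RequireComponent(typeof(MeshRenderer))]
public class SpriteLikeAppearance : MonoBehaviour
{
    public Sprite sprite;
    public Color tint = Color.white;

    void Start()
    {
        var block = new MaterialPropertyBlock();
        var rend = GetComponent<MeshRenderer>();
        rend.GetPropertyBlock(block);                  // keep any values already set
        block.SetTexture("_MainTex", sprite.texture);  // property names assumed
        block.SetColor("_Color", tint);
        rend.SetPropertyBlock(block);
    }
}
```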
So I suppose my question comes down to: how do I make a MeshRenderer sort properly (honoring sortingLayerID and sortingOrder) with others, and even with sprites in the same layer? And is there any easier way to accomplish such stretching of a sprite?
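As far as I can tell, Renderer (the base class of MeshRenderer) already exposes sortingLayerID and sortingOrder from script; they just aren't shown in the MeshRenderer inspector. A sketch of setting them, assuming the material also renders in the transparent queue the way sprites do:

```csharp
using UnityEngine;

// Make a MeshRenderer participate in the same sorting as SpriteRenderers.
public class MatchSpriteSorting : MonoBehaviour
{
    public string sortingLayerName = "Default";
    public int orderInLayer = 0;

    void Start()
    {
        var rend = GetComponent<MeshRenderer>();
        rend.sortingLayerID = SortingLayer.NameToID(sortingLayerName);
        rend.sortingOrder = orderInLayer;
    }
}
```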
My goal is to create something like Doom, where squares are drawn in perspective to give a 3D appearance of walls, floors, etc… but to do this all in a flat plane with an orthographic camera. As you can see above, I don’t quite get the right effect when deforming a quad mesh, because the interpolation across the triangles isn’t uniform. If the triangles had cut across the quad the other way, I’d have gotten a different (and still wrong) result. But I don’t want the triangles to be visible at all.
I could hide this problem by using a denser mesh — the more triangles I use, the less noticeable the effect will be. But that seems heavy-handed. Is there some way with a custom shader, or transformation matrix, or some other trick, that I can get a renderer to do a more uniform interpolation across a quadrilateral?
EDIT: I think the answer is to use homogeneous (4-element) UV coordinates and a custom shader, as somewhat described here: https://answers.unity.com/questions/1403638. @Bunny83, thanks for that answer, and don’t be surprised if I end up needing help…
That’s not 3D — it’s a flat mesh, with its corners being set to specific screen positions by code, and viewed with an orthographic camera. Thanks to the homogeneous UV coordinates trick pointed out by @Bunny83, the texture stays nice and smooth across the surface of the 2-triangle mesh even as the shape is deformed.
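For anyone curious about the C# side, here’s roughly how I’m building the mesh (a sketch, not my exact code). Mesh.SetUVs with a List&lt;Vector4&gt; stores full 4-component UVs; the matching custom shader then has to declare a float4 texcoord and sample with uv.xy / uv.w in the fragment program (not shown here). The per-corner weights q are the part I get into in the edit further down.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Build a 2-triangle quad with arbitrary corner positions and homogeneous UVs.
public class DeformedQuad : MonoBehaviour
{
    // Corners in order: bottom-left, bottom-right, top-right, top-left.
    public Vector2[] corners = new Vector2[4];

    public void Rebuild(Mesh mesh, float[] q)   // q = homogeneous weight per corner
    {
        Vector2[] baseUV = { new Vector2(0, 0), new Vector2(1, 0),
                             new Vector2(1, 1), new Vector2(0, 1) };
        var verts = new Vector3[4];
        var uvs = new List<Vector4>(4);
        for (int i = 0; i < 4; i++)
        {
            verts[i] = corners[i];
            // Scale u, v and w by the weight; the shader divides by w to undo it,
            // which makes the interpolation projective instead of affine.
            uvs.Add(new Vector4(baseUV[i].x * q[i], baseUV[i].y * q[i], 0, q[i]));
        }
        mesh.Clear();
        mesh.vertices = verts;
        mesh.SetUVs(0, uvs);
        // Wound to face the default 2D camera (which looks along +Z).
        mesh.triangles = new[] { 0, 3, 2, 0, 2, 1 };
        mesh.RecalculateBounds();
    }
}
```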
Well, nuts. Just as I go to ship this, I find that while it usually works, at certain corner positions it does not:
Above you can see what happens as I drag that rightmost corner up and down. We seem to be hitting some sort of singularity in the math as that corner passes the height of the top-left corner. Note that the triangle stays fairly “fat” (far from degenerate) the whole time. And yes, this is using homogeneous UV coordinates as above; it’s only certain shapes that trigger this (but they do come up in real perspective projections sometimes).
Anybody have any ideas what might be causing this, and how to fix it?
EDIT: Never mind, it was my bug. (I was using the horizontal and vertical differences in coordinates, rather than actual edge lengths, to construct my homogeneous UVs.)
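In case anyone else trips over the same thing: the weights have to come from actual Euclidean distances. A sketch of what I understand to be the standard diagonal-intersection formulation (the helper name is just for illustration; corner order matches the quad sketch above):

```csharp
using UnityEngine;

public static class QuadUVWeights
{
    // Corners in order: bottom-left, bottom-right, top-right, top-left.
    public static float[] Compute(Vector2[] p)
    {
        // Intersect the diagonal p0->p2 with the diagonal p1->p3.
        Vector2 d1 = p[2] - p[0];
        Vector2 d2 = p[3] - p[1];
        Vector2 b  = p[1] - p[0];
        float cross = d1.x * d2.y - d1.y * d2.x;      // zero only if diagonals are parallel
        float t = (b.x * d2.y - b.y * d2.x) / cross;  // parameter along the first diagonal
        Vector2 center = p[0] + t * d1;

        // q_i = (d_i + d_opposite) / d_opposite, using real distances to the intersection.
        var q = new float[4];
        for (int i = 0; i < 4; i++)
        {
            float dNear = Vector2.Distance(p[i], center);
            float dFar  = Vector2.Distance(p[(i + 2) % 4], center);
            q[i] = (dNear + dFar) / dFar;
        }
        return q;
    }
}
```

Feeding the result into the Rebuild sketch above gives smooth interpolation across both triangles, with no seam along the diagonal.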