I’m sure I’m missing something obvious here, but I’ve been banging my head against this for a while now.
In my UI shader I am using a global Vector to mask out an area. I do this with the help of `float d = saturate(distance(IN.worldPosition.xy, _Target.xy) * _MaskSize);` in my fragment shader.
However, for further effects I also need to know which local UV coordinates correspond to this world-space `_Target.xy` position. I.e.: I need to know, locally, where this target is. How would I convert between the two spaces here?
Thanks in advance!
EDIT: I got my specific use-case working, but would still love to hear an answer to this out of curiosity.
For something like UI, you need to know the transform from "world" space to "UV" space, which is going to be the UI object's local space with some additional scaling and positional offset to account for the UVs. The solution is to calculate this in C# and pass it to the material. But this only really works for UI elements, which are essentially guaranteed to be flat, uniformly UV'd quads or sprites. On more complex geometry you'd need this information per triangle, as the UVs are arbitrary. At that point the "solution" basically requires you to have the entire mesh's data accessible in some form to search through, and with arbitrary meshes there's not necessarily one solution, since UVs and geometry might overlap.
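A minimal sketch of that C#-side calculation, assuming the masked element is a flat, uniformly UV'd `RectTransform` whose UVs span 0..1, and that the shader reads the result from a hypothetical `_TargetUV` property (the component and property names here are illustrative, not from the original post):

```csharp
using UnityEngine;

// Sketch: convert a world-space target position into the local UV space
// of a flat, uniformly UV'd UI quad and pass it to the material.
public class TargetToUV : MonoBehaviour
{
    public RectTransform rectTransform; // the masked UI element
    public Material material;           // material using the mask shader
    public Vector3 targetWorldPos;      // same point as the shader's _Target

    void Update()
    {
        // World space -> the RectTransform's local space.
        Vector3 local = rectTransform.InverseTransformPoint(targetWorldPos);

        // Local space -> normalized UV: shift by the rect's min corner,
        // then divide by its size. Only valid if the quad's UVs run 0..1.
        Rect rect = rectTransform.rect;
        Vector2 uv = new Vector2(
            (local.x - rect.xMin) / rect.width,
            (local.y - rect.yMin) / rect.height);

        material.SetVector("_TargetUV", uv);
    }
}
```

The same idea works for a sprite or any flat quad; only the "rect" you normalize against changes.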
Basically you're describing the primary problem any 3D model painting program has to solve. Some of them handle it by punting on the problem: they bake the mesh data into 3D lookup textures (aka "voxels") that you're actually painting, and then reproject the result back onto the mesh. Others solve it directly, by being dog slow and buggy.