I’m just getting into Unity; my experience lies mainly with Blender.
I would like to update UV coordinates in realtime, projected from a camera. In Blender you can do this (not in realtime) with ‘Project from view’. To get something similar in realtime in Blender, you can change the map input to use screen coordinates.
Assuming you want this in realtime, the fastest (in machine time) way is with a shader (on the Material). The shader is already computing screen coords, so it’s not too much extra work to replace the UV coords with those numbers.
The Unity shader drop-down doesn’t have a “UV from view” shader, but various shader snippets are floating around. If you can’t find one mostly written, be aware that learning shaders isn’t trivial – probably not worth it for just this one effect.
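As a starting point, here’s a minimal sketch of what such a shader could look like (the shader name and texture property are made up; `ComputeScreenPos` and `UnityObjectToClipPos` are standard helpers from `UnityCG.cginc`). It samples the texture using screen coordinates instead of the mesh’s UVs:

```
Shader "Custom/ScreenSpaceUV" {
    Properties {
        _MainTex ("Texture", 2D) = "white" {}
    }
    SubShader {
        Pass {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;

            struct v2f {
                float4 pos : SV_POSITION;
                float4 screenPos : TEXCOORD0;
            };

            v2f vert (appdata_base v) {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                // Screen-space position, carried to the fragment stage
                o.screenPos = ComputeScreenPos(o.pos);
                return o;
            }

            fixed4 frag (v2f i) : SV_Target {
                // Perspective divide gives 0..1 screen coords -> use as UVs
                float2 uv = i.screenPos.xy / i.screenPos.w;
                return tex2D(_MainTex, uv);
            }
            ENDCG
        }
    }
}
```

Drop that on a Material and the texture will “stick” to the screen rather than the mesh, which is exactly the project-from-view look.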
It is also possible to modify mesh.uv (see the Mesh class) each frame. One obvious problem is that multiple instances of the mesh will want different camera views, so you’d probably need to make per-instance copies. Changing mesh.uv every frame is “slow” in the sense that you may lose a whole frame per second(?)
You’d just need to pull out the corresponding xyz vert, apply the object’s transform to get world space, then Camera.WorldToViewportPoint to get screen space. Then scale that however you like and set the UV. That seems like more work, but there may be more (and easier to read) examples of mesh manipulation than of shaders. Plus, playing with model/world/camera/screen coords is good practice if you do want to learn shaders later.
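Those steps could be sketched in a script roughly like this (the component name is made up; `MeshFilter.mesh`, `Transform.TransformPoint`, and `Camera.WorldToViewportPoint` are real Unity API calls):

```
using UnityEngine;

// Sketch: re-project this object's UVs from the main camera each frame.
[RequireComponent(typeof(MeshFilter))]
public class ProjectUVFromCamera : MonoBehaviour
{
    Mesh mesh;
    Vector3[] verts;
    Vector2[] uvs;

    void Start()
    {
        // .mesh (not .sharedMesh) gives a per-instance copy,
        // which sidesteps the multiple-instances problem above
        mesh = GetComponent<MeshFilter>().mesh;
        verts = mesh.vertices;
        uvs = new Vector2[verts.Length];
    }

    void Update()
    {
        Camera cam = Camera.main;
        for (int i = 0; i < verts.Length; i++)
        {
            // model -> world via the object's transform
            Vector3 world = transform.TransformPoint(verts[i]);
            // world -> viewport space (0..1 across the screen)
            Vector3 vp = cam.WorldToViewportPoint(world);
            uvs[i] = new Vector2(vp.x, vp.y);
        }
        mesh.uv = uvs; // the "slow" per-frame upload mentioned above
    }
}
```

Viewport coordinates already run 0..1 across the screen, so they map straight onto UVs with no extra scaling unless you want tiling.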