I am in the process of converting parts of our game to ECS. For this I need to sample the texture of an object, and in regular MonoBehaviours I was using:
Texture2D.GetPixelBilinear()
GetPixelBilinear() requires UV coordinates, which I get from a raycast against the object’s mesh collider.
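For reference, here is roughly what the MonoBehaviour version looks like (everything outside the Unity API is a placeholder name):

```csharp
using UnityEngine;

public class TextureSampler : MonoBehaviour
{
    [SerializeField] Texture2D texture; // texture to sample (placeholder field)

    // Classic approach: raycast against a MeshCollider, read the hit UV,
    // and sample the texture bilinearly at that UV.
    public Color SampleUnderRay(Ray ray)
    {
        if (Physics.Raycast(ray, out RaycastHit hit))
        {
            // textureCoord is only populated for MeshColliders.
            Vector2 uv = hit.textureCoord;
            return texture.GetPixelBilinear(uv.x, uv.y);
        }
        return Color.clear;
    }
}
```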
I’ve managed to convert the raycasting to an ECS version; however, the ECS version of RaycastHit does not contain .textureCoord like the MonoBehaviour version. I’m quite stuck now. Is there any workaround, and what’s my best option?
Unfortunately, at the moment Unity Physics doesn’t have a mapping back to the graphical representation once the physics colliders are created.
Not sure about your bigger use case, but it might be useful to look at the DOTSSample project. I don’t know the details that well, but it does add a decal to hit geometry when you fire.
The ActiveUpdate.UpdateJob.EnterFiringPhase function saves the hit position from a Unity Physics cast query. This hit position is then passed to the Robot_Weapon_A_Impact_Generic VisualEffect in VFXSystem.OnUpdate.
Of course, this only helps if you can push your logic onto the graphics side and work with the world position.
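If the world-space hit position is all you need, a Unity Physics cast query gives you that directly. A rough sketch (how you obtain the CollisionWorld depends on the Entities/Physics version you're on, so treat everything around the cast itself as an assumption):

```csharp
using Unity.Mathematics;
using Unity.Physics;

public static class WorldHitQuery
{
    // Cast a ray with Unity Physics and return the world-space hit position.
    // 'collisionWorld' would come from your PhysicsWorld; how you fetch it
    // varies between Entities versions.
    public static bool TryGetWorldHit(in CollisionWorld collisionWorld,
                                      float3 origin, float3 direction, float maxDistance,
                                      out float3 hitPosition)
    {
        var input = new RaycastInput
        {
            Start = origin,
            End = origin + direction * maxDistance,
            Filter = CollisionFilter.Default
        };

        if (collisionWorld.CastRay(input, out Unity.Physics.RaycastHit hit))
        {
            // This hit exposes Position, SurfaceNormal, RigidBodyIndex, etc.,
            // but nothing equivalent to textureCoord.
            hitPosition = hit.Position;
            return true;
        }

        hitPosition = default;
        return false;
    }
}
```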
I’m assuming textureCoord relies on the mesh UVs. So the solution here seems fairly straightforward, even if it’s a bit of extra work.
At conversion time, extract the vertices and UVs of the mesh and map them to the collider, so that on a hit you can iterate the vertices and find the closest one. A simple main-thread-only version would be a custom struct holding your float3/float2, stored in a Dictionary keyed by the collider’s hash code with a List of that struct as the value. A job-friendly version could use a NativeMultiHashMap instead. A rough sketch of the main-thread version follows.
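Something like this, as a minimal main-thread sketch (all type and method names here are made up, and it assumes the hit position has already been transformed into the same space as the cached vertices):

```csharp
using System.Collections.Generic;
using Unity.Mathematics;
using UnityEngine;

// One entry per mesh vertex, cached at conversion time.
struct VertexUv
{
    public float3 Position; // vertex position (same space as the hit you pass in)
    public float2 Uv;       // matching mesh UV
}

static class MeshUvLookup
{
    // Keyed by whatever identifies the collider on the ECS side;
    // a plain hash code here, purely for illustration.
    static readonly Dictionary<int, List<VertexUv>> Cache = new();

    // Call once per mesh during conversion/baking.
    public static void Register(int colliderKey, Mesh mesh)
    {
        var verts = mesh.vertices;
        var uvs = mesh.uv;
        var list = new List<VertexUv>(verts.Length);
        for (int i = 0; i < verts.Length; i++)
            list.Add(new VertexUv { Position = verts[i], Uv = uvs[i] });
        Cache[colliderKey] = list;
    }

    // On hit: approximate the UV with the UV of the nearest cached vertex.
    public static float2 ClosestUv(int colliderKey, float3 hitPos)
    {
        var list = Cache[colliderKey];
        float bestDistSq = float.MaxValue;
        float2 uv = default;
        foreach (var v in list)
        {
            float d = math.distancesq(v.Position, hitPos);
            if (d < bestDistSq) { bestDistSq = d; uv = v.Uv; }
        }
        return uv;
    }
}
```

The nearest-vertex UV is only an approximation of what textureCoord gives you (which interpolates across the hit triangle), but for dense meshes it may be close enough; the NativeMultiHashMap variant would just be the job-safe equivalent of the same lookup.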
My bigger use case would be to use UI Toolkit’s RenderTexture to create a world-space UI, which I would control using raycasts. For that to work, I need the textureCoord from the RaycastHit so I can pass it to some code that simulates a mouse at that position on the UI.
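Once I have a UV, mapping it to a position on the render texture should just be a scale and a Y-flip, something like the sketch below (this assumes the UI is rendered to a RenderTexture of known size; how the pointer is actually simulated depends on the UI Toolkit setup and isn't shown here):

```csharp
using UnityEngine;

public static class UvToPanel
{
    // Convert a mesh UV (0..1, origin bottom-left) into pixel coordinates on
    // the UI render texture (origin top-left), ready to feed into whatever
    // code fakes the pointer position for the panel.
    public static Vector2 ToPanelPixels(Vector2 uv, RenderTexture uiTexture)
    {
        return new Vector2(
            uv.x * uiTexture.width,
            (1f - uv.y) * uiTexture.height); // flip Y: UVs grow upward, UI coordinates grow downward
    }
}
```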