I’m trying to get the HoverEffect working in VR and I can’t make it work.
I’m using URP. I add the PolySpatialHoverEffect component to the GameObject that has the renderer, I add a collider, and nothing happens.
I’m able to use the VisionOS pointer for interaction, but without the HoverEffect it isn’t very useful.
Is there any way to make it work in a fully immersive app?
I do not believe the PolySpatialHoverEffect works in VR fully immersive apps. It relies on PolySpatial’s architecture to puppet RealityKit’s hover effect, and PolySpatial is not used in VR. You may have to use skeletal hand data and a visible line originating from your hands to “aim”.
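To illustrate the hand-ray approach suggested above, here is a minimal sketch for a fully immersive app. It assumes the XR Hands package (com.unity.xr.hands) is installed and a LineRenderer is assigned; the component name and the choice of joint are illustrative, not an official pattern:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// Illustrative component: casts a ray from the right index-finger
// knuckle along the joint's forward direction and draws it with a
// LineRenderer so the user can see what they are aiming at.
public class HandRayPointer : MonoBehaviour
{
    public LineRenderer line;        // assign in the Inspector (2 positions)
    public float maxDistance = 10f;

    XRHandSubsystem m_Hands;

    void Update()
    {
        if (m_Hands == null)
        {
            // Grab the running hand subsystem, if any.
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count == 0) return;
            m_Hands = subsystems[0];
        }

        var hand = m_Hands.rightHand;
        if (!hand.isTracked) return;

        // Aim from the index proximal joint; tune joint/direction per your rig.
        if (hand.GetJoint(XRHandJointID.IndexProximal).TryGetPose(out Pose pose))
        {
            var origin = pose.position;
            var dir = pose.forward;  // rough approximation of pointing direction
            var end = origin + dir * maxDistance;

            // hit.collider is your "hovered" object; apply your own
            // highlight material or scale change here in place of
            // RealityKit's hover effect.
            if (Physics.Raycast(origin, dir, out var hit, maxDistance))
                end = hit.point;

            line.SetPosition(0, origin);
            line.SetPosition(1, end);
        }
    }
}
```

Since you only get eye-gaze data at pinch time (see below in the thread), a visible ray like this is currently the practical way to give continuous hover feedback in VR.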
@vcheung-unity Is there any other way to do this besides the solution above? Eye gaze is one of the unique features of Vision Pro, and a ray-style pointer doesn’t look right on that platform.
You only have access to eye gaze when the user performs an action like a pinch. I’m running into the same problem, but there’s nothing we can do about it because the app runs on Unity’s rendering pipeline rather than Apple’s.
As mentioned, eye tracking (eye gaze) data is only available on the first frame of an input (like a pinch gesture). This is a VisionOS constraint to protect user privacy. Please share feedback with Apple if you would like this to change.
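That first-frame gaze ray can still be used for targeting. The sketch below caches the ray when a pinch begins and raycasts with it. It assumes the spatial pointer device from the com.unity.xr.visionos package; the exact type and control names (VisionOSSpatialPointerDevice, primaryPointer, startRayOrigin, startRayDirection, phase) may differ in your package version, so treat this as a sketch rather than a definitive API reference:

```csharp
using UnityEngine;
using UnityEngine.XR.VisionOS.InputDevices; // from com.unity.xr.visionos (namespace assumed)

// Sketch: on the first frame of a pinch, visionOS reports a gaze-derived
// ray; later frames only track the hand pose. So we read the start ray
// exactly when the pinch begins and use it to pick a target.
public class GazePinchSelector : MonoBehaviour
{
    void Update()
    {
        var device = VisionOSSpatialPointerDevice.current;
        if (device == null) return;

        var pointer = device.primaryPointer; // control name is an assumption

        // The gaze ray is only meaningful on the frame the pinch begins.
        if (pointer.phase.ReadValue() == VisionOSSpatialPointerPhase.Began)
        {
            var origin = pointer.startRayOrigin.ReadValue();
            var direction = pointer.startRayDirection.ReadValue();

            if (Physics.Raycast(origin, direction, out var hit))
                Debug.Log($"Gaze-selected: {hit.collider.name}");
        }
    }
}
```

This gives you gaze-based selection on pinch, but not continuous hover highlighting, which is exactly the privacy constraint described above.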