How does the XR Interaction Toolkit know the eye position (to grab an object)?

Good morning,
I only have the Vision Pro simulator, so how can I tell the Unity XR Interaction Toolkit that the user's gaze is focused on one specific object, so they can interact with it?
In other words:
How does the XR Interaction Toolkit in the Unity editor know how the eye moves on Vision Pro?
How does it know where the user's eye is pointing, so the user can interact with that object?
(Any comments are greatly appreciated, thank you.)

Hi there! Thanks for reaching out about your interest in Unity for visionOS. You can refer to the Input section of the manual for more information about using the XR Interaction Toolkit on visionOS.

In Mixed Reality apps, you can use the XRTouchSpaceInteractor in com.unity.polyspatial.xr to interact with Interactable objects. This interactor uses the pinch/gaze gesture to manipulate objects. You can see an example of this in the package samples from com.unity.polyspatial, specifically the XRI Debug scene.
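To make the setup above concrete, here is a rough sketch of the per-object side of that wiring. This is a hedged illustration, not a definitive recipe: `XRTouchSpaceInteractor` comes from com.unity.polyspatial.xr and `XRGrabInteractable` from the XR Interaction Toolkit, but exact namespaces and component names can vary between PolySpatial and XRI versions, so check the XRI Debug sample scene for the authoritative setup.

```csharp
// Hedged sketch: make an object respond to the visionOS pinch/gaze gesture.
// Assumes com.unity.polyspatial.xr and the XR Interaction Toolkit are installed;
// namespaces/class names may differ slightly across package versions.
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

public class MakePinchGrabbable : MonoBehaviour
{
    void Awake()
    {
        // The object needs a collider so the gaze/pinch input can hit it.
        if (GetComponent<Collider>() == null)
            gameObject.AddComponent<BoxCollider>();

        // XRGrabInteractable lets the object react to select (pinch) events
        // raised by an interactor through the XR Interaction Manager.
        if (GetComponent<XRGrabInteractable>() == null)
            gameObject.AddComponent<XRGrabInteractable>();
    }
}
```

The scene also needs a single XRTouchSpaceInteractor (plus an XR Interaction Manager): visionOS reports the gaze target and pinch state to that interactor, which then dispatches select events to whichever Interactable the user is looking at. Note that the app never sees raw eye data; it only learns which object was targeted when the pinch occurs.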