How to detect eye pupil position and rotation using a camera in Unity

So I am trying to create an environment where the player's task is to classify each object as "friend" or "enemy", firing a bullet at the enemies while letting the friends live. If the player clicks on an object that is currently unknown, an arithmetic problem appears: the right result shows that the object is a friend, and the wrong result shows that it is an enemy. An enemy turns red and subtracts 1 from the score, while a friend turns green and adds 1. The player has a cannon at the bottom of the screen to shoot them with.

The mouse coordinates are displayed on one side.

Now I want to detect the player's gaze point on the screen using a mobile or web camera and display it on the other side, where the (x) and (y) eye-gaze coordinates are visible. At the end, I want to compare the mouse-click coordinates with the eye-gaze coordinates to check whether they are the same.
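For that comparison step, this is roughly what I have in mind — a minimal sketch in which the 50-pixel tolerance and the `SetGazePoint` hook are placeholder assumptions of mine, since the gaze source is exactly the part I still need:

```csharp
using UnityEngine;

// Sketch: compares the last mouse click against the latest gaze sample.
// How gaze samples arrive (SetGazePoint) is the open question below.
public class GazeClickComparer : MonoBehaviour
{
    // Assumed tolerance: treat points within 50 px as "the same".
    public float tolerancePixels = 50f;

    private Vector2 _lastGazePoint;

    // To be called by whatever gaze-tracking component I end up with (hypothetical hook).
    public void SetGazePoint(Vector2 screenPoint)
    {
        _lastGazePoint = screenPoint;
    }

    void Update()
    {
        if (Input.GetMouseButtonDown(0))
        {
            Vector2 click = Input.mousePosition;
            float distance = Vector2.Distance(click, _lastGazePoint);
            bool match = distance <= tolerancePixels;
            Debug.Log($"Click {click}, gaze {_lastGazePoint}, match: {match}");
        }
    }
}
```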

All of this will be done in Unity. Only the gaze-detection part is left. I need help with detecting the player's pupil using the camera; then I will use an algorithm to map it to a gaze point on the screen. I am not sure whether this can be done with AR Foundation, OpenCV, or something else, as I am new to all of them.
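For the mapping algorithm itself, I am thinking of something like a simple linear calibration — a sketch that assumes I can already get a normalized pupil position from whatever tracker I end up with (the calibration values below are made-up placeholders):

```csharp
using UnityEngine;

// Sketch of the gaze-mapping step, assuming a tracker can already give me
// the pupil position normalized within the eye region (0..1 on each axis).
public class GazeMapper : MonoBehaviour
{
    // Pupil positions recorded while the user looks at two calibration
    // points (top-left and bottom-right of the screen) — placeholder values.
    public Vector2 pupilAtTopLeft = new Vector2(0.35f, 0.60f);
    public Vector2 pupilAtBottomRight = new Vector2(0.65f, 0.40f);

    // Linearly remap the current pupil position to screen coordinates.
    // The (1f - ty) flip assumes the pupil's y axis runs opposite to the screen's.
    public Vector2 PupilToScreen(Vector2 pupil)
    {
        float tx = Mathf.InverseLerp(pupilAtTopLeft.x, pupilAtBottomRight.x, pupil.x);
        float ty = Mathf.InverseLerp(pupilAtTopLeft.y, pupilAtBottomRight.y, pupil.y);
        return new Vector2(tx * Screen.width, (1f - ty) * Screen.height);
    }
}
```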

From the AR Foundation docs:

The AR Foundation package contains interfaces for AR features, but doesn’t implement any features itself. To use AR Foundation on a target platform, you also need a separate provider plug-in package for that platform.
The officially supported provider plug-ins cover Android (ARCore), iOS (ARKit), and various XR headsets — that is, mobile and headset platforms only.

There is no support for AR Foundation on Mac or PC, so AR Foundation is likely not what you are seeking.

Note also that this use case is “face tracking”, not necessarily “augmented reality”, as you are not using the face position to render any visual augmentation of the user’s face. You might have better luck searching for face tracking examples and posting in other forums using those terms.

There are many different tools you could use to implement a face-tracking solution. A free tool I have used in the past is FaceOSC by Kyle McDonald: https://facetracker.net/. It runs face tracking as a separate process and sends networked messages using the Open Sound Control (OSC) protocol. To use it you would receive those messages in Unity via an OSC library such as extOSC, which is free on the Unity Asset Store.
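As a rough sketch of the receiving side — assuming FaceOSC's default port (8338, if I remember correctly) and its /pose/position address, both of which you should verify against what your FaceOSC build actually sends — the extOSC code would look something like this:

```csharp
using UnityEngine;
using extOSC;

// Sketch of receiving FaceOSC data in Unity via extOSC.
public class FaceOscReceiver : MonoBehaviour
{
    private OSCReceiver _receiver;

    void Start()
    {
        // Listen on FaceOSC's default port (assumption — check your setup).
        _receiver = gameObject.AddComponent<OSCReceiver>();
        _receiver.LocalPort = 8338;
        _receiver.Bind("/pose/position", OnPosePosition);
    }

    private void OnPosePosition(OSCMessage message)
    {
        // FaceOSC sends the face position as two floats (x, y) in camera pixels.
        float x = message.Values[0].FloatValue;
        float y = message.Values[1].FloatValue;
        Debug.Log($"Face position: ({x}, {y})");
    }
}
```

As far as I recall, FaceOSC also sends per-feature /gesture/... values and, optionally, raw landmark points, which is where I would start looking for eye data.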

Of course there are other ways to do this, but here I have highlighted one free and open-source implementation.

Hope that helps!