Camera limitation for an eye-tracking app

hi,

I've developed a glasses-like AR app, but I had to do a lot of dirty work to get around the limitations of the Camera class. Because the app tracks the user's eyes for rendering, the rendering camera's position no longer aligns with that of the physical camera. I'm wondering whether it would be possible to officially support this kind of application.

Here is the app.

I’m not sure I understand - you are moving the camera based on some other input data? If so then yes this will likely result in misalignment. I think I need a better explanation of what you’re trying to achieve to help here.

Yes, I use the selfie camera to track the user’s left eye.

It creates a motion parallax effect for the brain, making virtual objects appear real. It's a bit difficult to explain, but if you have an iPhone 12, 13, 14, or 15 (including the Pro and Pro Max models), you can try it out.
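For context, the parallax illusion described above is usually achieved with an off-axis ("window") projection: the screen is treated as a fixed window into the virtual world, and the camera frustum is recomputed from the tracked eye position every frame. Here is a rough sketch based on Kooima's generalized perspective projection, not the poster's actual code; all names are mine, and it assumes the camera's rotation is already aligned so it faces the screen plane head-on:

```csharp
using UnityEngine;

// Sketch only: recomputes an asymmetric frustum from the eye position
// relative to three screen corners (lower-left, lower-right, upper-left).
public class OffAxisProjection : MonoBehaviour
{
    public Camera cam;
    public Transform screenLowerLeft, screenLowerRight, screenUpperLeft;

    void LateUpdate()
    {
        Vector3 pa = screenLowerLeft.position;
        Vector3 pb = screenLowerRight.position;
        Vector3 pc = screenUpperLeft.position;
        Vector3 pe = cam.transform.position;   // tracked eye position

        Vector3 vr = (pb - pa).normalized;     // screen right axis
        Vector3 vu = (pc - pa).normalized;     // screen up axis
        // In Unity's left-handed coordinates, Cross(vu, vr) points toward the eye.
        Vector3 vn = Vector3.Cross(vu, vr).normalized;

        Vector3 va = pa - pe, vb = pb - pe, vc = pc - pe;
        float d = -Vector3.Dot(va, vn);        // eye-to-screen distance
        float n = cam.nearClipPlane, f = cam.farClipPlane;

        // Frustum extents on the near plane, scaled from the screen plane.
        float l = Vector3.Dot(vr, va) * n / d;
        float r = Vector3.Dot(vr, vb) * n / d;
        float b = Vector3.Dot(vu, va) * n / d;
        float t = Vector3.Dot(vu, vc) * n / d;

        cam.projectionMatrix = Matrix4x4.Frustum(l, r, b, t, n, f);
    }
}
```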

I think this should help you :slightly_smiling_face:

AR Foundation 3.0 came with support for eye tracking.
See the first few messages of this thread:

I do use ARFace to obtain the positional information of the left eye. However, the rendering camera is always aligned with the physical camera, so I need to create a separate rendering camera and disable rendering for both the selfie camera (used for eye tracking) and the main camera.
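A minimal sketch of what I mean by a separate rendering camera, assuming an AR Foundation 4.x setup where `ARFace.leftEye` exposes the eye pose as a `Transform` (component and field names here are my own):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch only: drives a dedicated rendering camera from the tracked
// left-eye pose instead of the device pose.
public class EyeDrivenCamera : MonoBehaviour
{
    [SerializeField] ARFace face;          // face tracked via the selfie camera
    [SerializeField] Camera renderCamera;  // separate camera used only for rendering

    void OnEnable()  => face.updated += OnFaceUpdated;
    void OnDisable() => face.updated -= OnFaceUpdated;

    void OnFaceUpdated(ARFaceUpdatedEventArgs args)
    {
        // leftEye is null until eye tracking data is available.
        if (face.leftEye != null)
            renderCamera.transform.SetPositionAndRotation(
                face.leftEye.position, face.leftEye.rotation);
    }
}
```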

Did you try disabling the ARCameraManager and ARCameraBackground components? These components are responsible for rendering, which it sounds like you don’t need for your use case.
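Something along these lines, as a sketch — assuming both components sit on the AR camera's GameObject (the script and field names are illustrative):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch only: turns off the AR camera's pose-driving and background
// rendering while leaving the session (and face tracking) running.
public class DisableArCameraRendering : MonoBehaviour
{
    void Start()
    {
        var manager = GetComponent<ARCameraManager>();
        var background = GetComponent<ARCameraBackground>();
        if (manager != null) manager.enabled = false;
        if (background != null) background.enabled = false;
    }
}
```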
