Meta Quest 3 - ARCameraManager and getting light estimation (frameReceived not called)

Hey,

I'm trying to figure out how to get the light estimation at runtime.

I'm using the code below, but FrameChanged never gets called. Is this the correct way of doing it, or am I missing something?

Passthrough and everything is working as expected in a build.

Thanks in advance!

        void OnEnable()
        {
            if (cameraManager != null) {
                Debug.Log("ApplyLightingConditions: OnEnable() Add Action FrameChanged");
                cameraManager.frameReceived += FrameChanged;
            }
        }

        void OnDisable()
        {
            if (cameraManager != null) {
                Debug.Log("ApplyLightingConditions: OnDisable() Remove Action FrameChanged");
                cameraManager.frameReceived -= FrameChanged;
            }
        }

        void FrameChanged(ARCameraFrameEventArgs args)
        {
            Debug.Log("ApplyLightingConditions: FrameChanged()");

            ARLightEstimationData lED = args.lightEstimation;
        }

Check the documentation for Quest in AR Foundation. It's not supported yet.

Hey, thanks. Can you share a link where this is stated?
I was referring to https://docs.unity3d.com/Packages/com.unity.xr.meta-openxr@1.0/manual/index.html, where "Render images from device cameras and perform light estimation." is listed for the camera feature.

Oh, I did not see that.
If it doesn't work while the docs say it should, file a bug report so it can be fixed.

Maybe someone from Unity can confirm whether this is a bug or just not implemented for the Meta Quest 3.

There is also this method, but it likewise returns false in an Android build:

https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@5.1/api/UnityEngine.XR.ARSubsystems.XRCameraSubsystem.TryGetLatestFrame.html#UnityEngine_XR_ARSubsystems_XRCameraSubsystem_TryGetLatestFrame_UnityEngine_XR_ARSubsystems_XRCameraParams_UnityEngine_XR_ARSubsystems_XRCameraFrame__
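For reference, here is a minimal sketch of how that method is typically called (assuming AR Foundation 5.x; the component and field names are illustrative, not from the original post). It polls the camera subsystem directly each frame instead of relying on the frameReceived event:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Sketch: polls XRCameraSubsystem.TryGetLatestFrame every frame.
// On providers that expose no camera frame data, this returns false.
public class PollLatestFrame : MonoBehaviour
{
    [SerializeField] ARCameraManager cameraManager;

    void Update()
    {
        var subsystem = cameraManager.subsystem;
        if (subsystem == null || !subsystem.running)
            return;

        // TryGetLatestFrame requires camera parameters describing the display.
        var cameraParams = new XRCameraParams
        {
            zNear = Camera.main.nearClipPlane,
            zFar = Camera.main.farClipPlane,
            screenWidth = Screen.width,
            screenHeight = Screen.height,
            screenOrientation = Screen.orientation
        };

        if (subsystem.TryGetLatestFrame(cameraParams, out XRCameraFrame frame))
        {
            Debug.Log($"Got frame, timestamp (ns): {frame.timestampNs}");
        }
    }
}
```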

Light estimation for Meta devices in AR Foundation is not currently supported. I can see why there's confusion here; the description that you linked to ("Camera: Render images from device cameras and perform light estimation.") is a standard description for the Camera feature and does not necessarily indicate that the provider you're using (in this case Meta) supports all of its functionality.

To determine whether a specific functionality such as light estimation is supported on a specific provider, you can have your code check the value of the associated property in the XRCameraSubsystemDescriptor.
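As a rough sketch of that check (assuming AR Foundation 5.x; the component name is illustrative), you can read the capability flags off the descriptor exposed by the ARCameraManager:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch: logs which light-estimation capabilities the active camera
// provider reports via its XRCameraSubsystemDescriptor.
public class LightEstimationSupportCheck : MonoBehaviour
{
    [SerializeField] ARCameraManager cameraManager;

    void Start()
    {
        var descriptor = cameraManager.descriptor;
        if (descriptor == null)
        {
            Debug.Log("No camera subsystem descriptor available.");
            return;
        }

        Debug.Log($"Ambient intensity (world tracking): {descriptor.supportsWorldTrackingAmbientIntensityLightEstimation}");
        Debug.Log($"HDR (world tracking): {descriptor.supportsWorldTrackingHDRLightEstimation}");
        Debug.Log($"Average brightness: {descriptor.supportsAverageBrightness}");
        Debug.Log($"Average color temperature: {descriptor.supportsAverageColorTemperature}");
    }
}
```

On a provider that does not support light estimation, you would expect all of these flags to be false.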

As for the issue with frameReceived not being called, I would recommend taking a look at the script for the Basic Light Estimation scene in our samples project and comparing your code against that.


Thanks for the clarification.

I checked the frameReceived issue, and it is also not called with this scene: https://github.com/Unity-Technologies/arfoundation-samples/tree/main/Assets/Scenes/Camera/LightEstimation. So it seems there is an issue here.

And for the last question: is there any ETA for if and when these features will be available for Meta Quest 3 devices?

Thanks

Ah, it looks like you’re correct that frameReceived is not invoked on Meta. It seems we chose not to invoke the frameReceived event on Meta because Meta does not give us access to the camera pixels (so there wouldn't be any useful data to work with). We don’t mention anywhere in our documentation that frameReceived is not invoked on some platforms, so I’ll look into this further to see if we can either clarify this in the documentation or find another solution that makes sense.

[quote]
And for the last question: is there any ETA if and when these features will be available for meta quest 3 devices?
[/quote]

Meta does not (currently) support light estimation, so it’s a platform limitation for us. We can’t implement it in AR Foundation until Meta implements it on their platform first. So I do not have an ETA for you.


Is there any news on the possibility of implementing some sort of light estimation on Meta Quest 3, in order to keep lighting coherent between the real world and virtual objects?

Answered:

https://discussions.unity.com/t/921144 page-4#post-9825381


[quote="andyb-unity, post:11, topic:935100"]
Answered:

https://discussions.unity.com/t/921144 page-4#post-9825381
[/quote]
Thank you, I missed it. So it's still on OpenXR to implement this feature? Does the Meta XR All-in-One SDK provide it? Or is it an actual limitation of the device that does not allow light estimation?

At Unity, we don't know what Meta's device may or may not be capable of. What we can say is that Meta does not currently expose an API for light estimation. Unity can't give you any data that we don't have access to ourselves.
