Face Expression Action for eye open/close not working

Hi there

As the title says, I am unable to get either of the eyes’ Close or Open events to trigger when running on an Android device.

It works fine when simulated in the editor’s Device View, but not after building to the device.

The smile engaged/disengaged Expression Actions do work, however.

My project can be located here:
https://github.com/Petrie65/unity-mars-build-fail
It is an empty project with MARS that swaps out game objects on the eye/mouth Landmarks based on Face Expression Actions.

Any advice would be appreciated. Thanks

Hello @Stygian65,

Sometimes face expression actions will need some tweaking to make them work. You can do that by modifying settings in the ARCoreFacialExpressionSettings asset.

To do that, you will need to move that package into your project; so try this:

  • Navigate to “Packages/Unity MARS AR Foundation Providers” in the Project window
  • In the Project window, right click → “Show in Explorer” to bring up the location of the package
  • Move the package contents (“com.unity.mars-ar-foundation-providers...”) to the “Packages” folder of the project
  • Return to the Unity editor
  • Navigate to the asset “ARCoreFacialExpressionSettings” and modify the expression parameters.

After doing this, the asset should be editable for you to modify.


Hi @jmunozarUTech

Thanks for getting back to me. I have done as you suggested. I also had to remove mars-ar-foundation-providers from the Package Manager, as there were now duplicate scripts. Unfortunately, after building with these changes, it seems that MARS is not working at all on the device.

I wrote a script that listens to MARSFaceManager’s FaceUpdated events to see what values MARS reports for the facial expressions. On-device, I can drive all the values with my face except for the eyes, which are frozen at 0. When I run in the Editor, though, the eye values are clearly being detected.

Hello @Stygian65 ,

If you moved (not copied) the package, it shouldn’t affect the project’s behavior at all, and you shouldn’t end up with duplicated folders. Can you try embedding the package instead?

Embedding puts the package inside your project, so you can change the asset I mentioned.
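For reference, an alternative to moving the folder by hand is to declare the package as a local dependency in Packages/manifest.json. The path below is a hypothetical example of mine, not something MARS prescribes; point the file: reference at wherever your copy of the package folder actually lives, and match the exact folder name from your install:

    {
      "dependencies": {
        "com.unity.mars-ar-foundation-providers": "file:../LocalPackages/com.unity.mars-ar-foundation-providers"
      }
    }

Unity treats a file: dependency as a local, mutable package, so its assets become editable in the Project window much like an embedded package.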

Ok, thank you. I was able to move the package so that I could modify the ARCoreFacialExpressionSettings and successfully build to the device.

I have tried a bunch of different configurations, but I am still unable to get the LeftEyeClose or RightEyeClose to reflect anything other than 0.

Something else worth mentioning: in order to get the Android build to succeed, I had to remove the Unity Content Manager and Unity MARS Companion Core preview packages from the project. I don’t think this should affect the core functionality, though (for reference, the IL2CPP error was: Failed to resolve assembly: 'netstandard, Version=2.0.0.0...').

Hi @jmunozarUTech

I’ve been trying to get this working, but unfortunately made no progress. Do you perhaps have some advice on how I could further debug this?

I am really baffled by what could cause this, especially since the rest of the expressions are working. I’d imagine the method used for detecting eyes closed/open relies on the same tech stack that recognizes the rest of the expressions.

Do you maybe know if other MARS users have successfully built this to an Android device? I would think so because it seems like a pretty common use case.

On a related note, I also tried deploying to an iPhone 6S Plus and could only see the rear-facing camera with no face-tracking functionality. I believe that is because it doesn’t have a forward-facing depth camera, so it would only work on an iPhone X or newer, is that right?

Hello @Stygian65 ,

We are looking into this; it seems this is not a configuration issue.

On the other hand, iOS face tracking should just work; it does rely on the depth camera for face tracking, but it will not work with the rear camera, so you will need to use the front camera for face tracking.

Thank you for looking into it.

Today we deployed to an iPhone X and it worked, including the eye expressions. The same build is still not working on the iPhone 6S Plus, though; I suppose some older-generation devices might not be able to run it.

We also tried two different Android platforms; in total, we have tried devices from Huawei, Samsung, and Xiaomi, all with the same issue.

For more context, the relevant versions we are using are the following:

  • Unity 2020.3.6f1 (LTS)
  • AR Foundation, AR Subsystems, ARCore, ARKit, AR Face Tracking - 4.1.7
  • MARS, MARS AR Foundation Providers, MARS Nav Mesh - 1.3.1

Hi @Stygian65 , thank you for bringing this issue to our attention. We have looked into it and confirmed that the eye close expressions are not supported on Android. We are working on both fixing this issue and exposing the expression settings in our 1.4 release.


Thanks @amydigiov_Unity

I’ll be looking forward to the 1.4 release.

Is there a roadmap for when 1.4 will be released with the fix for this issue? Thank you!

Hello,
I’ve updated to 1.4.1 and can confirm that files such as “ARCoreFacialExpressionSettings” are in a more accessible location to modify, and that eyes-closed detection is “working” for Android.
It is finicky, however. If the user’s device is tilted back or forward, it shifts the values at which the eyes are considered closed. I am currently looking into using the device orientation to offset this value.

If anyone who finds this thread is wondering how I’m reading the eyes’ closed float values, here’s some code to get you started:

// Namespaces below match MARS 1.4; they may differ slightly in other versions.
using TMPro;
using Unity.MARS.Data;
using Unity.MARS.Providers;
using Unity.XRTools.ModuleLoader;
using UnityEngine;

public class FacialFeaturesReader : MonoBehaviour, IUsesFaceTracking, IUsesCameraOffset
{
    [SerializeField] private TextMeshProUGUI dbgEyesText = null;

    IProvidesFaceTracking IFunctionalitySubscriber<IProvidesFaceTracking>.provider { get; set; }
    IProvidesCameraOffset IFunctionalitySubscriber<IProvidesCameraOffset>.provider { get; set; }

    private void OnEnable()
    {
        // Subscribe to the MARS face-tracking events
        this.SubscribeFaceAdded(OnFaceAdded);
        this.SubscribeFaceUpdated(OnFaceUpdated);
        this.SubscribeFaceRemoved(OnFaceRemoved);
    }

    private void OnDisable()
    {
        this.UnsubscribeFaceAdded(OnFaceAdded);
        this.UnsubscribeFaceUpdated(OnFaceUpdated);
        this.UnsubscribeFaceRemoved(OnFaceRemoved);
    }

    private void OnFaceAdded(IMRFace face)
    {
        Debug.Log(face.Expressions.Count + " expressions found for facial tracker");
    }

    private void OnFaceUpdated(IMRFace face)
    {
        // Display the raw 0-1 eye-closed coefficients each time the face updates
        if (dbgEyesText != null)
        {
            dbgEyesText.text = $"Left Eye: {face.Expressions[MRFaceExpression.LeftEyeClose]}" +
                $"\n\nRight Eye: {face.Expressions[MRFaceExpression.RightEyeClose]}";
        }
    }

    private void OnFaceRemoved(IMRFace face)
    {
    }
}
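As for the device-tilt issue mentioned above, here is a rough sketch of one way to compensate. Everything in it is my own untested idea: the base threshold, the correction factor, and the assumption that Camera.main is the AR camera are all placeholders to tune:

    // Sketch: adjust the "eye closed" threshold based on how far the device is pitched.
    // All constants are hypothetical starting points, not values from MARS.
    static float GetEyeClosedThreshold(float baseThreshold)
    {
        var forward = Camera.main.transform.forward;
        // Angle between the camera's forward vector and its projection onto the
        // horizontal plane, i.e. how far the device is tilted up or down (degrees).
        var pitch = Vector3.Angle(forward, Vector3.ProjectOnPlane(forward, Vector3.up));
        // Hypothetical linear correction; tune the factor per device.
        return Mathf.Clamp01(baseThreshold + pitch * 0.002f);
    }

It could then be used in OnFaceUpdated along the lines of:

    bool leftClosed = face.Expressions[MRFaceExpression.LeftEyeClose] > GetEyeClosedThreshold(0.5f);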