Hi, I want to create a standalone app that uses the front camera to track the user's face and uses that data to animate a model (the model has blendshapes). I don't want two separate apps where one captures and streams data to the other. Here is an example video.
I want to do this on both Android and iOS. Let me know if this is possible using Unity for either or both platforms.
I am open to using any existing commercial plugin/asset, such as OpenCV or Dlib.
If this is not possible using Unity, kindly guide me on what tech I would need for this.
Thanks, I did manage to do it for iOS using face tracking with ARKit (52 blendshapes). I am now attempting to achieve the same quality on Android using ARCore. Any tips would be appreciated.
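For anyone following along, here is a minimal sketch of the ARKit half in Unity, assuming AR Foundation with the ARKit XR Plugin, and assuming the model's blendshape names match the `ARKitBlendShapeLocation` enum names (that naming match is an assumption about your rig; in practice you often need a mapping table). The component name `BlendShapeDriver` and the `targetMesh` field are illustrative, not from the original post.

```csharp
// Sketch: drive a SkinnedMeshRenderer's blendshapes from ARKit's 52
// face-tracking coefficients via AR Foundation. Attach to the ARFace prefab.
using Unity.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARKit;

[RequireComponent(typeof(ARFace))]
public class BlendShapeDriver : MonoBehaviour
{
    [SerializeField] SkinnedMeshRenderer targetMesh; // the rigged model to animate

    ARFace face;
    ARKitFaceSubsystem faceSubsystem;

    void Awake() => face = GetComponent<ARFace>();

    void OnEnable()
    {
        // The face manager's subsystem is ARKit-specific on iOS.
        var faceManager = FindObjectOfType<ARFaceManager>();
        faceSubsystem = (ARKitFaceSubsystem)faceManager.subsystem;
        face.updated += OnFaceUpdated;
    }

    void OnDisable() => face.updated -= OnFaceUpdated;

    void OnFaceUpdated(ARFaceUpdatedEventArgs args)
    {
        using (var coefficients =
            faceSubsystem.GetBlendShapeCoefficients(face.trackableId, Allocator.Temp))
        {
            foreach (var c in coefficients)
            {
                // ARKit coefficients are 0..1; Unity blendshape weights are 0..100.
                int index = targetMesh.sharedMesh.GetBlendShapeIndex(
                    c.blendShapeLocation.ToString());
                if (index >= 0)
                    targetMesh.SetBlendShapeWeight(index, c.coefficient * 100f);
            }
        }
    }
}
```

On Android the same component structure works, but ARCore's face subsystem does not expose blendshape coefficients, so the `OnFaceUpdated` body would instead have to derive weights from a third-party tracker's output.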
Unfortunately, given the much wider variance of Android devices, it's harder to get the same quality of results from ARCore that you get out of ARKit for the same features. This is true across the board, not just for face tracking.
ARCore doesn't provide the same fidelity of face tracking data. As you indicated in your first post, you'll likely need to supplement ARCore with a third-party face tracking library to get close to ARKit's quality.
Unity employees generally can't give advice on third-party setups, but as you say, OpenCV is often a good place to start.