ARKit Remote now supports Face Tracking!

Support has been added to the Bitbucket repo; an Asset Store update is coming soon.
Blog post with full details also coming soon, but in the meantime here’s a taste: x.com

Neat - I wish this had been released about a week and a half ago, before I basically built my own to test avatar puppeteering with 2D cartoon avatar faces. It's not live streaming, just recording and saving face data as animation; I ended up creating my own format to record blendshapes for later playback.

It has runtime support for reading/decoding the data! GitHub - faced-io/FaceAvataaars: Puppeteer Avataaars with your iPhone X (and also test cartoons using BlendShapeRecorder data)
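
Roughly, the recording side looks something like this in Swift (sketch only; the `BlendShapeFrame` type and the JSON layout are illustrative placeholders, not the actual BlendShapeRecorder format):

```swift
import ARKit

// Records ARKit blendshape coefficients per frame for later playback.
// The frame struct and file layout here are placeholder assumptions.
struct BlendShapeFrame: Codable {
    let timestamp: TimeInterval
    // Keys are ARFaceAnchor.BlendShapeLocation rawValues, e.g. "jawOpen",
    // "eyeBlinkLeft"; values are coefficients in 0...1.
    let coefficients: [String: Float]
}

final class BlendShapeRecorder: NSObject, ARSessionDelegate {
    private(set) var frames: [BlendShapeFrame] = []

    // Called every frame while an ARFaceTrackingConfiguration session runs.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let faceAnchor = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        var coeffs: [String: Float] = [:]
        for (location, value) in faceAnchor.blendShapes {
            coeffs[location.rawValue] = value.floatValue
        }
        frames.append(BlendShapeFrame(timestamp: Date().timeIntervalSince1970,
                                      coefficients: coeffs))
    }

    // Serialize the recording so it can be decoded at runtime for playback.
    func save(to url: URL) throws {
        let data = try JSONEncoder().encode(frames)
        try data.write(to: url)
    }
}
```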


Niiice!

I also ended up writing my own face mesh streaming, last-minute for the Neon Contest.

Ran out of time: x.com
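
For anyone curious, a rough sketch of what streaming the face mesh can look like with ARKit (the packet layout and the `send` transport closure are placeholders, not the exact approach used above):

```swift
import ARKit

// Packs the ARKit face mesh vertices into a binary packet each frame and
// hands it to whatever transport you plug in (UDP socket, WebSocket, etc.).
final class FaceMeshStreamer: NSObject, ARSessionDelegate {
    // Placeholder transport hook; wire this up to your own networking code.
    var send: (Data) -> Void = { _ in }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let faceAnchor = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        let vertices = faceAnchor.geometry.vertices   // simd_float3 positions

        // Packet layout (assumed): [UInt32 vertex count][x, y, z Float32 triples...]
        var packet = Data()
        var count = UInt32(vertices.count).littleEndian
        withUnsafeBytes(of: &count) { packet.append(contentsOf: $0) }
        for v in vertices {
            for component in [v.x, v.y, v.z] {
                var bits = component.bitPattern.littleEndian
                withUnsafeBytes(of: &bits) { packet.append(contentsOf: $0) }
            }
        }
        send(packet)
    }
}
```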