Thought some people here might find this interesting:
Earlier this week I participated in the Global XR Hack (Cologne), where my team and I built a prototype of a Mixed Reality language-learning app. To see the full video of it in action, check one of the links below.
We implemented Handwritten Text Recognition with Sentis, using a model from the deep-text-recognition-benchmark repo. Object detection was done with Meta's Scene Understanding API. For input, the app uses a stylus: the new Logitech MX Ink for Meta Quest devices.
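For anyone curious what the recognition step looks like under the hood: the CTC-based variants in the deep-text-recognition-benchmark repo output per-timestep character probabilities, which are turned into text by collapsing repeats and dropping blanks. Here is a minimal greedy-decoding sketch in Python, assuming a CTC model with blank at index 0 (the charset and toy input are made up for illustration):

```python
import numpy as np

# Hypothetical charset for illustration; '-' stands in for the CTC blank (index 0).
CHARSET = "-abcdefghijklmnopqrstuvwxyz"

def ctc_greedy_decode(logits: np.ndarray) -> str:
    """Greedy CTC decoding: pick the best class per timestep,
    collapse consecutive repeats, and drop the blank token."""
    best = logits.argmax(axis=1)          # most likely class at each timestep
    chars = []
    prev = -1
    for idx in best:
        if idx != prev and idx != 0:      # skip repeats and blanks
            chars.append(CHARSET[idx])
        prev = idx
    return "".join(chars)

# Toy example: timesteps predicting "h h - e l l - l o" decode to "hello"
steps = [8, 8, 0, 5, 12, 12, 0, 12, 15]
logits = np.zeros((len(steps), len(CHARSET)))
logits[np.arange(len(steps)), steps] = 1.0
print(ctc_greedy_decode(logits))  # hello
```

In the app, the same decode runs on the model's output inside Unity (via Sentis tensors rather than NumPy), but the logic is identical.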
It’s a simple app concept, but we did win 1st place in the “Best Use of MR Stylus” category!
Here is a preview of just the text recognition part:
