Using the MNIST Handwritten Digit Recognition model for object spawning

I am working on a project that uses the handwritten digit recognition model from the example in the documentation to spawn objects within a game environment. The idea is to scan object textures for specific handwritten numbers, such as the number 7 (so basically drawing 7s on a wall texture), and then place objects at the identified locations within the texture.

Is it feasible to use the handwriting recognition model for this purpose? If so, any guidance or insights on how to structure the implementation would be much appreciated.

There are lots of ways to do this. Off the top of my head, this is what I’d do:

For writing on the wall:
Use a raycast to detect when the user starts to draw on the wall. Instantiate a new texture of the same size as the wall texture. Use the UV coordinates from the RaycastHit (RaycastHit.textureCoord, which requires a MeshCollider) to draw into this new texture. (You might use a separate drawing library to do the actual drawing.) Then use a shader to blend the two textures together so the writing shows up on the wall.

For the digit recognition:
You need to crop the drawn region out of the texture and convert it to a 28x28 grayscale image before passing it into the model.
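As a rough sketch of that preprocessing step, here is what the crop-to-model-input conversion might look like in Python with NumPy. This assumes a typical MNIST-style model that expects a 28x28 grayscale input normalized to [0, 1] with white digits on a black background; the function name and the nearest-neighbour resize are just illustrative choices, and a real project would do the equivalent on the GPU or with a proper image library:

```python
import numpy as np

def preprocess_for_mnist(rgb_patch):
    """Convert a cropped RGB texture patch (H, W, 3) uint8 array
    into the 28x28 grayscale float input a typical MNIST model expects."""
    # Luminance conversion using standard Rec. 601 weights.
    gray = rgb_patch @ np.array([0.299, 0.587, 0.114])

    # Nearest-neighbour resize down to 28x28 (keeps this sketch
    # dependency-free; a real project would use a proper resampler).
    h, w = gray.shape
    rows = np.arange(28) * h // 28
    cols = np.arange(28) * w // 28
    small = gray[rows][:, cols]

    # MNIST digits are white-on-black; invert if the drawing is
    # dark-on-light, then normalize to [0, 1].
    small = 255.0 - small
    return (small / 255.0).astype(np.float32)

# Example: a dummy 100x120 all-white patch, as if cropped from the wall texture.
patch = np.full((100, 120, 3), 255, dtype=np.uint8)
x = preprocess_for_mnist(patch)
print(x.shape)  # (28, 28)
```

The inversion step matters in practice: MNIST training data is white ink on black, so dark strokes on a light wall texture will classify poorly unless flipped first.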

Apparently there is a sample project coming out in September that is closely related to this scenario. :thinking: