I'm curious whether Sentis can be (or will eventually be able to be) "trained" on data I feed it, where a player can type keywords to it and it will respond appropriately. I would like to teach the model by typing to it, and eventually use the in-game "lore terminal" offline, where the user's Unity runtime instance may not have an internet connection.
I'm also curious whether I could create methods where a player might type "open the pod bay door" and the AI model might naturally type back "which one?"
Think of MUTHR from the Alien film. Just a spitball question, and I think the answer will be yes at some point, but I'm wondering whether you can even go about something like this right now with the 2023 beta, or whatever the latest Sentis release is.
Training in the Editor is currently not available, but you can train the desired model using something like PyTorch. However, we don't yet have a tokenizer, though I've seen some C# tokenizer projects on GitHub. Besides that, I think it should be possible.
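To make the tokenizer point concrete: a tokenizer just maps text to the integer ids a model expects. Here is a minimal word-level sketch in Python for illustration only (real tokenizers for the C# projects mentioned above use subword schemes like BPE or WordPiece, which are more involved); the function names are my own, not from any library.

```python
def build_vocab(corpus):
    # Assign each unique lowercase word an integer id; 0 is reserved for unknowns.
    vocab = {"<unk>": 0}
    for sentence in corpus:
        for word in sentence.lower().split():
            if word not in vocab:
                vocab[word] = len(vocab)
    return vocab

def tokenize(text, vocab):
    # Map each word to its id, falling back to <unk> for words not seen in training.
    return [vocab.get(word, 0) for word in text.lower().split()]

vocab = build_vocab(["open the pod bay door"])
ids = tokenize("open the door", vocab)
```

The same id mapping used at training time in PyTorch would need to be reproduced on the C# side at runtime, which is why a matching tokenizer implementation matters.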
Can you name any of the GitHub projects that work with Unity? I've been looking and can't find anything except the Hugging Face one, which requires an internet connection.
We put out our sample tutorials on GitHub.
Sometimes they are accompanied by a video tutorial.
If you have any tutorial you’d want covered please let us know!
We have a few new examples on Hugging Face now. For your purposes you might like to try MiniLM, which compares sentences to see how similar they are. For example, you can compare the user's sentence to "Open the pod bay doors" and, if the similarity reaches a certain threshold, reply with "Which one?". For more in-depth conversations you'd probably want to use an LLM. We have a small one called Tiny Stories. It is mostly trained on stories, though, so you could feed it a line such as "The computer opened the pod bay doors" and see what happens.
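The thresholding logic around MiniLM is simple enough to sketch. Models like MiniLM turn each sentence into an embedding vector, and you compare vectors with cosine similarity. Here is a minimal Python sketch of that comparison step, assuming you already have embeddings from the model; the toy 2-element vectors, the `match_intent` helper, and the 0.8 threshold are illustrative placeholders, not values from Sentis or MiniLM (real MiniLM embeddings have hundreds of dimensions).

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two embedding vectors: dot product over norms.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_intent(user_embedding, known_phrases, threshold=0.8):
    # known_phrases: dict mapping phrase text -> its precomputed embedding.
    # Returns the best-matching phrase, or None if nothing clears the threshold.
    best_phrase, best_score = None, -1.0
    for phrase, emb in known_phrases.items():
        score = cosine_similarity(user_embedding, emb)
        if score > best_score:
            best_phrase, best_score = phrase, score
    return best_phrase if best_score >= threshold else None

# Toy example with made-up 2-d embeddings:
known = {"Open the pod bay doors": [1.0, 0.0]}
hit = match_intent([0.9, 0.1], known)    # similar direction -> matches
miss = match_intent([0.0, 1.0], known)   # orthogonal -> no match
```

In a game you would precompute embeddings for each known command once, then embed the player's input each time and dispatch on the best match, replying with "Which one?" or similar when a command matches.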