RaptorAI Multi-Purpose AI Engine - States, Behaviour trees, Senses, Navigators, Waypoint Systems +++

Tested on Unity 5.5, 2017 and 2018

RaptorAI Multi-Purpose AI (RAMPA), also known as RaptorAI 2, is an artificial intelligence engine that simplifies the process of creating AI for your game. The system is built with flexibility and customizability in mind and can therefore be used for a variety of different purposes. It is centred around AI agents that use states, behaviour trees or a mixture of both along with navigators and senses (sensors) in order to create a complete AI experience. This is what defines a multi-purpose AI engine.
You can check out the asset store page here: Multi-Purpose AI Engine | Behavior AI | Unity Asset Store

Included in the package:

Finite State Machine that uses states (C# scripts) to control AI behaviour.

Behaviour Tree System for visual scripting using behaviour trees to control AI behaviour. Easily debug behaviour trees by replaying ticks and using the Debug node.

2 Navigators: Navmesh Navigator for using Unity’s built-in navmesh system and Astar Navigator for using Aron Granberg’s A* Pathfinding Pro. Switching between navigators requires no changes to existing states or behaviour trees. Simply change navigators with the click of a button!

4 Senses (sensors): Visual Sense for detecting Visual Objects with line-of-sight functionality, Audio Sense for detecting sounds like footsteps, and Scent Sense for detecting “smells” if you are making animal AI or something similar. In addition, there is a Fast Camera Sense that is faster than the Visual Sense for detecting objects (with line-of-sight as well).

Simple or Advanced Waypoint systems that can be used for anything from simple patrol routes to complex maze systems. Caching is possible for advanced systems to avoid expensive re-calculations.

Animation Profiles for Mecanim that set various Mecanim variable values to support AI animation.

Auto Slope Module that automatically rotates the agent to match the terrain slope. Supports both terrain data and raycast mode for calculating the normal.

AI Memory and AI Database for storing values for easy retrieval. These values can be inspected during runtime and traced to find the script that added the value, which results in extensive debugging capabilities.

Extremely flexible as you can create your own navigators, senses / objects, behaviour tree nodes, modules and presets!

Platform independent: Not tested on platforms other than Windows, but the engine is built upon Unity’s engine and pure C# code, so it should support all platforms Unity supports. Contact me if it doesn’t.

Presets that contain functions that let you do calculations and speed up development.

AI Tags that can be used as an alternative to Unity’s tag system.

Full source code included along with summaries, tooltips and several example scenes. Full documentation is of course also provided.

Basic Group System that lets you define groups and attitudes towards other groups. Currently only limited support for this.

Screenshots and videos can be found at the asset store page. Contact me if you have any questions.

Official website: http://flamingraptor.com
Old RaptorAI 1 thread can be found here: RaptorAI - Powerful AI Scripting Motor for Programmers (states, advanced waypoint systems, senses++)

Is there an evaluation version of this asset?

No, there is not an evaluation version available. Let me know if you have any questions or doubts that you want me to clarify.

Are there any examples of how to build my own behaviour nodes and navigators? I need some flying and swimming NPCs. Can I use coroutines in that stuff?
Is your Visual Sense camera mode 3d or flat?


Hi Franky!
Not long ago I updated the documentation and it now includes how to create your own navigators / behaviour nodes. You can check out the documentation at the website (refer to chapter 4 - extending the engine). The documentation is designed for 2.01. Update: v2.01 has now been released and is available for download.
Full source code is also provided, so you can also check out the navigators / nodes that come with the package.
Coroutines are not supported as the engine uses its own update system.
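In case it helps, the usual workaround for a missing coroutine is a plain timer that accumulates delta time inside whatever update callback the engine provides. This is a rough sketch only; the `OnUpdate` hook and its signature are hypothetical, not the engine’s actual API:

```csharp
// Instead of "yield return new WaitForSeconds(2f)" in a coroutine,
// accumulate elapsed time manually in the engine's own update cycle.
public class WaitTimer
{
    float timer;
    readonly float delay;

    public WaitTimer(float delay) { this.delay = delay; }

    // Hypothetical update hook; call this from the engine's real callback,
    // passing the frame's delta time (e.g. Time.deltaTime).
    public bool OnUpdate(float deltaTime)
    {
        timer += deltaTime;
        if (timer < delay)
            return false; // still waiting
        timer = 0f;
        return true;      // delay elapsed, proceed with the behaviour
    }
}
```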
The Visual Sense camera is 3D. It basically creates a collider based on a normal Unity camera, so it will detect anything within what you can see in that particular camera. Line-of-sight is of course also included.

Let me know if you have any further questions.

Hi Chris,
I bought your asset. Where do I have to put a custom navigator script so that it appears in the controller’s navigator list?

Hi Franky.
Thank you for showing interest in my asset.

You can put it wherever you want it, as long as you remember to update the navigator cache (Other tab → Scan for scripts). It scans the entire assets folder, so it doesn’t matter where you put it.

  • Ruben

Hi Ruben,
can I use Raptor only with A* Pathfinding Pro or with the free version as well?

The Astar Navigator works with both the free and pro version.

Hi Ruben,

how can I access the AI Memory from a sense?

Hi Franky,

If you are creating your own custom sense, there is currently no built-in variable that gives you access to the AI Controller. If you need access to the memory from your custom sense, you can use “GetComponentInParent<AIController>().aiMemory”.
I will add an “aiController” variable that can be used within senses / objects in an upcoming update.

In addition to this, creating custom senses and objects will be included in the upcoming documentation update as well.
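To illustrate the workaround, here is a minimal sketch of a custom sense fetching the memory from a parent object. The `AIController` type name is an assumption based on this thread, and the real sense base class may differ, so check the included source for the exact names:

```csharp
using UnityEngine;

// Hypothetical custom sense; the real base class and its override
// points may differ from this sketch.
public class MyCustomSense : MonoBehaviour
{
    void Start()
    {
        // The AI Controller sits on a parent object, so until a dedicated
        // "aiController" variable exists, fetch it via the hierarchy:
        var controller = GetComponentInParent<AIController>();
        if (controller != null)
        {
            var memory = controller.aiMemory; // store / read values here
        }
    }
}
```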

I’m making AI for a shark where I can’t use navmesh, so I’ve written my own navigator which moves a Rigidbody in the top object. If I then add your Visual Sense to my shark and use the camera mode, the pivot moves somewhere in front and the movement gets confused. So I wrote my own visual sense.
There should be a way to change the navigator’s speed.

Hi Franky,
Could you please share some screenshots of the AI Controller (navigators and components tab), structure in the hierarchy and the actual scene view?
If you could also provide the code you used, that would be helpful.
I will help you fix your problem to the best of my ability. As far as I know, there shouldn’t be a need to create your own visual sense.

Did it already and it works like a charm!

The problem was that my navigator controls a rigidbody in the root object:
[screenshot]

When I add your Visual Sense, everything works fine as long as I don’t use the camera (colliders are no problem); the pivot stays in the center of my shark:
[screenshot]

But when I switch to camera mode, the pivot moves in front of my shark:
[screenshot]
Due to the longer leverage, the shark is rotated up/down, which drives my collision detection crazy.
But no worries, with my own sense solution it works well.

Thanks for your support. BTW, a very nice, well-designed asset :)


I see what your problem was now. I’m curious though, did the problem happen when rotating it through a script or through the editor? I tried replicating your problem and I see that it changes the pivot as long as you have the pivot mode set to ‘center’ (see picture below).

Rotating the agent using this mode results in a rotation around the ‘center of mass’ and that is (obviously) not what you want. However, if you change the pivot mode to ‘pivot’, the pivot is back at the original place again (see picture below).

After doing that, the rotation works fine in the editor.

That being said, when I tried rotating the agent using a script, it used ‘pivot’ mode for rotation even though my editor mode was set to ‘center’. So, unless you tried rotating it through the editor, this shouldn’t be a problem.
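To make the distinction concrete: rotating through code always pivots around the transform’s own origin; the Center/Pivot toggle only affects the editor handles, not scripted rotation:

```csharp
using UnityEngine;

public class PitchExample : MonoBehaviour
{
    void Update()
    {
        // Rotates around transform.position (the object's pivot), not the
        // renderer-bounds center, regardless of the editor's
        // Center/Pivot toggle.
        transform.Rotate(30f * Time.deltaTime, 0f, 0f);
    }
}
```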
Anyway, I’m glad you found a solution for it.

And by the way, I appreciate that you like my asset :)

I found one issue: when you change things like states or navigators, the old ones often stay in the AI Settings folder. At one point my navigator stopped working and Unity even started hanging without any changes in my code. The reason turned out to be that I had added [RequireComponent(typeof(Rigidbody))] to my navigator, and your asset then added another Rigidbody to the AI Settings folder. Maybe you should take a look at this.

Hi Ruben,

could you add an int value to the animation profiles for conditions and actions? That would make it possible to run animations per index (e.g. sub-states with multiple idles or attacks).

I would not recommend using the RequireComponent attribute on your navigator, as it adds the Rigidbody to the AI Settings gameobject like you said. Right now, there is a built-in function that you can override in senses / objects that replaces Unity’s “RequireComponent” attribute. This is not supported for navigators at this point, but I can make it supported in the next update. This would mean that it automatically adds your Rigidbody to the top parent instead of the AI Settings object.
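Until that update is out, one workaround is to drop the attribute and attach the Rigidbody to the top parent yourself. A sketch only, with `MyNavigator` standing in for your own navigator class (the real navigator base class may differ):

```csharp
using UnityEngine;

// Avoid [RequireComponent(typeof(Rigidbody))] here, since the navigator
// component lives on the AI Settings object, not on the agent's root.
public class MyNavigator : MonoBehaviour
{
    Rigidbody body;

    void Awake()
    {
        // Reuse an existing Rigidbody on the top parent, or add one there.
        var root = transform.root;
        body = root.GetComponent<Rigidbody>();
        if (body == null)
            body = root.gameObject.AddComponent<Rigidbody>();
    }
}
```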

When you switch navigators / states, they are not automatically deleted from the AI Settings gameobject. You can delete all unused navigators / states by clicking the “Clear AI Settings for unused states” button in the Other tab. The reason this doesn’t happen automatically is that if you want to switch back again, you do not need to enter all the values again; if they are deleted, all values are reset. In an upcoming update, I can add the ability to choose whether or not this happens automatically and let you decide which option suits you best.

Yes, I can add an int value to the animation profiles.

Hi Ruben,
could you add a function to change the navigator speed in the next version? It would also be nice if one could use another action set from the same profile as a fallback action.

Do you mean another function for navigators, so that you can call “MyNavigator.ChangeSpeed” or something similar?

Sure, I’ll see what I can do.
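In the meantime, a custom navigator can expose something like this itself. A hypothetical sketch, not the package’s actual API:

```csharp
using UnityEngine;

public class MyNavigator : MonoBehaviour
{
    [SerializeField] float speed = 3f;

    // Callable from states / behaviour tree nodes,
    // e.g. myNavigator.ChangeSpeed(5f);
    public void ChangeSpeed(float newSpeed)
    {
        speed = Mathf.Max(0f, newSpeed);
    }
}
```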