Basically, I can generate the base navmesh since the level surface is always known in the editor; but if the user decides to place a wall, for example, how does the navmesh AI behave?
I heard that navmesh generation at runtime is planned for release sometime this year, but in the meantime, how do you handle cases where the user adds geometry and obstacles to a level, and have the AI agent recognize this and act accordingly?
I suspect that if I add an element at runtime, like a room with colliders on the walls, the colliders will stop the agent from passing through, but the agent won’t have any context about whether there is (or isn’t) a door to get into the room.
I believe you need to place NavMesh Obstacle components on any object that you want the agent to walk around (walls, tables, trees, buildings, etc.), as I’m pretty sure agents will walk right through plain colliders.
So, for example, if you were to make a runtime-built city, it would have to be on a flat plane with set dimensions. You would bake your navigation on that base plane, then put NavMesh Obstacles on all the buildings, trees, and fences of the prefabs to be placed. That would all work with a runtime-style build.
For large obstacles you need to turn on carving, or the agent will resolve a path through an area that might not be valid.
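A minimal sketch of that setup, assuming Unity's built-in `NavMeshObstacle` component (the `RuntimeObstacleSetup` helper name is hypothetical, and this only runs inside the Unity engine):

```csharp
using UnityEngine;
using UnityEngine.AI;

// Hypothetical helper: when the player places a prefab at runtime,
// attach a carving NavMeshObstacle so agents path around it instead
// of just bumping into its colliders.
public static class RuntimeObstacleSetup
{
    public static void MakeObstacle(GameObject placed)
    {
        var obstacle = placed.AddComponent<NavMeshObstacle>();
        obstacle.shape = NavMeshObstacleShape.Box;
        obstacle.carving = true;             // cut a hole in the baked navmesh
        obstacle.carveOnlyStationary = true; // re-carve only once it stops moving
    }
}
```

You would call `MakeObstacle` on each object the player places; with carving on, the navmesh itself is cut away under the obstacle, so agents plan around it rather than trying (and failing) to steer through it.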
Carving only removes navmesh, though; if the player adds stairs, that won’t work. For that you need to forget Unity’s navmesh and use a third-party solution like Apex Path or Aron Granberg’s A* Pathfinding Project.