Paving The Way To A New Future Of Unity

After watching the video (link below) and understanding the basics of AI pathfinding and colliders, I was wondering whether Unity could be used to build the first Unity Robot, using Arduino components and related scripts.

The basic operation of the Unity Robot would be as follows:

  1. The Robot would be assembled using Arduino components.

  2. The next step would be to incorporate the build process into the Unity Engine. The sketch that operates the Robot would first be written in Arduino, and a bridge would be built between the scripting and sketching processes of the two systems. The bridge would read information from each system and convert it to the other's format, much the way veins return deoxygenated blood to the lungs, where oxygen is added before the blood is routed back through the body's arteries. The 'lungs' of this system would convert each data stream into the other's code: the end user would operate the Robot using Unity scripts, which the lungs would convert into Arduino sketches to drive the Robot; after that data has been sent to the Robot, the return data stream would be converted back from Arduino sketches into Unity scripting.

  3. The Unity architecture could then be used to place colliders on certain components of the vehicle, such as the wheels. With Unity scripting applied to properties such as the Robot's mass and velocity, a protocol could be generated whereby the Robot automatically adjusts its velocity to compensate for traveling over rough ground rather than smooth ground, decreasing or increasing speed to compensate for pull and drag.

  4. AI pathfinding would program the Robot's initial direction of travel, forward or reverse, with the end user controlling the left and right direction from their console, much as a truck model can be turned left or right and driven forward or in reverse using assigned keyboard keys or mouse clicks. The Robot's velocity would be controlled by the end user in the same manner, with a graphical representation of the layout built on the end user's Unity screen using a basic pyramid shape. When building the environment on the Unity screen, cameras designed specifically to determine the height and distance of an object from a central location would create objects on screen. Any object measuring more than 1/4 of the wheels' overall diameter would direct the Robot to avoid it by driving around it: the object would be fed the Sphere Collider script so that the Robot's AI could turn the wheels, based on the collider's overall diameter, in time to keep the Robot from striking the object at max velocity and possibly flipping over. If the object were small enough, less than 1/4 of the wheel height relative to the Robot's mass and the velocity needed to overcome it, the Robot would continue on its course and drive over the object.
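
Steps 3 and 4 could be sketched as a small decision routine. The Python sketch below is purely illustrative (neither Unity C# nor an Arduino sketch): the 1/4-wheel-diameter rule comes from the description above, while the parameter names and the proportional slow-down factor for rough ground are assumptions.

```python
# Hypothetical sketch of the terrain-compensation (step 3) and
# obstacle-avoidance (step 4) rules. The 50% max slow-down and all
# parameter names are illustrative assumptions.

def adjust_velocity(base_velocity, roughness):
    """Scale velocity down on rough ground.

    roughness ranges from 0.0 (smooth) to 1.0 (very rough)."""
    roughness = max(0.0, min(1.0, roughness))
    return base_velocity * (1.0 - 0.5 * roughness)  # assumed slow-down curve

def obstacle_action(obstacle_height, wheel_diameter):
    """Step 4's rule: avoid objects taller than 1/4 of the wheel
    diameter, drive over anything smaller."""
    if obstacle_height > wheel_diameter / 4.0:
        return "avoid"      # steer around, guided by the sphere collider
    return "drive_over"     # small enough to roll over

print(obstacle_action(obstacle_height=3.0, wheel_diameter=10.0))  # avoid
print(obstacle_action(obstacle_height=2.0, wheel_diameter=10.0))  # drive_over
```

In a real build, `roughness` might come from accelerometer readings and `obstacle_height` from the camera measurements described in step 4.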

If the above scripting and sketching methods can be linked together using the Lung Bridge method, then Unity would be not only a game development engine but also an engine for creating robots that the end user could control from their very own laptop or PC using WiFi technology.
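
In practice the Lung Bridge would amount to a message translator over a serial or WiFi link. The Python sketch below illustrates just the conversion idea: a Unity-side command is packed into compact bytes for the Arduino, and the Arduino's telemetry reply is unpacked back. The framing, field names, and value ranges are invented for illustration; a real bridge would also need the actual transport (e.g. a serial port over USB or a WiFi socket).

```python
# Hypothetical 'lung bridge' message translator. The 3-byte command layout
# (direction, steering, velocity) and the telemetry layout (velocity,
# obstacle distance) are illustrative assumptions, not a real protocol.
import struct

def encode_command(direction, steer, velocity):
    """Pack a Unity-side command into 3 bytes for the Arduino:
    direction (0=forward, 1=reverse), steer (-100..100), velocity (0..255)."""
    return struct.pack("<BbB", 0 if direction == "forward" else 1,
                       steer, velocity)

def decode_telemetry(payload):
    """Unpack the Arduino's reply: measured velocity (0..255) and
    obstacle distance in centimeters (signed 16-bit)."""
    velocity, distance = struct.unpack("<Bh", payload)
    return {"velocity": velocity, "distance_cm": distance}

msg = encode_command("forward", steer=-20, velocity=128)
print(msg.hex())  # 3 bytes on the wire
print(decode_telemetry(struct.pack("<Bh", 127, 350)))
```

The same translation would run in reverse on the Arduino side, which is what makes the "lungs" a two-way conversion rather than one-way control.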

I think it would be a fun and interesting project, certainly a fun exercise.

But from a practical standpoint, adding Unity into the mix isn’t really necessary if the goal is just pathfinding. You can already do that on the Arduino (and other platforms) without having to send data to an external computer. Unity wouldn’t add any value to the process; it would just add an extra step. Usually the goal of adding AI to a robot is to have it self-contained/autonomous. Controlling robots from laptops or PCs isn’t the future, it’s the past. :wink:

If you are processing this type of information externally, there are many, many existing solutions/programs that would be much better choices than Unity. This type of stuff has been around for a while. In fact, the most common pathfinding algorithms used in games are based on ones developed about 40 years ago for robots in the first place.
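
For context, the classic example of that lineage is A*, which came out of robotics research in the late 1960s and is still the standard grid-pathfinding algorithm in games. A minimal version looks like this; the grid and coordinates are made up for illustration:

```python
# Minimal A* grid pathfinding, the algorithm family referred to above.
# 4-connected grid, Manhattan-distance heuristic; grid cells with 1 are
# obstacles. Grid contents here are illustrative.
import heapq

def astar(grid, start, goal):
    """Return a shortest path from start to goal as a list of (row, col)
    cells, or None if the goal is unreachable."""
    rows, cols = len(grid), len(grid[0])
    def h(p):  # Manhattan-distance heuristic (admissible on a 4-grid)
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_heap = [(h(start), 0, start, [start])]
    best_g = {}
    while open_heap:
        f, g, pos, path = heapq.heappop(open_heap)
        if pos == goal:
            return path
        if pos in best_g and best_g[pos] <= g:
            continue  # already expanded with an equal or better cost
        best_g[pos] = g
        r, c = pos
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                heapq.heappush(open_heap, (g + 1 + h((nr, nc)), g + 1,
                                           (nr, nc), path + [(nr, nc)]))
    return None

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(astar(grid, (0, 0), (2, 0)))  # routes around the wall of 1s
```

An Arduino-class board can run exactly this kind of search on a small occupancy grid, which is the point above: the pathfinding itself doesn't need Unity.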

That being said, there is still fun to be had mashing up the tech. A more unique (and appropriate) use for mixing Unity in would be to create a visualization, using Unity to generate a 3D representation of a room or space. Like a larger-scale AR app/game. Maybe have a robot that sends the mapping data/video it generates back to a Unity game, and do some motion tracking to add virtual elements to a real space.