After watching the video linked below and understanding the basics of AI pathfinding and colliders, I was wondering whether Unity could be used to build the first Unity Robot, using Arduino components and related scripts.
The basic function of the Unity Robot would be as follows:
- The Robot would be assembled using Arduino components.
- The next step would be to incorporate the build process into the Unity Engine. The sketch that operates the Robot would first be designed in Arduino, and a bridge would then be built between the scripting and sketching processes of the two systems. The bridge would read information from each system and convert it to the other's format, much the way a vein returns deoxygenated blood to the lungs, where oxygen is added before the blood is routed back through the body's arteries. The 'lungs' of this system would convert each data stream into the other's code: the end user would operate the Robot using Unity scripts, which the lungs would convert into Arduino sketches to drive the Robot; after the data has been sent to the Robot, the return data stream would be converted from Arduino sketches back into Unity scripting.
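The 'lung bridge' idea above could be sketched as a small translation layer. The frame layout, command names, and units below are all my own assumptions for illustration, not an existing Unity or Arduino API; the point is just to show one data stream being converted to the other's format and back.

```python
# Hypothetical "lung bridge": packs Unity-style drive commands into the
# kind of compact byte frame an Arduino sketch could read off serial,
# and parses the Arduino's telemetry frames back into a form a Unity
# script could consume. Frame layout and field names are assumptions.

import struct

CMD_DRIVE = 0x01   # Unity -> Arduino: drive command (assumed id)
TEL_STATE = 0x81   # Arduino -> Unity: telemetry report (assumed id)

def encode_drive(velocity_mm_s: int, steer_deci_deg: int) -> bytes:
    """Frame: start byte 0x7E, command id, int16 velocity (mm/s),
    int16 steering angle (tenths of a degree), XOR checksum."""
    payload = struct.pack("<Bhh", CMD_DRIVE, velocity_mm_s, steer_deci_deg)
    checksum = 0
    for b in payload:
        checksum ^= b
    return b"\x7e" + payload + bytes([checksum])

def decode_frame(frame: bytes) -> dict:
    """Reverse direction: unpack a frame of the same shape into a dict
    that a Unity-side script could act on."""
    if frame[0] != 0x7E:
        raise ValueError("bad start byte")
    payload, checksum = frame[1:-1], frame[-1]
    calc = 0
    for b in payload:
        calc ^= b
    if calc != checksum:
        raise ValueError("checksum mismatch")
    cmd, vel, heading = struct.unpack("<Bhh", payload)
    return {"cmd": cmd, "velocity_mm_s": vel, "heading_deci_deg": heading}
```

In a real build the bytes would travel over a serial port or WiFi socket rather than being handed straight back, but the round trip shows the conversion the 'lungs' would perform in each direction.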
- The Unity architecture could then be used to place colliders on certain components of the vehicle, such as the wheels. With Unity scripting applied to properties such as the Robot's mass and velocity, a protocol could be generated in which the Robot automatically adjusts its velocity to compensate for traveling over rough ground versus smooth ground, increasing or decreasing speed to offset pull and drag.
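A minimal sketch of that compensation protocol might look like the function below. The formula and coefficients are illustrative assumptions only, not a physical model or anything Unity provides; they just show velocity being adjusted from mass, surface roughness, and drag as described.

```python
# Assumed compensation rule: slow down proportionally on rougher
# ground, then add a small correction for drag scaled by mass,
# without ever exceeding the commanded target velocity.

def adjusted_velocity(target_mm_s: float, roughness: float,
                      drag_n: float, mass_kg: float) -> float:
    """roughness: 0.0 (smooth) .. 1.0 (very rough); drag_n in newtons."""
    # Rough ground halves the target at worst (roughness == 1.0).
    v = target_mm_s * (1.0 - 0.5 * roughness)
    # Boost slightly to overcome drag; heavier robots get
    # proportionally less correction (coefficient is arbitrary).
    v += (drag_n / mass_kg) * 10.0
    return min(v, target_mm_s)  # never exceed the commanded target
```

In Unity terms, `roughness` and `drag_n` would come from the collider contacts under the wheels, and the result would be fed back through the bridge as the new drive command.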
- AI pathfinding would be used to program the Robot's initial direction of travel, forward or reverse, while the end user controls the left and right direction from their console, much the way a truck model can be turned left or right and driven forward or in reverse using assigned keyboard keys or mouse clicks. The Robot's velocity would be controlled by the end user in the same manner, with a graphical representation of the layout built on the end user's Unity screen using a basic pyramid shape. When building the environment on the Unity screen, cameras designed specifically to determine the height and distance of an object from a central location would be programmed to create objects on-screen. Any object measuring more than 1/4 of the wheels' overall diameter would direct the Robot to avoid it by driving around it: the object would be fed the Sphere Collider script so that the Robot's AI could turn the wheels, based on the Sphere Collider's overall diameter, in time to keep the Robot from striking the object at maximum velocity, which could cause it to flip over. If the object was small enough, less than 1/4 of the wheel height relative to the Robot's mass and the velocity needed to overcome it, the Robot would continue on course and drive over the object.
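The avoid-or-drive-over rule above can be sketched directly. The quarter-of-wheel-diameter threshold comes from the description; the turning-distance check is an assumed, simplified kinematic test (not Unity's physics) for whether the wheels can be turned in time at the current velocity.

```python
# Decision rule from the description: obstacles taller than a quarter
# of the wheel diameter are avoided; smaller ones are driven over.
# The "in time" check compares the distance covered while steering
# against the distance to the obstacle (assumed model).

def plan_for_obstacle(obstacle_height: float, wheel_diameter: float,
                      distance_to_obstacle: float, velocity: float,
                      turn_time_s: float) -> str:
    """All lengths in the same unit; velocity in units per second."""
    if obstacle_height <= wheel_diameter / 4.0:
        return "drive_over"          # small enough to roll across
    # Distance the Robot covers while the wheels swing to a new heading.
    turning_distance = velocity * turn_time_s
    if turning_distance < distance_to_obstacle:
        return "steer_around"        # room to avoid at current speed
    return "slow_and_steer"          # must shed velocity first
```

Here the camera system would supply `obstacle_height` and `distance_to_obstacle`, and the Sphere Collider's diameter would effectively pad `distance_to_obstacle` so the turn clears the object rather than grazing it.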
If the above scripting and sketching methods can be linked together using the lung bridge method, then Unity would be not only a game development engine but also an engine for creating robots that end users could control from their own laptops or PCs using WiFi technology.