With the new Visualizations package, our existing ROS connection package can now display visualizations of most common ROS message types. Use the predefined visualizers or write your own: draw 3D lines, shapes, labels, point clouds, and meshes, or 2D GUI elements such as text and images, however you see fit.
Sure! If your lidar is sending ROS LaserScan messages, it should Just Work™.
If you’re not using ROS, you can write your own code to instantiate a PointCloudDrawing yourself and draw points on it manually; just place the Drawing3dManager prefab in your scene and write something like:
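(A minimal sketch: PointCloudDrawing.Create, Clear, and AddPoint are from the Visualizations package, but double-check the exact signatures and namespace against your package version; the random point data here is just a placeholder.)

```csharp
using UnityEngine;
using Unity.Robotics.Visualizations; // namespace assumed from the Visualizations package

public class ManualPointCloud : MonoBehaviour
{
    PointCloudDrawing m_Cloud;

    void Start()
    {
        // Creates a drawing that the Drawing3dManager in the scene will render
        m_Cloud = PointCloudDrawing.Create();
    }

    void Update()
    {
        // Reuse the same drawing each frame: Clear() it, then add fresh points
        m_Cloud.Clear();
        for (int i = 0; i < 100; ++i)
        {
            // Placeholder data; substitute your sensor's points here
            Vector3 point = Random.insideUnitSphere;
            m_Cloud.AddPoint(point, Color.white, 0.01f); // point, color, radius
        }
    }
}
```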
For efficiency, rather than making a new PointCloudDrawing each time, keep a reference to the same one and call Clear() before you redraw it, as in the Update loop above.
I want to map the keyboard inputs to the Oculus Quest controllers for the niryo_one robot. Has anyone worked on a similar project? I could use some assistance with mapping the controls.
I was hoping I could slap the DefaultVisualizationSuite from the Visualizations package into the Pick and Place example with the Niryo One robot and get some basic trajectory visualization out of the box. But all I get are errors. Should that work, or are there any modifications I need to make in order to use the DefaultVisualizationSuite in Pick and Place?