Difference between revisions of "Unmanned Ground Vehicles"

Unmanned Ground Vehicles (UGVs) are ground-based robots that operate without any humans onboard. They can be remote-controlled or autonomous.

==Remote-Controlled UGVs==

==Autonomous UGVs==

UGVs can occupy a broad spectrum of [[:Category:Autonomous|autonomous]] capability.

===Open Loop/Sensorless Robots===

This is the most basic level of autonomy: the robot blindly repeats a set of pre-programmed motions without any sensor feedback. Without sensors, the UGV doesn’t know where it is and can’t detect obstacles around itself. UGVs using this approach aren’t very useful, so open loop autonomy is not common for this class of robots. Industrial robot arms can be effective with open loop operation in applications where they only need to perform the same repetitive motion over and over. However, this still requires lots of calibration and tuning, and the environment must be maintained in the state that the robot expects.

When we sell programmable robots without a remote control method, we often use an open loop approach and program them to cycle through a set of preset movements, e.g. drive forward, drive backward, turn left, turn right. The robots in the following video are examples of this.
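
The preset-movement idea can be sketched in a few lines of Python. This is a hypothetical illustration, not code from any of our robots: the names `PRESET_MOVES` and `run_open_loop_cycle` are invented, and commands are only logged rather than sent to real motors.

```python
# Open-loop control: a fixed move list is replayed with no sensor feedback.
# Each step is (left_speed, right_speed, duration_s); speeds are in -1.0..1.0.
PRESET_MOVES = [
    (0.5, 0.5, 2.0),    # drive forward
    (-0.5, -0.5, 2.0),  # drive backward
    (-0.3, 0.3, 1.0),   # turn left in place
    (0.3, -0.3, 1.0),   # turn right in place
]

def run_open_loop_cycle(moves, cycles=1):
    """Replay the move list `cycles` times and return the command log.

    On a real robot each command would be sent to the motor driver and
    held for its duration; here the commands are only recorded.
    """
    log = []
    for _ in range(cycles):
        log.extend(moves)
    log.append((0.0, 0.0, 0.0))  # final stop command
    return log
```

Because nothing is measured, any wheel slip or timing drift accumulates silently, which is why this approach needs a controlled, repeatable environment.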
  
{{#evt:
service=youtube
|id=https://www.youtube.com/watch?v=Rt2Z3Sxtdn4
|alignment=center
|dimensions=600
}}
  
===Line Following===
  
[[Line Following]] is a method of autonomous movement in which the UGV follows a line on the ground. Common approaches for detecting the line include optical sensors which detect the color of the line or magnetic sensors (<sdr item id=2295>MGS1600</sdr item>) which detect magnetic tape. The line layout can simply go from point A to B or scale up to a complex network of forks and waypoints. When manually driven, the robot may move freely without the line. This method is far easier and cheaper to develop than allowing the robot to freely roam in the space. The trade-offs are that the line/tape must be placed beforehand (and maintained) and the robot’s autonomous movement is restricted to the lines.
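
The optical-sensor approach can be illustrated with a simple proportional controller. This is a generic sketch, not the actual interface of any sensor mentioned here: it assumes a hypothetical array of reflectance readings laid across the line, with larger values meaning more line under that cell.

```python
def steer_from_line(sensor_readings, base_speed=0.4, kp=0.5):
    """One proportional line-following step.

    sensor_readings: reflectance values from a sensor array laid across
    the line, ordered left to right; larger values mean more line under
    that cell. Returns (left_speed, right_speed) wheel commands.
    """
    total = sum(sensor_readings)
    if total == 0:
        return (0.0, 0.0)  # line lost: stop (a real robot might search for it)
    n = len(sensor_readings)
    # Weighted centroid of the readings, normalized to [-1, +1]:
    # -1 = line at far left, 0 = centered, +1 = line at far right.
    centroid = sum(i * v for i, v in enumerate(sensor_readings)) / total
    error = (centroid - (n - 1) / 2) / ((n - 1) / 2)
    # Speed up the left wheel when the line drifts right (and vice versa),
    # steering the robot back over the line.
    return (base_speed + kp * error, base_speed - kp * error)
```

Running this in a fast loop keeps the sensor array centered over the line; forks and waypoints need extra logic layered on top of the basic controller.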
  
The robot in the video below is a line follower that uses the <sdr item id=2295>RoboteQ MGS1600GY Magnetic Guide Sensor</sdr item> to follow magnetic tape on the ground. The robot demonstrates stopping at waypoints and selecting routes at forks.
  
{{#evt:
service=youtube
|id=https://www.youtube.com/watch?v=rUxeWirkWc0
|alignment=center
|dimensions=600
}}
  
===Robots that don't use Mapping/SLAM===
  
The UGV can move freely but doesn’t maintain an obstacle map of its environment. The UGV may implement an [[Obstacle Detection|obstacle detection]] system and can use this information to prevent collisions. The lack of a map allows the use of less powerful computers and sensors but prevents the robot from being able to reliably navigate around obstacles or generate paths that avoid previously encountered obstacles. Navigation decisions are made with only the current information visible to the robot's sensors.
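
Map-free obstacle handling can be as simple as reacting to the latest range reading. A minimal sketch, assuming a single forward-facing range measurement in meters (the function name and thresholds are invented for illustration):

```python
def reactive_speed(forward_range_m, cruise=0.5, stop_dist=0.5, slow_dist=1.5):
    """Choose a forward speed from only the *current* range reading.

    Nothing is remembered between calls: no map, no history, just the
    latest measurement from a forward-facing range sensor.
    """
    if forward_range_m <= stop_dist:
        return 0.0  # obstacle too close: stop
    if forward_range_m < slow_dist:
        # Ramp the speed down linearly as the obstacle gets closer.
        return cruise * (forward_range_m - stop_dist) / (slow_dist - stop_dist)
    return cruise  # clear ahead: full cruise speed
```

Because no map is kept, the robot slows or stops for whatever is in front of it right now, but it cannot plan a route around an obstacle it saw a moment ago.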
  
If the UGV has a positioning system, it can travel to waypoints. The video below shows our <sdr item id=2880>Mini-IPS Robot</sdr item> in action. The robot uses the Marvelmind Indoor Positioning System and wheel encoders to position itself. The robot runs on ROS and the video shows it traveling between user-defined waypoints. The <sdr item id=2880>Mini-IPS Robot</sdr item> is also equipped with a 2D Lidar that it uses for obstacle detection.
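
Waypoint travel with a positioning system reduces to repeatedly steering toward the goal. The sketch below is a generic go-to-goal step, not the Mini-IPS Robot's actual ROS code; it assumes the positioning system supplies a pose of (x, y, heading) in meters and radians.

```python
import math

def waypoint_command(pose, waypoint, k_lin=0.5, k_ang=1.5, arrive_dist=0.1):
    """One control step toward a waypoint.

    pose: (x, y, heading_rad) from the positioning system.
    waypoint: (x, y) goal in the same frame.
    Returns ((linear_speed, angular_speed), arrived).
    """
    x, y, heading = pose
    dx, dy = waypoint[0] - x, waypoint[1] - y
    dist = math.hypot(dx, dy)
    if dist < arrive_dist:
        return (0.0, 0.0), True  # close enough: waypoint reached
    # Heading error to the goal, wrapped into [-pi, pi].
    err = math.atan2(dy, dx) - heading
    err = math.atan2(math.sin(err), math.cos(err))
    # Drive faster when far away, turn harder when pointed off-goal.
    return (k_lin * dist, k_ang * err), False
```

Chaining waypoints through this loop gives the point-to-point travel shown in the video; the obstacle detection system runs alongside it to stop for anything in the way.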
 
  
{{#evt:
service=youtube
|id=https://www.youtube.com/watch?v=seut6SvUD2M
|alignment=center
|dimensions=600
}}
  
===Robots that use Mapping/SLAM===
  
The UGV moves freely while generating and maintaining a map of obstacles encountered in its environment, usually with the SLAM algorithm. Use of mapping automatically equips the robot with powerful positioning and [[Obstacle Detection|obstacle detection]] systems. The [[Positioning System|positioning system]] enables real-time tracking of the robot's location and waypoint travel. The navigation system can use the map to plan a path to a waypoint that avoids any previously encountered obstacles. These features greatly enhance the capabilities of the robot.
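
The map-keeping idea can be sketched with a toy occupancy grid. This illustrates only the mapping bookkeeping, not SLAM itself (real SLAM also corrects the robot's pose estimate while building the map), and the dict-of-cells representation is invented for the example.

```python
def mark_ray(grid, robot_cell, hit_cell):
    """Record one range measurement in the occupancy grid.

    Cells the beam passed through are marked free; the cell where the
    beam hit is marked occupied. grid maps (col, row) -> 'free'|'occupied'.
    """
    (x0, y0), (x1, y1) = robot_cell, hit_cell
    steps = max(abs(x1 - x0), abs(y1 - y0))
    for i in range(steps):  # walk the beam, excluding the hit cell
        cx = x0 + round(i * (x1 - x0) / steps)
        cy = y0 + round(i * (y1 - y0) / steps)
        grid[(cx, cy)] = 'free'
    grid[hit_cell] = 'occupied'
    return grid

def is_blocked(grid, cell):
    """A path planner can consult the map to avoid known obstacles."""
    return grid.get(cell) == 'occupied'
```

Once cells are remembered between measurements, the planner can route around obstacles the robot is no longer looking at, which is exactly what a map-free robot cannot do.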
 
  
The <sdr item id=2420>Autonomous Agricultural Robot</sdr item> in the video below uses [[Lidar#2D Lidar|2D Lidars]] for 2D SLAM when indoors and a Zed depth camera for 3D SLAM when outdoors. The positioning system also fuses measurements from [[encoders]], an [[Inertial Measurement Unit|IMU]], and RTK [[GPS]]. The robot can travel to user-defined waypoints while avoiding both expected and unexpected obstacles along the way.
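
The fusion step can be illustrated in one dimension with inverse-variance weighting, the core idea behind the Kalman-style filters commonly used to combine, say, encoder and GPS position estimates. This is a generic sketch, not the robot's actual filter:

```python
def fuse(est_a, var_a, est_b, var_b):
    """Combine two estimates of the same quantity by inverse-variance
    weighting: the less noisy sensor gets the larger say, and the fused
    variance is smaller than either input's.
    Returns (fused_estimate, fused_variance).
    """
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    return fused, 1.0 / (w_a + w_b)
```

Fusing several sensors this way gives a position estimate that is more accurate than any single sensor, which is what lets the robot hold waypoints both indoors and outdoors.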
 
  
{{#evt:
service=youtube
|id=https://www.youtube.com/watch?v=CKYJO-Ha80E
|alignment=center
|dimensions=600
}}
[[Category:Autonomous]]

Latest revision as of 13:46, 13 April 2021
