Agricultural Robot


Autonomous Agricultural Robot

We developed a complete autonomy package for outdoor exploration. Its purpose is to autonomously take soil samples in crop fields and greenhouses. Each sample reports soil temperature, soil moisture, pH level, and carbon monoxide readings. The system navigates outdoor environments via waypoints while mapping its surroundings and avoiding obstacles.
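
To give a concrete picture of what each robot reports back, the sketch below models a single soil-sample record in Python. The field names, units, and example values are assumptions for illustration; the article only lists the quantities measured.

    # Hypothetical record for one soil sample, as a rover might log and report it.
    # Field names and units are assumptions; only the measured quantities come from the article.
    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class SoilSample:
        timestamp: datetime        # when the sample was taken (UTC)
        latitude: float            # RTK GNSS fix, decimal degrees
        longitude: float
        soil_temperature_c: float  # soil temperature, degrees Celsius
        soil_moisture_pct: float   # soil moisture, percent
        ph: float                  # soil pH
        co_ppm: float              # carbon monoxide reading, parts per million

    # Example record as it might be sent back to the base station
    sample = SoilSample(datetime.utcnow(), 51.4416, 5.4697, 14.2, 23.5, 6.8, 0.4)
    print(sample)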

The capabilities of this technology extend well beyond gathering soil data; variants of this system are equally suited to security and inspection scenarios. Below we break down the control system hardware and software so you can get a better idea of what’s happening behind the scenes.

[Image: Autonomous Agriculture Robots.jpg]


The Base Station

  • Long-range IP radio
  • Emlid Reach RTK GNSS base module

The base station acts as the communication hub for one or more autonomous platforms. Operators connect to it via Ethernet to command rovers, view gathered data, and receive status reports. This station in particular is outfitted with a long-range tactical radio and a real-time kinematic global navigation satellite system (RTK GNSS) module. In outdoor applications, this module plays an important role in providing accurate localization: configured as a base, it sends correction data to each robot to compensate for the inaccuracies of standalone GNSS. The result is centimeter-precision positioning, which the robots use to plan routes and place themselves in the world for the user to see.
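
The article does not specify how the corrections travel from the base module to the rovers, but conceptually the base station simply relays the raw correction stream over the IP radio link. The sketch below shows one way that relay could look; the addresses, ports, and choice of UDP are assumptions (Emlid Reach units can also stream corrections directly over TCP or NTRIP).

    # Minimal sketch of relaying RTK correction data (e.g. RTCM3) from the base
    # module to each rover over the IP radio link. Hosts, ports, and the use of
    # UDP are illustrative assumptions, not the documented setup.
    import socket

    BASE_CORRECTION_ADDR = ("192.168.1.10", 9000)                    # assumed correction output of the Reach base
    ROVER_ADDRS = [("192.168.1.21", 9000), ("192.168.1.22", 9000)]   # assumed rover endpoints on the radio network

    def relay_corrections():
        base = socket.create_connection(BASE_CORRECTION_ADDR)        # TCP stream of raw correction bytes
        udp = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        while True:
            chunk = base.recv(1024)
            if not chunk:
                break                                                # base closed the stream
            for addr in ROVER_ADDRS:
                udp.sendto(chunk, addr)                              # forward unchanged to every rover

    if __name__ == "__main__":
        relay_corrections()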

[Image: Agriculture Base Station.jpg]

Autonomous Robot(s)

  • Zed 3D stereo camera
  • 2 long-range Hokuyo LIDARs
  • Emlid Reach RTK GNSS rover module
  • YEI 3-Space 9 DOF IMU
  • Wheel encoders
  • NVIDIA Jetson TK1 embedded Linux computer
  • Robot Operating System

Traveling from point A to point B may sound simple. However, there are many cases to consider when traveling autonomously outdoors, and making a mobile platform aware of its surroundings alone takes serious computational power. The NVIDIA Jetson TK1 acts as the brains of this machine by fusing sensor data, calculating the robot's position within an obstacle map, and planning the fastest and safest route through the operator's waypoints. All of this was developed on the Robot Operating System (ROS) platform using both stock and custom nodes.
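
As a concrete example of how a waypoint might be dispatched on the software side, the sketch below sends a single goal to the ROS navigation stack. It assumes the stock move_base node is the planner in use and that the operator's GNSS waypoints have already been transformed into the local "map" frame; the article confirms ROS but not these specifics.

    #!/usr/bin/env python
    # Sketch of sending one operator waypoint to the ROS navigation stack.
    # Assumes the stock move_base planner and a goal already expressed in the map frame.
    import rospy
    import actionlib
    from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

    def send_waypoint(x, y):
        client = actionlib.SimpleActionClient("move_base", MoveBaseAction)
        client.wait_for_server()                   # wait until the planner is up

        goal = MoveBaseGoal()
        goal.target_pose.header.frame_id = "map"   # goal expressed in the map frame
        goal.target_pose.header.stamp = rospy.Time.now()
        goal.target_pose.pose.position.x = x
        goal.target_pose.pose.position.y = y
        goal.target_pose.pose.orientation.w = 1.0  # identity orientation; final heading left to the planner

        client.send_goal(goal)
        client.wait_for_result()                   # block until the rover reaches the waypoint or aborts
        return client.get_state()

    if __name__ == "__main__":
        rospy.init_node("waypoint_sender")
        send_waypoint(5.0, 3.0)                    # e.g. 5 m forward, 3 m left of the map origin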