
Under canopy light detection and ranging‐based autonomous navigation


This paper describes a light detection and ranging (LiDAR)‐based autonomous navigation system for an ultralightweight ground robot in agricultural fields. The system is designed for reliable navigation under cluttered canopies using a 2D Hokuyo UTM‐30LX LiDAR as its sole perception sensor. Its purpose is to ensure that the robot can navigate through rows of crops without damaging the plants in narrow, row‐based, high‐leaf‐cover semistructured crop plantations, such as corn (Zea mays) and sorghum (Sorghum bicolor). The key contribution of our work is a LiDAR‐based navigation algorithm capable of rejecting outlying measurements in the point cloud caused by plants in adjacent rows, low‐hanging leaf cover, or weeds. The algorithm addresses this challenge with a set of heuristics designed to filter out outlying measurements in a computationally efficient manner, after which linear least squares is applied to the filtered data to estimate the within‐row distance. A further crucial step is estimate validation, achieved through a heuristic that grades and validates the fitted row lines based on current and previous information. The proposed LiDAR‐based perception subsystem has been extensively tested in production and breeding corn and sorghum fields. Across this variety of highly cluttered real field environments, the robot logged more than 6 km of autonomous operation in straight rows. These results demonstrate highly promising advances in LiDAR‐based navigation in realistic field environments for small under‐canopy robots.
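The filter‐then‐fit idea described above can be sketched in a few lines. The snippet below is a minimal illustration under assumed details, not the authors' implementation: the synthetic scan, the fixed lateral gate threshold, and the function names `filter_row_points` and `fit_row_line` are all assumptions made for the sake of the example.

```python
import numpy as np

def filter_row_points(points, expected_offset, gate=0.15):
    """Keep only points whose lateral (x) distance lies within `gate`
    metres of the expected row-wall offset; this heuristically rejects
    returns from adjacent rows, low-hanging leaves, and weeds."""
    mask = np.abs(points[:, 0] - expected_offset) < gate
    return points[mask]

def fit_row_line(points):
    """Fit x = m*y + b by linear least squares, with the row running
    along y. Returns (m, b): slope and lateral offset of the row line;
    b is an estimate of the within-row distance to that wall."""
    y, x = points[:, 1], points[:, 0]
    A = np.column_stack([y, np.ones_like(y)])
    m, b = np.linalg.lstsq(A, x, rcond=None)[0]
    return m, b

# Synthetic 2D scan: right row wall near x = 0.35 m, plus clutter
# from an adjacent row / leaf cover near x = 0.9 m.
rng = np.random.default_rng(0)
y = np.linspace(0.0, 2.0, 50)
wall = np.column_stack([0.35 + 0.01 * rng.standard_normal(50), y])
clutter = np.column_stack([0.9 + 0.2 * rng.standard_normal(10),
                           rng.uniform(0.0, 2.0, 10)])
scan = np.vstack([wall, clutter])

inliers = filter_row_points(scan, expected_offset=0.35)
m, b = fit_row_line(inliers)
# b recovers the ~0.35 m lateral distance despite the clutter.
```

A validation stage like the one in the paper would then grade the fitted (m, b) against the previous scan's line before accepting it; that step is omitted here for brevity.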

