Down on the farm with C-3PO
Researchers at the University of Illinois have come up with a robot that can methodically wander fields and beam back stunningly perceptive crop reports.
Weaving its way through a farm on a track system, the super-perceptive farmhand grabs detailed information on each and every plant it passes.
All told, the device gathers enough information on each plant—including stem diameter, plant height, and leaf size—to generate a three-dimensional (3D) image of it.
The result: the farmer gets a lush visualization of the crop, as well as a treasure trove of insight into which plants are faring best.
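To give a sense of what such a per-plant record might look like, here is a minimal Python sketch; the field names, class, and numbers below are illustrative assumptions rather than the team's actual data format.

```python
# Minimal sketch (assumed, not the project's code): logging per-plant traits such as
# stem diameter, plant height, and leaf size, then summarizing them for a plot.
from dataclasses import dataclass
from statistics import mean

@dataclass
class PlantRecord:
    plant_id: str           # hypothetical identifier, e.g., row and position
    stem_diameter_mm: float
    height_cm: float
    leaf_area_cm2: float

def summarize(records: list[PlantRecord]) -> dict:
    """Aggregate a few simple traits across a plot; real phenotyping data are far richer."""
    return {
        "plants": len(records),
        "mean_height_cm": mean(r.height_cm for r in records),
        "mean_stem_diameter_mm": mean(r.stem_diameter_mm for r in records),
        "mean_leaf_area_cm2": mean(r.leaf_area_cm2 for r in records),
    }

# Illustrative values only.
plot = [
    PlantRecord("row3-017", stem_diameter_mm=21.4, height_cm=182.0, leaf_area_cm2=610.0),
    PlantRecord("row3-018", stem_diameter_mm=19.8, height_cm=175.5, leaf_area_cm2=580.0),
]
print(summarize(plot))
```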
The university's goal is to bring the agricultural robot to market within three years, according to Girish Chowdhary, a lead researcher on the project and an assistant professor in the Department of Agricultural and Biological Engineering at the University of Illinois at Urbana-Champaign.
"We are targeting a cost to the breeder of $5,000 to $10,000, which means we will have to get the manufacturing cost significantly below that," Chowdhary says. "An agricultural robot that costs just $5,000 is a totally new concept."
Plant breeders are expected to be among the earliest adopters of the device, given that the robot enables them to continuously monitor the growth of the new plant breeds they develop and to discern much more quickly which in-development breeds are worthy of further refinement.
"One of the big advances of the last few years is that we can now determine the complete DNA blueprint of each plant," says Stephen P. Long, project director and Gutgsell Endowed Professor in the departments of Plant Biology and Crop Sciences at the University of Illinois. "But how do we use that?
"What we need is to be able to describe a plant as it grows. You could do that perhaps with an army of people, but now the robot can do all of that for you. For producers, it's going to accelerate the rate at which we can improve the genetic material."
Vijaya Gopal Kakani, an associate professor of crops, energy, and climate in the Department of Plant and Soil Sciences at Oklahoma State University, describes Chowdhary's robot as an inexpensive way to "significantly enhance a breeder's ability to select genotypes" for quick development.
Adds Joe Cornelius, a program director for the Advanced Research Projects Agency–Energy (ARPA-E) of the U.S. Department of Energy, "Traditionally, crop breeders measure crop height with a meter stick. The robotic sensor and computational technologies under development by Girish's team make breeding several orders of magnitude faster, more accurate, and insightful."
ARPA-E's Transportation Energy Resources from Renewable Agriculture program funded Chowdhary's project with a $3.1-million grant.
For sensors on the robot, Chowdhary is using Resonon hyperspectral imagers, along with a FLIR T1030sc thermal imager and a Canon EOS 7D Mark II camera, to bring home all the data needed to generate the 3D imagery. Meanwhile, GPS navigation is aided by a Septentrio Altus APS-NR2 smart antenna and a Digilent Pmod GYRO three-axis digital gyroscope built around an STMicroelectronics L3G4200D low-power angular rate sensor.
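To illustrate how a GPS heading and a gyroscope's yaw rate can be combined for navigation, here is a simple complementary-filter sketch in Python; the function, parameter values, and update rate are assumptions for illustration, not the team's navigation code.

```python
# Minimal sketch (assumed): blending a GPS-derived heading with a gyro yaw rate.
# The gyro tracks quick turns; the GPS bearing corrects slow gyro drift.
def fuse_heading(gps_heading_deg: float,
                 gyro_yaw_rate_dps: float,
                 prev_heading_deg: float,
                 dt_s: float,
                 alpha: float = 0.98) -> float:
    """Complementary filter: mostly trust the integrated gyro, nudge toward GPS."""
    gyro_heading = prev_heading_deg + gyro_yaw_rate_dps * dt_s
    # Wrap the GPS-minus-gyro difference into [-180, 180) before blending.
    err = (gps_heading_deg - gyro_heading + 180.0) % 360.0 - 180.0
    return (gyro_heading + (1.0 - alpha) * err) % 360.0

heading = 90.0  # start facing east (illustrative)
for _ in range(10):  # ten 0.1-second updates with a noisy GPS bearing of 92 degrees
    heading = fuse_heading(gps_heading_deg=92.0, gyro_yaw_rate_dps=0.5,
                           prev_heading_deg=heading, dt_s=0.1)
print(round(heading, 2))
```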
"If the GPS signal is partially and completely blocked by plants, LiDAR sensors can detect the edges of rows so that farming equipment can continue moving until GPS signal can be reestablished," Chowdhary says.
Interestingly, the computer hardware needed for the robot is relatively minimal: just a Raspberry Pi single-board computer and an Intel Next Unit of Computing (NUC) mini PC. Software to process all the data and render images of the crops in 3D—under development at software firm Signetron, based in Berkeley, CA—runs on Linux and is being written using software development kits provided by the sensor manufacturers.
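Signetron's pipeline has not been published, but a toy Python example suggests the flavor of one downstream step, estimating plant height from a reconstructed 3D point cloud; the function, percentile choice, and point values are illustrative assumptions.

```python
# Minimal sketch (assumed): derive a simple trait, plant height, from a 3D point cloud
# by subtracting an estimated ground level from the top of the canopy.
def plant_height_m(points_xyz: list[tuple[float, float, float]],
                   ground_percentile: float = 0.05) -> float:
    zs = sorted(z for _, _, z in points_xyz)
    ground = zs[int(ground_percentile * (len(zs) - 1))]  # rough ground estimate from low returns
    return max(zs) - ground

# Toy cloud: a few ground returns near z = 0 and canopy points up to about 1.8 m.
cloud = [(0.00, 0.00, 0.01), (0.10, 0.00, 0.02), (0.00, 0.10, 0.00),
         (0.05, 0.02, 0.90), (0.06, 0.01, 1.40), (0.04, 0.03, 1.78)]
print(round(plant_height_m(cloud), 2))
```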
Plus, Chowdhary is getting help adding genetic analytics to the robot from researchers at Cornell University.
Similar robotics development is underway at the University of Missouri, where researchers are working on a sensor-laden robotic vehicle—coupled with a mobile sensor tower—to collect data on plant growth. Guilherme DeSouza, an associate professor of electrical engineering and computer science at the University of Missouri, is experimenting with the system to detect crop stress brought on by heat, drought, and other environmental factors.
While crop surveying might seem easier to accomplish with aerial drones, Chowdhary says ground-based robots work better for this particular application, given that the sensors need to capture data under the crop canopy, which aerial drones cannot reach.
"Drones also cannot carry large payloads such as heavy hyperspectral cameras and LIDAR sensors, like our robot," Chowdhary says.
In the long term, Chowdhary and his team are also interested in customizing their agricultural robot for growers who require extremely detailed 3D crop visualizations.
The technology "can benefit growers looking to identify weak spots in the field for management decisions," says Oklahoma State's Kakani. "In addition to row crops, the robot would have significant use in high-value crops such as fruit and nut orchards and vineyards."
Adds Cornelius, "We are witnessing the convergence of biology, engineering and computational science that transforms agriculture and enables a second green revolution."