2 Mechanical Engineering, Purdue University, West Lafayette, USA
ORCiD: 0000-0003-4060-6499
Keywords: Phenotyping, Hyperspectral Imaging, Proximal Sensing,
Robotics, Drone, Machine Vision, Soybean, Robotic Arm.
In soybean phenotyping, hyperspectral imaging via proximal sensing offers
a higher signal-to-noise ratio and finer spatial resolution than remote
sensing. However, it has not been adopted for large-scale field
applications because of its low throughput and high labor cost.
Additionally, no automation solution
has been developed to collect in vivo proximal hyperspectral images of
dicot plants. In this study, a novel drone-based robotic system was
developed to automate the collection of in vivo proximal hyperspectral
images in the field. The system consists of a machine vision system to
detect and estimate the pose of soybean leaflets, an articulated robotic
arm with specialized control and path planning algorithms to operate
proximal sensors and grasp the leaflets, and a heavy-duty customized
drone to provide mobility in the field. For each sampling location, the
system flies to the location, lands on top of the canopy, approaches the
plant with the proximal sensor, and collects data. An experiment was
conducted in October 2022 at the Agronomy Center of Research and
Education at Purdue University. The designed system collected 90 samples
with centimeter-level landing accuracy, a detection accuracy above
75%, and an operation success rate above 85% (preliminary results).
The system provides a novel approach to adopting in vivo proximal
hyperspectral imaging on a large scale. The results of this study
demonstrate the potential to identify nutrient deficiencies, diseases,
and chemical damage in the field at an earlier stage to prevent yield loss, improve
food quality, and accelerate the development cycle of agricultural
products.
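The per-location sampling sequence described above (fly to the location, land on the canopy, detect a leaflet, plan a path, grasp, and image) could be outlined, purely as an illustrative sketch, as follows. All names here (`Leaflet`, `sample_location`, the 75% confidence threshold used as a detection cutoff) are hypothetical and not part of the actual system's software:

```python
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class Leaflet:
    """A leaflet detection from the machine vision system (illustrative)."""
    position: Tuple[float, float, float]  # estimated 3-D position (x, y, z), meters
    confidence: float                     # detection confidence in [0, 1]


def sample_location(leaflets: List[Leaflet],
                    min_confidence: float = 0.75) -> List[str]:
    """Simulate one sampling stop: land, select the most confident
    detected leaflet, plan an arm path to it, grasp, and collect a
    hyperspectral image. Returns an ordered action log."""
    log = ["fly_to_location", "land_on_canopy"]

    # Keep only detections the vision system is confident about.
    candidates = [lf for lf in leaflets if lf.confidence >= min_confidence]
    if not candidates:
        log.append("abort_no_leaflet")
        return log

    # Target the highest-confidence leaflet for grasping.
    target = max(candidates, key=lambda lf: lf.confidence)
    log.append(f"plan_path_to({target.position})")
    log.append("grasp_leaflet")
    log.append("collect_hyperspectral_image")
    return log
```

In this sketch, a location with no sufficiently confident detection is skipped rather than grasped, mirroring the idea that the arm only operates once the vision system has estimated a leaflet pose.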