Individual maize extraction from UAS imagery-based point clouds by 3D deep learning

04/16/2021
  • Herrero-Huerta, M., Tolley, S., Tuinstra, M. R., and Yang, Y. (2021). 'Individual maize extraction from UAS imagery-based point clouds by 3D deep learning'. In Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping VI (Vol. 11747, p. 1174704). International Society for Optics and Photonics.

Automated and cost-effective phenotyping pipelines are needed to efficiently characterize new lines and hybrids developed in plant breeding programs.

In this study, we employ deep neural networks (DNNs), specifically the PointNet network, to model individual maize plants using 3D point cloud data derived from unmanned aerial system (UAS) imagery. The experiment was conducted at the Indiana Corn and Soybean Innovation Center at the Agronomy Center for Research and Education (ACRE) in West Lafayette, Indiana, USA. On June 17th, 2020, a flight was carried out over maize trials using a custom-designed UAS platform equipped with a Sony Alpha ILCE-7R photogrammetric sensor. The RGB images were processed with a standard Structure from Motion (SfM) photogrammetric pipeline to reconstruct the study field as a final scaled 3D point cloud. Fifty individual maize plants were manually segmented from the point cloud to train the DNN, and individual plants were subsequently extracted over a test trial with more than 5,000 plants. Moreover, to reduce overfitting in the fully connected layers, we employed data augmentation not only in translation but also in color intensity. Results show a success rate of 72.4% for the extraction of individual plants.
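As a rough illustration of the augmentation step described above, the following NumPy sketch applies a random translation to the XYZ coordinates and a random intensity scaling to the RGB colors of a point cloud. The function name, parameter ranges, and array conventions are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def augment_point_cloud(points, colors, rng,
                        max_shift=0.1, intensity_range=(0.8, 1.2)):
    """Hypothetical augmentation sketch: random rigid translation of the
    XYZ coordinates plus a random global scaling of RGB color intensity,
    mirroring the translation and color-intensity augmentation mentioned
    in the abstract. Ranges are assumed, not taken from the paper."""
    # Random translation in x, y, z (in the units of the scaled point cloud)
    shift = rng.uniform(-max_shift, max_shift, size=3)
    points_aug = points + shift
    # Random global intensity scaling of RGB values, clipped to [0, 1]
    scale = rng.uniform(*intensity_range)
    colors_aug = np.clip(colors * scale, 0.0, 1.0)
    return points_aug, colors_aug

# Example on a synthetic cloud of 1024 points with per-point RGB in [0, 1]
rng = np.random.default_rng(0)
pts = rng.random((1024, 3))
cols = rng.random((1024, 3))
pts_aug, cols_aug = augment_point_cloud(pts, cols, rng)
```

In practice, each training sample (a segmented plant) would pass through such a transform with freshly drawn parameters at every epoch, so the network never sees exactly the same 50 manually segmented plants twice.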

Our test trial demonstrates the possibility of using deep learning to overcome the individual maize extraction challenge on the basis of UAS data.

Monica Herrero-Huerta, University of Padova (IT)