
An Automated Framework for Plant Detection Based on Deep Simulated Learning from Drone Imagery.


Hosseiny, Benyamin; Rastiveis, Heidar and Homayouni, Saeid (ORCID: https://orcid.org/0000-0002-0214-5356) (2020). An Automated Framework for Plant Detection Based on Deep Simulated Learning from Drone Imagery. Remote Sensing, vol. 12, no. 21, p. 3521. DOI: 10.3390/rs12213521.

PDF (P3837.pdf): Download (27MB)

Abstract

Traditional mapping and monitoring of agricultural fields are expensive, laborious, and prone to human error. Technological advances in platforms and sensors, together with breakthroughs in artificial intelligence (AI) and deep learning (DL) for intelligent data processing, have improved remote sensing applications for precision agriculture (PA) and the quality of agricultural land monitoring. However, providing ground truth data for model training is a time-consuming and tedious task that may itself contain multiple human errors. This paper proposes an automated and fully unsupervised framework based on image processing and DL methods for plant detection in agricultural lands from very high-resolution drone remote sensing imagery. The framework's main idea is to automatically generate an unlimited amount of simulated training data from the input image. This capability is advantageous for DL methods and addresses their biggest drawback, i.e., requiring a considerable amount of training data. The core of the framework is the Faster region-based convolutional neural network (Faster R-CNN) with a ResNet-101 backbone for object detection. The framework's efficiency was evaluated on two image sets from two cornfields, acquired by an RGB camera mounted on a drone. The results show that the proposed method achieves an average counting accuracy of 90.9%. Furthermore, based on the average Hausdorff distance (AHD), an average object detection localization error of 11 pixels was obtained. Based on standard object detection metrics, the resulting mean precision, recall, and F1 for plant detection were 0.868, 0.849, and 0.855, respectively, which is promising for an unsupervised plant detection method.
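For readers who want a sense of the detection stage described above, the following is a minimal sketch of assembling a Faster R-CNN detector with a ResNet-101 backbone using PyTorch/torchvision. This is not the authors' released code: the two-class setup (background vs. plant), the dummy input size, and the exact helper signatures (which vary across torchvision versions) are assumptions for illustration only.

```python
# Minimal sketch (assumes PyTorch + torchvision >= 0.13); not the authors' code.
import torch
from torchvision.models.detection import FasterRCNN
from torchvision.models.detection.backbone_utils import resnet_fpn_backbone

# ResNet-101 feature extractor with an FPN, as commonly paired with Faster R-CNN.
backbone = resnet_fpn_backbone(backbone_name="resnet101", weights=None)

# Two classes: background + plant (class count is an assumption for a single-crop field).
model = FasterRCNN(backbone, num_classes=2)
model.eval()

# Dummy forward pass on one RGB drone image tile (pixel values in [0, 1]).
image = torch.rand(3, 512, 512)
with torch.no_grad():
    detections = model([image])[0]  # dict with 'boxes', 'labels', 'scores'
print(detections["boxes"].shape, detections["scores"].shape)
```

In the paper's unsupervised setting, such a detector would be trained on simulated image patches generated automatically from the input drone image rather than on manually labelled bounding boxes.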

Document type: Article
Keywords: plant detection; deep learning; faster R-CNN; ResNet-101; drone imagery; precision agriculture
Centre: Centre Eau Terre Environnement
Date deposited: 3 Feb 2021 19:32
Last modified: 8 Feb 2022 21:49
URI: https://espace.inrs.ca/id/eprint/11212
