A new journal article, a conference paper and a dataset by Elias Ennadifi were published on 22/07
The article has been accepted for publication in the Elsevier journal Computers and Electronics in Agriculture
Title
Deep Learning for Wheat Ear Segmentation and Ear Density Measurement: From Heading to Maturity
Authors
Sébastien Dandrifosse*, Elias Ennadifi* (* Authors share first-authorship), Alexis Carlier, Bernard Gosselin, Benjamin Dumont, Benoît Mercatoris
Abstract
Recent deep learning methods have allowed important steps forward in the automatic detection of wheat ears in the field. Nevertheless, a method able to both count and segment the ears, validated at all the development stages from heading to maturity, was still lacking. Moreover, the critical step of converting the ear count in an image into an ear density, i.e. a number of ears per square metre in the field, has been widely ignored by most previous studies. For this research, wheat RGB images were acquired from heading to maturity in two field trials displaying contrasted fertilisation scenarios. An unsupervised learning approach on the YOLOv5 model, as well as the cutting-edge DeepMAC segmentation method, were exploited to develop a wheat ear counting and segmentation pipeline that required only a limited amount of labelling work for training. An additional label set covering all the development stages was built for validation. The average F1 score of ear bounding box detection was 0.93 and the average F1 score of segmentation was 0.86. To convert the ear counts into ear densities, a second RGB camera was used so that the distance between the cameras and the ears could be measured by stereovision. That distance was exploited to compute the image footprint at ear level; dividing the number of ears by this footprint yields the ear density. The obtained ear densities were coherent with the fertilisation scenarios but, for the same fertilisation, differences were observed between acquisition dates. This highlights that the measurement was not able to retrieve absolute ear densities for all development stages and conditions. Nonetheless, the deep learning measurement considered the most reliable outperformed the observations of three human operators.
Article: https://doi.org/10.1016/j.compag.2022.107161
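The count-to-density conversion described in the abstract can be sketched as follows. This is an illustrative pinhole-camera model, not the authors' code: the focal length and sensor dimensions below are hypothetical placeholder values, and the actual pipeline measures the camera-to-ear distance by stereovision rather than taking it as an input.

```python
def image_footprint_m2(distance_m, focal_mm, sensor_w_mm, sensor_h_mm):
    """Ground footprint (m^2) of the image at ear level, assuming a
    pinhole camera: each sensor dimension scales by distance / focal."""
    width_m = distance_m * sensor_w_mm / focal_mm
    height_m = distance_m * sensor_h_mm / focal_mm
    return width_m * height_m

def ear_density(ear_count, distance_m, focal_mm=16.0,
                sensor_w_mm=11.3, sensor_h_mm=7.1):
    """Ears per square metre: the ear count divided by the image
    footprint at the measured camera-to-ear distance.
    Default camera parameters are illustrative assumptions."""
    return ear_count / image_footprint_m2(
        distance_m, focal_mm, sensor_w_mm, sensor_h_mm)
```

For example, with a 10 mm focal length, a 10 mm square sensor and the cameras 2 m above the ears, the footprint is 4 m², so 100 detected ears give a density of 25 ears/m².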
Another related article was presented at the 15th International Conference on Precision Agriculture, held in the United States from 26 to 29/06/2022
Title
Effect of the sun on the measurement of wheat ear density by deep learning
Authors
Sébastien Dandrifosse*, Elias Ennadifi* (* Authors share first-authorship), Alexis Carlier, Bernard Gosselin, Benjamin Dumont, Benoît Mercatoris
Abstract
Ear density in the field, i.e. the number of ears per square meter, is one of the yield components of wheat and therefore a variable of high agronomic interest. Its traditional measurement necessitates laborious human observations in the field or destructive sampling. In recent years, deep learning based on RGB images has been identified as a low-cost, robust and high-throughput alternative to measure this variable. However, most studies were limited to the computer vision challenge of counting the ears in the images, without aiming to convert those counts into ear density. The aim of this study was to propose a method for automatic measurement of ear density, and also to evaluate the potential impact of the sun on the measurement. The same zone of a wheat plot was imaged by two nadir RGB cameras throughout the daily course of the sun, repeated at the flowering, watery ripe, medium milk and hard dough development stages. The bounding boxes of the ears in the images were detected using the YOLOv5 deep learning model, trained on rich existing wheat ear datasets. The shifts between the same elements observed in the images from the two cameras were exploited to compute the image footprint by stereovision. The ear count divided by the image footprint yielded the ear density. To investigate the effect of the sun, a solar spectrum was recorded with a spectrometer at the time of each image acquisition. The F1 scores of ear bounding box detection at flowering, watery ripe, medium milk and hard dough were respectively 0.87, 0.92, 0.92 and 0.85. At watery ripe and medium milk, the measured ear density was robust during the day and between the two dates. At the hard dough stage, increases in sunlight irradiance correlated with decreases in the number of ears detected by deep learning, but also with decreases in the number of ears labeled by humans. This demonstrates that, in some conditions, wheat ear detection performance indicators based on labeled ears may be misleading regarding the capacity of machine vision to measure the real ear density.
Link: https://orbi.umons.ac.be/bitstream/20.500.12907/43077/1/ICPA_Elias_Ennadifi.pdf
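The stereovision step mentioned in the abstract, recovering distance from the pixel shift between the two cameras, follows the standard rectified-stereo relation. The sketch below is a minimal illustration under that assumption, not the conference implementation; the focal length and baseline values in the example are hypothetical.

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Camera-to-scene distance from the horizontal pixel shift
    (disparity) between two rectified cameras:
        depth = focal_px * baseline / disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```

For instance, a 100 px shift with a 1000 px focal length and a 0.1 m baseline corresponds to a camera-to-ear distance of 1 m, which can then feed the image-footprint computation.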
ISIA Lab publications: https://opendata.umons.ac.be/en/publications/services/html/f105.html