TY - JOUR
T1 - Deep convolutional neural network for damaged vegetation segmentation from RGB images based on virtual NIR-channel estimation
AU - Picon, Artzai
AU - Bereciartua-Perez, Arantza
AU - Eguskiza, Itziar
AU - Romero-Rodriguez, Javier
AU - Jimenez-Ruiz, Carlos Javier
AU - Eggers, Till
AU - Klukas, Christian
AU - Navarra-Mestre, Ramon
N1 - Publisher Copyright:
© 2022 The Authors
PY - 2022/1
Y1 - 2022/1
AB - Performing accurate and automated semantic segmentation of vegetation is a first algorithmic step towards more complex models that can extract accurate biological information on crop health, weed presence and phenological state, among others. Traditionally, models based on the normalized difference vegetation index (NDVI), the near-infrared (NIR) channel or RGB have been good indicators of vegetation presence. However, these methods are not suitable for accurately segmenting vegetation showing damage, which precludes their use in downstream phenotyping algorithms. In this paper, we propose a comprehensive method for robust vegetation segmentation in RGB images that can cope with damaged vegetation. The method first applies a regression convolutional neural network to estimate a virtual NIR channel from an RGB image. Second, we compute two newly proposed vegetation indices from this estimated virtual NIR channel: the infrared-dark channel subtraction (IDCS) and infrared-dark channel ratio (IDCR) indices. Finally, both the RGB image and the estimated indices are fed into a semantic segmentation deep convolutional neural network to train a model that segments vegetation regardless of damage or condition. The model was tested on 84 plots containing thirteen vegetation species showing different degrees of damage and acquired over 28 days. The results show that the best segmentation is obtained when the input image is augmented with the proposed virtual NIR channel (F1=0.94) and with the proposed IDCR and IDCS vegetation indices (F1=0.95) derived from the estimated NIR channel, while the use of only the image or RGB indices leads to inferior performance (RGB: F1=0.90; NIR: F1=0.82; NDVI: F1=0.89). The proposed method provides end-to-end land cover map segmentation directly from simple RGB images and has been successfully validated in real field conditions.
KW - Vegetation indices estimation
KW - Vegetation coverage map
KW - Near infrared estimation
KW - Convolutional neural network
KW - Deep learning
UR - http://www.scopus.com/inward/record.url?scp=85139369521&partnerID=8YFLogxK
U2 - 10.1016/j.aiia.2022.09.004
DO - 10.1016/j.aiia.2022.09.004
M3 - Article
SN - 2589-7217
VL - 6
SP - 199
EP - 210
JO - Artificial Intelligence in Agriculture
JF - Artificial Intelligence in Agriculture
ER -