Fully-automated root image analysis (faRIA)

Sci Rep. 2021 Aug 6;11(1):16047. [epub] 20210806

Status: PubMed-not-MEDLINE. Language: English. Country: Great Britain, England. Medium: electronic

Document type: journal article, grant-supported research

Persistent link   https://www.medvik.cz/link/pmid34362967
Links

PubMed 34362967
PubMed Central PMC8346561
DOI 10.1038/s41598-021-95480-y
PII: 10.1038/s41598-021-95480-y
Knihovny.cz E-resources

High-throughput root phenotyping in soil has become an indispensable quantitative tool for assessing the effects of climatic factors and molecular perturbations on plant root morphology, development and function. Efficient analysis of large numbers of structurally complex soil-root images requires advanced methods for automated image segmentation. Because of the often unavoidable overlap between the intensities of foreground and background regions, simple thresholding methods are generally not suitable for segmenting root regions. Higher-level cognitive models such as convolutional neural networks (CNNs) can segment roots from heterogeneous and noisy background structures; however, they require a representative set of manually segmented (ground truth) images. Here, we present a GUI-based tool for fully automated quantitative analysis of root images using a pre-trained CNN model that relies on an extension of the U-Net architecture. The CNN framework was designed to efficiently segment root structures of different sizes, shapes and optical contrasts on low-budget hardware systems. The CNN model was trained on a set of 6465 masks derived from 182 manually segmented near-infrared (NIR) maize root images. Our experimental results show that the proposed approach achieves a Dice coefficient of 0.87, outperforming existing tools such as SegRoot (Dice coefficient 0.67), and applies not only to NIR but also to other imaging modalities and plant species, such as barley and Arabidopsis soil-root images from LED-rhizotron and UV imaging systems, respectively. In summary, the developed software framework enables users to efficiently analyse soil-root images in an automated manner (i.e. without manual interaction with data and/or parameter tuning), providing quantitative plant scientists with a powerful analytical tool.
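The Dice coefficient reported above is the standard overlap measure between a predicted binary segmentation mask and a manually segmented ground-truth mask. As a minimal illustration (not the faRIA implementation), it can be computed with NumPy:

```python
import numpy as np

def dice_coefficient(pred, truth):
    """Dice similarity between two binary masks: 2|A ∩ B| / (|A| + |B|)."""
    pred = np.asarray(pred).astype(bool)
    truth = np.asarray(truth).astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    total = pred.sum() + truth.sum()
    if total == 0:
        return 1.0  # both masks empty: perfect agreement by convention
    return 2.0 * intersection / total

# Toy 4x4 masks: predicted root pixels vs. manually annotated ground truth
pred  = np.array([[0, 1, 1, 0],
                  [0, 1, 1, 0],
                  [0, 1, 0, 0],
                  [0, 0, 0, 0]])
truth = np.array([[0, 1, 1, 0],
                  [0, 1, 1, 0],
                  [0, 1, 1, 0],
                  [0, 0, 0, 0]])
print(round(dice_coefficient(pred, truth), 3))  # 2*5/(5+6) → 0.909
```

A Dice value of 1.0 means pixel-perfect agreement with the ground truth, 0.0 means no overlap; the 0.87 vs. 0.67 comparison in the abstract is averaged over test images segmented by the respective tools.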

Show more in PubMed

Lynch J. Root architecture and plant productivity. Plant Physiol. 1995;109:7. doi: 10.1104/pp.109.1.7. PubMed DOI PMC

Iyer-Pascuzzi AS, et al. Imaging and analysis platform for automatic phenotyping and trait ranking of plant root systems. Plant Physiol. 2010;152:1148–1157. doi: 10.1104/pp.109.150748. PubMed DOI PMC

Trachsel S, Kaeppler SM, Brown KM, Lynch JP. Shovelomics: High throughput phenotyping of maize (Zea mays L.) root architecture in the field. Plant Soil. 2011;341:75–87. doi: 10.1007/s11104-010-0623-8. DOI

Bengough A, Mullins C. Penetrometer resistance, root penetration resistance and root elongation rate in two sandy loam soils. Plant Soil. 1991;131:59–66. doi: 10.1007/BF00010420. DOI

Wojciechowski T, Gooding M, Ramsay L, Gregory P. The effects of dwarfing genes on seedling root growth of wheat. J. Exp. Bot. 2009;60:2565–2573. doi: 10.1093/jxb/erp107. PubMed DOI PMC

Watt M, et al. A rapid, controlled-environment seedling root screen for wheat correlates well with rooting depths at vegetative, but not reproductive, stages at two field sites. Ann. Bot. 2013;112:447–455. doi: 10.1093/aob/mct122. PubMed DOI PMC

Perret J, Al-Belushi M, Deadman M. Non-destructive visualization and quantification of roots using computed tomography. Soil Biol. Biochem. 2007;39:391–399. doi: 10.1016/j.soilbio.2006.07.018. DOI

Tracy SR, et al. The x-factor: Visualizing undisturbed root architecture in soils using X-ray computed tomography. J. Exp. Bot. 2010;61:311–313. doi: 10.1093/jxb/erp386. PubMed DOI

van der Weerd L, et al. Quantitative NMR microscopy of osmotic stress responses in maize and pearl millet. J. Exp. Bot. 2001;52:2333–2343. doi: 10.1093/jexbot/52.365.2333. PubMed DOI

Fang S, Yan X, Liao H. 3d reconstruction and dynamic modeling of root architecture in situ and its application to crop phosphorus research. Plant J. 2009;60:1096–1108. doi: 10.1111/j.1365-313X.2009.04009.x. PubMed DOI

Zeng G, Birchfield ST, Wells CE. Automatic discrimination of fine roots in minirhizotron images. New Phytol. 2008;177:549–557. doi: 10.1111/j.1469-8137.2007.02271.x. PubMed DOI

Johnson MG, Tingey DT, Phillips DL, Storm MJ. Advancing fine root research with minirhizotrons. Environ. Exp. Bot. 2001;45:263–289. doi: 10.1016/S0098-8472(01)00077-6. PubMed DOI

Van de Geijn S, Vos J, Groenwold J, Goudriaan J, Leffelaar P. The wageningen rhizolab-a facility to study soil–root–shoot–atmosphere interactions in crops. Plant Soil. 1994;161:275–287. doi: 10.1007/BF00046399. DOI

Huck, M. G. & Taylor, H. M. The rhizotron as a tool for root research. In Advances in Agronomy Vol. 35 (ed. Sparks, D. L.) 1–35 (Elsevier, 1982).

Eshel A, Beeckman T. Plant Roots: The Hidden Half. CRC Press; 2013.

Nagel KA, et al. Growscreen-rhizo is a novel phenotyping robot enabling simultaneous measurements of root and shoot growth for plants grown in soil-filled rhizotrons. Funct. Plant Biol. 2012;39:891–904. doi: 10.1071/FP12023. PubMed DOI

Junker A, et al. Optimizing experimental procedures for quantitative evaluation of crop plant performance in high throughput phenotyping systems. Front. Plant Sci. 2015;5:770. doi: 10.3389/fpls.2014.00770. PubMed DOI PMC

Shi R, Junker A, Seiler C, Altmann T. Phenotyping roots in darkness: Disturbance-free root imaging with near infrared illumination. Funct. Plant Biol. 2018;45:400–411. doi: 10.1071/FP17262. PubMed DOI

Armengaud P, et al. Ez-rhizo: Integrated software for the fast and accurate measurement of root system architecture. Plant J. 2009;57:945–956. doi: 10.1111/j.1365-313X.2008.03739.x. PubMed DOI

Pace J, Lee N, Naik HS, Ganapathysubramanian B, Lübberstedt T. Analysis of maize (Zea mays L.) seedling roots with the high-throughput image analysis tool aria (automatic root image analysis). PLoS ONE. 2014;9:e108255. doi: 10.1371/journal.pone.0108255. PubMed DOI PMC

Le Bot J, et al. Dart: A software to analyse root system architecture and development from captured images. Plant Soil. 2010;326:261–273. doi: 10.1007/s11104-009-0005-2. DOI

Arsenault J-L, Pouleur S, Messier C, Guay R. WinRHIZO, a root-measuring system with a unique overlap correction method. HortScience. 1995;30:906.

Bontpart T, et al. Affordable and robust phenotyping framework to analyse root system architecture of soil-grown plants. Plant J. 2020;103:2330–2343. doi: 10.1111/tpj.14877. PubMed DOI

Galkovskyi T, et al. Gia roots: Software for the high throughput analysis of plant root system architecture. BMC Plant Biol. 2012;12:116. doi: 10.1186/1471-2229-12-116. PubMed DOI PMC

Pierret A, Gonkhamdee S, Jourdan C, Maeght J-L. IJ_Rhizo: An open-source software to measure scanned images of root samples. Plant Soil. 2013;373:531–539. doi: 10.1007/s11104-013-1795-9. DOI

Narisetti N, et al. Semi-automated root image analysis (saRIA). Sci. Rep. 2019;9:1–10. doi: 10.1038/s41598-019-55876-3. PubMed DOI PMC

Lobet G, Pagès L, Draye X. A novel image-analysis toolbox enabling quantitative analysis of root system architecture. Plant Physiol. 2011;157:29–39. doi: 10.1104/pp.111.179895. PubMed DOI PMC

Cai J, et al. Rootgraph: A graphic optimization tool for automated image analysis of plant roots. J. Exp. Bot. 2015;66:6551–6562. doi: 10.1093/jxb/erv359. PubMed DOI PMC

Zheng L, Yang Y, Tian Q. SIFT meets CNN: A decade survey of instance retrieval. IEEE Trans. Pattern Anal. Mach. Intell. 2017;40:1224–1244. doi: 10.1109/TPAMI.2017.2709749. PubMed DOI

Ronneberger, Olaf, Philipp Fischer, and Thomas Brox. "U-net: Convolutional networks for biomedical image segmentation." International Conference on Medical image computing and computer-assisted intervention. Springer, Cham, 2015.

Bai, W. et al. Human-level CMR image analysis with deep fully convolutional networks. arXiv https://arxiv.org/abs/1710.09289 (2018).

Badrinarayanan V, Kendall A, Cipolla R. Segnet: A deep convolutional encoder–decoder architecture for image segmentation. IEEE Trans. Pattern Anal. Mach. Intell. 2017;39:2481–2495. doi: 10.1109/TPAMI.2016.2644615. PubMed DOI

Marmanis D, et al. Semantic segmentation of aerial images with an ensemble of CNNs. ISPRS Ann. Photogramm. Remote Sens. Spatial Inf. Sci. 2016;2016(3):473–480. doi: 10.5194/isprs-annals-III-3-473-2016. DOI

Pound MP, et al. Deep machine learning provides state-of-the-art performance in image-based plant phenotyping. Gigascience. 2017;6:gix083. doi: 10.1093/gigascience/gix083. PubMed DOI PMC

Douarre C, Schielein R, Frindel C, Gerth S, Rousseau D. Transfer learning from synthetic data applied to soil-root segmentation in X-ray tomography images. J. Imaging. 2018;4:65. doi: 10.3390/jimaging4050065. DOI

Misra T, et al. Spikesegnet-a deep learning approach utilizing encoder-decoder network with hourglass for spike segmentation and counting in wheat plant from visual imaging. Plant Methods. 2020;16:1–20. doi: 10.1186/s13007-020-00582-9. PubMed DOI PMC

Wang R, Cao S, Ma K, Zheng Y, Meng D. Pairwise learning for medical image segmentation. Med. Image Anal. 2021;67:101876. doi: 10.1016/j.media.2020.101876. PubMed DOI

Karani N, Erdil E, Chaitanya K, Konukoglu E. Test-time adaptable neural networks for robust medical image segmentation. Med. Image Anal. 2021;68:101907. doi: 10.1016/j.media.2020.101907. PubMed DOI

Khan S, et al. Deepsmoke: Deep learning model for smoke detection and segmentation in outdoor environments. Expert Syst. Appl. 2021 doi: 10.1016/j.eswa.2021.115125. DOI

Jiang Y, Li C. Convolutional neural networks for image-based high-throughput plant phenotyping: A review. Plant Phenom. 2020;2020:22. doi: 10.34133/2020/4152816. PubMed DOI PMC

Zhu, Yezi, et al. "Data Augmentation using Conditional Generative Adversarial Networks for Leaf Counting in Arabidopsis Plants." BMVC. 2018.

Chen, J. & Shi, X. A sparse convolutional predictor with denoising autoencoders for phenotype prediction. In Proceedings of the 10th ACM International Conference on Bioinformatics, Computational Biology and Health Informatics, 217–222 (2019).

Yasrab R, et al. RootNav 2.0: Deep learning for automatic navigation of complex plant root architectures. GigaScience. 2019;8:Giz123. doi: 10.1093/gigascience/giz123. PubMed DOI PMC

Wang T, et al. Segroot: A high throughput segmentation method for root image analysis. Comput. Electron. Agric. 2019;162:845–854. doi: 10.1016/j.compag.2019.05.017. DOI

Yasrab R, Pound MP, French AP, Pridmore TP. Rootnet: A convolutional neural networks for complex plant root phenotyping from high-definition datasets. bioRxiv. 2020 doi: 10.1101/2020.05.01.073270. DOI

Ioffe, S. & Szegedy, C. Batch normalization: Accelerating deep network training by reducing internal covariate shift. arXiv preprint https://arxiv.org/abs/1502.03167 (2015).

Santurkar, S., Tsipras, D., Ilyas, A. & Madry, A. How does batch normalization help optimization? https://arxiv.org/abs/1805.11604 (2019).

Li, X., Chen, S., Hu, X. & Yang, J. Understanding the disharmony between dropout and batch normalization by variance shift. https://arxiv.org/abs/1801.05134 (2018).

Peng, C., Zhang, X., Yu, G., Luo, G. & Sun, J. Large kernel matters—improve semantic segmentation by global convolutional network. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR) (2017).

Jha RR, Jaswal G, Gupta D, Saini S, Nigam A. Pixisegnet: Pixel-level iris segmentation network using convolutional encoder–decoder with stacked hourglass bottleneck. IET Biom. 2020;9:11–24. doi: 10.1049/iet-bmt.2019.0025. DOI

Agostinelli, F., Hoffman, M., Sadowski, P. & Baldi, P. Learning activation functions to improve deep neural networks. arXiv preprint https://arxiv.org/abs/1412.6830 (2014).

Wang, L., Guo, S., Huang, W. & Qiao, Y. Places205-vggnet models for scene recognition. arXiv preprint https://arxiv.org/abs/1508.01667 (2015).

Dunne, R. A. & Campbell, N. A. On the pairing of the softmax activation and cross-entropy penalty functions and the derivation of the softmax activation function. In Proceedings of the 8th Australian Conference on Neural Networks, Melbourne, vol. 181, 185 (Citeseer, 1997).

Zou KH, et al. Statistical validation of image segmentation quality based on a spatial overlap index: Scientific reports. Acad. Radiol. 2004;11:178–189. doi: 10.1016/S1076-6332(03)00671-8. PubMed DOI PMC

Abadi, M. et al. Tensorflow: Large-scale machine learning on heterogeneous distributed systems. arXiv preprint https://arxiv.org/abs/1603.04467 (2016).

Tian C, Xu Y, Zuo W. Image denoising using deep CNN with batch renormalization. Neural Netw. 2020;121:461–473. doi: 10.1016/j.neunet.2019.08.022. PubMed DOI

Tian C, et al. Deep learning on image denoising: An overview. Neural Netw. 2020;131:251–275. doi: 10.1016/j.neunet.2020.07.025. PubMed DOI

van der Walt S, Colbert SC, Varoquaux G. The NumPy array: A structure for efficient numerical computation. Comput. Sci. Eng. 2011;13:22–30. doi: 10.1109/MCSE.2011.37. DOI

Van der Walt S, et al. Scikit-image: Image processing in python. PeerJ. 2014;2:e453. doi: 10.7717/peerj.453. PubMed DOI PMC

Kingma, D. P. & Ba, J. Adam: A method for stochastic optimization. arXiv preprint https://arxiv.org/abs/1412.6980 (2014).

Krizhevsky, A., Sutskever, I. & Hinton, G. E. Imagenet classification with deep convolutional neural networks. In Advances in Neural Information Processing Systems, 1097–1105 (2012).

Mathworks. Matlab and Statistics Toolbox Release 2019b (The MathWorks, 2019).

Bovik AC. Chapter 3—Basic gray level image processing. In: Bovik A, editor. The Essential Guide to Image Processing. Boston: Academic Press; 2009. pp. 43–68.
