
Detection of Carolina Geranium (Geranium carolinianum) Growing in Competition with Strawberry Using Convolutional Neural Networks

Published online by Cambridge University Press:  04 December 2018

Shaun M. Sharpe
Affiliation:
Postdoctoral Associate, University of Florida, Gulf Coast Research and Education Center, Wimauma, FL, USA
Arnold W. Schumann
Affiliation:
Professor, University of Florida, Citrus Research and Education Center, Lake Alfred, FL, USA
Nathan S. Boyd*
Affiliation:
Associate Professor, University of Florida, Gulf Coast Research and Education Center, Wimauma, FL, USA
Author for correspondence: Nathan S. Boyd, University of Florida, Gulf Coast Research and Education Center, 14625 County Road 672, Wimauma, FL 33598. (Email: [email protected])

Abstract

Weed interference during crop establishment is a serious concern for Florida strawberry [Fragaria ×ananassa (Weston) Duchesne ex Rozier (pro sp.) [chiloensis × virginiana]] producers. In situ remote detection for precision herbicide application reduces both the risk of crop injury and herbicide inputs. Carolina geranium (Geranium carolinianum L.) is a widespread broadleaf weed within Florida strawberry production with sensitivity to clopyralid, the only available POST broadleaf herbicide. Geranium carolinianum leaf structure is distinct from that of the strawberry plant, which makes it an ideal candidate for pattern recognition in digital images via convolutional neural networks (CNNs). The study objective was to assess the precision of three CNNs in detecting G. carolinianum. Images of G. carolinianum growing in competition with strawberry were gathered at four sites in Hillsborough County, FL. Three CNNs were compared: object detection–based DetectNet and image classification–based VGGNet and GoogLeNet. Two DetectNet networks were trained to detect either leaves or canopies of G. carolinianum. Image classification using GoogLeNet and VGGNet was largely unsuccessful during validation with whole images (F score < 0.02). CNN training using cropped images increased G. carolinianum detection during validation for VGGNet (F score = 0.77) and GoogLeNet (F score = 0.62). The G. carolinianum leaf–trained DetectNet achieved the highest F score (0.94) for plant detection during validation. Leaf-based detection led to more consistent detection of G. carolinianum within the strawberry canopy and reduced the recall-related errors encountered with canopy-based training. The smaller target of leaf-based DetectNet did increase false positives, but such errors can be overcome with additional training images for network desensitization. DetectNet was the most viable CNN tested for image-based remote sensing of G. carolinianum in competition with strawberry. Future research will identify the optimal approach for in situ detection and integrate the detection technology with a precision sprayer.
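The F scores cited above combine precision (how many detections are true weeds) and recall (how many weeds are detected). As a point of reference only, and not taken from the published study, a minimal Python sketch of one common formulation, the harmonic mean of precision and recall, is shown below; the function name and the counts in the example are illustrative.

# Minimal sketch: precision, recall, and F score from detection counts.
# The counts used in the example call are hypothetical, not study data.

def f_score(true_pos: int, false_pos: int, false_neg: int) -> float:
    """Harmonic mean of precision and recall (F1-style score)."""
    precision = true_pos / (true_pos + false_pos) if (true_pos + false_pos) else 0.0
    recall = true_pos / (true_pos + false_neg) if (true_pos + false_neg) else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Example with hypothetical counts: many correct detections, a few
# false positives (extra boxes) and false negatives (missed leaves).
print(round(f_score(true_pos=94, false_pos=8, false_neg=4), 2))

Under this formulation, a low recall (missed weeds within the strawberry canopy) and a high false-positive rate both pull the score down, which is why the abstract discusses recall-related errors for canopy-based training and false positives for leaf-based training separately.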

Type
Research Article
Copyright
© Weed Science Society of America, 2018 

