Journal of Integrative Agriculture  2020, Vol. 19 Issue (10): 2500-2513    DOI: 10.1016/S2095-3119(20)63168-9
Special Issue: Smart Plant Protection
Development of an automatic monitoring system for rice light-trap pests based on machine vision
YAO Qing1, FENG Jin1, TANG Jian2, XU Wei-gen3, ZHU Xu-hua4, YANG Bao-jun2, LÜ Jun1, XIE Yi-ze5, YAO Bo1, WU Shu-zhen1, KUAI Nai-yang1, WANG Li-jun6
1 School of Information Science and Technology, Zhejiang Sci-Tech University, Hangzhou 310018, P.R.China
2 State Key Laboratory of Rice Biology, China National Rice Research Institute, Hangzhou 310006, P.R.China
3 Plant Protection, Quarantine and Pesticide Management Station of Zhejiang, Hangzhou 310020, P.R.China
4 Zhejiang Top Cloud-agri Technology Co., Ltd., Hangzhou 310015, P.R.China
5 Agricultural Technology Extension Center of Shangyu, Shaoxing 312300, P.R.China
6 Agricultural Technology Extension Center of Keerqin, Keerqin 137713, P.R.China
Abstract  
Monitoring pest populations in paddy fields is important for effectively implementing integrated pest management. Light traps are widely used to monitor field pests all over the world. Most conventional light traps still require manual identification of target pests from large numbers of trapped insects, which is time-consuming, labor-intensive and error-prone, especially during pest peak periods. In this paper, we develop an automatic monitoring system for rice light-trap pests based on machine vision. The system consists of an intelligent light trap, a computer or mobile phone client platform and a cloud server. The light trap first attracts, kills and disperses insects, then captures images of the trapped insects and sends each image to the cloud server, where loaded pest identification models automatically identify and count five target pests. To prevent trapped insects from piling up, a vibration plate and a moving rotary conveyor belt disperse them. Based on one year of daily images from one light trap, there was a close correlation (r=0.92) between the daily pest counts obtained by our automatic method and by manual identification. Field experiments demonstrated the effectiveness and accuracy of our automatic light-trap monitoring system.
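The abstract's evaluation compares daily pest counts from automatic identification against manual counts using a correlation coefficient (r=0.92). As an illustrative sketch only (not the authors' code, and with hypothetical count data), the Pearson correlation they report could be computed like this:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length count series."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    # Covariance numerator and the two standard-deviation terms.
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical daily counts for one trap: automatic vs. manual identification.
auto_counts = [12, 30, 45, 80, 66, 20, 9]
manual_counts = [10, 33, 41, 85, 70, 18, 11]
print(round(pearson_r(auto_counts, manual_counts), 2))
```

An r close to 1 indicates the automatic counts track the manual counts closely across days, which is the basis of the paper's validation claim.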
Keywords:  automatic monitoring system        light trap        rice pest        machine vision        image processing        convolutional neural network  
Received: 11 November 2019   Accepted:
Fund: This work was supported by the Fundamental Public Welfare Research Program of Zhejiang Provincial Natural Science Foundation, China (LGN18C140007 and Y20C140024), the National High Technology Research and Development Program of China (863 Program; 2013AA102402) and the Agricultural Science and Technology Innovation Program of Chinese Academy of Agricultural Sciences.
Corresponding authors: YAO Qing, E-mail: q-yao@zstu.edu.cn; TANG Jian, E-mail: tangjian@caas.cn

Cite this article: 

YAO Qing, FENG Jin, TANG Jian, XU Wei-gen, ZHU Xu-hua, YANG Bao-jun, Lü Jun, XIE Yi-ze, YAO Bo, WU Shu-zhen, KUAI Nai-yang, WANG Li-jun. 2020. Development of an automatic monitoring system for rice light-trap pests based on machine vision. Journal of Integrative Agriculture, 19(10): 2500-2513.
