Identification of Plant Species by Deep Learning and Providing It as a Mobile Application

Image processing techniques give highly successful results when combined with deep learning in classification studies. Applications that make daily life easier benefit from this kind of work. In this study, a mobile application is developed that takes a photo of a plant and processes the image to provide information about the plant, such as its name, the soil-change interval, and the amount of sunlight and nutrition it needs. The model is trained using a Convolutional Neural Network, and the dataset is successfully applied to the network. Currently, the application can classify 43 different plant species in a mobile environment, and its classification capacity is planned to be expanded with new plant species in future work. Up to 90% accuracy is reached with the current version of the application.
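The abstract does not specify the framework, architecture, or deployment path used. As a rough illustration only, the sketch below shows one plausible way such a system could be built: a small convolutional classifier for 43 plant species (the class count comes from the abstract), assumed here to be implemented in TensorFlow/Keras with a 224×224 RGB input and exported to TensorFlow Lite for on-device inference. The layer stack, input size, and output file name are hypothetical choices, not the authors' implementation.

```python
# Hypothetical sketch of a CNN plant classifier of the kind described in the
# abstract. TensorFlow/Keras, the layer sizes, and the TFLite export are
# assumptions for illustration; only the class count (43) comes from the paper.
import tensorflow as tf

NUM_CLASSES = 43          # number of plant species reported in the abstract
IMAGE_SIZE = (224, 224)   # assumed input resolution

def build_model() -> tf.keras.Model:
    """Builds a small convolutional classifier over RGB plant photos."""
    model = tf.keras.Sequential([
        # Normalize pixel values and declare the input shape.
        tf.keras.layers.Rescaling(1.0 / 255, input_shape=(*IMAGE_SIZE, 3)),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(64, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(128, 3, activation="relu"),
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    model = build_model()
    # After training on the labelled plant-photo dataset, the model could be
    # converted to TensorFlow Lite for inference inside the mobile application.
    converter = tf.lite.TFLiteConverter.from_keras_model(model)
    tflite_model = converter.convert()
    with open("plant_classifier.tflite", "wb") as f:   # hypothetical file name
        f.write(tflite_model)
```

In a setup like this, the exported `.tflite` file would be bundled with the mobile app, and the species label predicted on-device would be used to look up the care information (soil-change interval, sunlight, nutrition) shown to the user.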

Classification of Plant Species Using Deep Learning and Its Presentation as a Mobile Application

Classification studies carried out with image processing techniques achieve high success. Applications that make life easier benefit from this kind of work. The mobile application developed in this study takes a photo of the plant, performs image analysis on it, and presents the user with useful information about the plant, such as its name, the soil-change interval, and the amount of sunlight and supplementary nutrition it needs. The developed model was trained with a Convolutional Neural Network, and deep learning was used successfully in this study. The application, which can recognize 43 different plant species in a mobile environment, is planned to be extended in scope by adding new plant species. At its current stage, the study achieves accuracy values of up to 90%.
