A Hybrid Framework for Plant Leaf Region Segmentation: Comparative Analysis of Swarm Intelligence with Convolutional Neural Networks
DOI:
https://doi.org/10.52756/ijerr.2024.v42.008

Keywords:
Clustering, image segmentation, k-means, swarm techniques, thresholding, tomato plant disease

Abstract
Agriculture is vital to the survival of humanity, since roughly 70% of the world's population is engaged in agricultural pursuits to varying degrees. Existing methodologies lack reliable ways to identify diseases across different crops in an agricultural environment. Tomatoes are among the crops most affected by these diseases, which has led to significant increases in tomato prices. Controlling tomato diseases is crucial for optimal growth and production, making early diagnosis and detection vital: early diagnosis and treatment can improve yields, whereas untreated diseases can severely degrade the quality, quantity, and productivity of the crop. Although many previous efforts to apply machine learning methods to detecting and classifying tomato plant diseases have been unsuccessful, this study introduces a comparative framework for segmenting tomato leaf regions using both conventional and swarm intelligence methodologies to determine the more effective approach. The framework supports a tomato plant diagnosis system capable of analyzing various types of images, drawn from both standard and custom image datasets. The proposed model achieves an average precision, recall, F-measure, error, and accuracy of 0.914, 0.915, 0.915, 1.55%, and 98.45%, respectively. We performed a comparative examination of six scenarios: T-model, K-model, TPSO-model, KPSO-model, TGO-model, and KGO-model. The combination of K-means with the GO algorithm proved to be a powerful hybrid technique, achieving an accuracy of 96.45%, while the T-model, K-model, TPSO-model, KPSO-model, and TGO-model attained accuracies of 85.73%, 86.61%, 87.54%, 88.64%, and 92.21%, respectively.
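To make the hybrid idea concrete, the following is a minimal Python sketch of K-means-style intensity segmentation whose initial cluster centroids are chosen by particle swarm optimization (PSO), one of the swarm techniques compared above. The paper does not publish its implementation, so every function name, parameter value, and the within-cluster-distance fitness used here are illustrative assumptions rather than the authors' method; a GO-based variant would follow the same structure with a different position-update rule.

```python
# Minimal sketch (NOT the paper's implementation): hybrid segmentation
# where PSO searches for good initial K-means centroids on grayscale
# pixel intensities. All names and hyperparameters are assumptions.
import numpy as np

def kmeans_fitness(centroids, pixels):
    """Within-cluster sum of squared distances (lower is better)."""
    d = np.abs(pixels[:, None] - centroids[None, :])  # (n_pixels, k)
    return np.sum(np.min(d, axis=1) ** 2)

def pso_init_centroids(pixels, k=3, n_particles=20, iters=50,
                       w=0.7, c1=1.5, c2=1.5, seed=0):
    """Run PSO over candidate centroid sets; return the best one found."""
    rng = np.random.default_rng(seed)
    lo, hi = pixels.min(), pixels.max()
    pos = rng.uniform(lo, hi, size=(n_particles, k))   # particle positions
    vel = np.zeros_like(pos)
    pbest = pos.copy()                                 # personal bests
    pbest_fit = np.array([kmeans_fitness(p, pixels) for p in pos])
    gbest = pbest[np.argmin(pbest_fit)].copy()         # global best
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        # Standard PSO velocity update: inertia + cognitive + social terms.
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        fit = np.array([kmeans_fitness(p, pixels) for p in pos])
        improved = fit < pbest_fit
        pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
        gbest = pbest[np.argmin(pbest_fit)].copy()
    return np.sort(gbest)

def segment(gray_image, k=3):
    """Label each pixel with its nearest PSO-initialised centroid."""
    pixels = gray_image.astype(float).ravel()
    centroids = pso_init_centroids(pixels, k=k)
    labels = np.argmin(np.abs(pixels[:, None] - centroids[None, :]), axis=1)
    return labels.reshape(gray_image.shape), centroids

if __name__ == "__main__":
    # Synthetic 3-region "leaf" image: background, leaf, lesion intensities.
    rng = np.random.default_rng(1)
    img = np.concatenate([rng.normal(m, 5, 400) for m in (40, 120, 200)])
    labels, cents = segment(img.reshape(40, 30).clip(0, 255))
    print("centroids:", cents)
```

The design point the sketch illustrates is that the swarm search replaces random centroid initialization, the step to which plain K-means is most sensitive, which is one plausible reason the hybrid KPSO and KGO scenarios outperform the standalone T- and K-models.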