Journal of Researches in Mechanics of Agricultural Machinery

Development of a deep learning model based on the VGG16 network for classifying saffron plants and weeds using color images

Document Type: Original Article

Authors
1 Department of Water and Soil, Faculty of Agricultural Engineering, Shahrood University of Technology, Iran
2 Department of Agronomy and Plant Breeding, College of Agriculture, Shahrood University of Technology, Iran
Abstract
The use of site-specific management technologies that enable effective weed control while causing minimal damage to the main crop is of great importance. In this study, color images and a deep learning strategy based on convolutional neural networks were employed to distinguish saffron from two of its common weeds, Hoary Cress (known locally as Azmak) and Flixweed (Khakshir). A total of 291 images of the three classes (saffron, Hoary Cress, and Flixweed) were captured under completely natural field conditions. The images were resized to 150 × 150 pixels, and data augmentation was then applied to artificially expand the dataset. The proposed deep learning architecture consisted of an improved VGG16 model, initialized with pre-trained weights using the transfer learning approach. Based on the obtained results, the accuracy of the proposed model on the test data was 91%, with a loss of 0.3759. The F1-scores for the three target classes (saffron, Hoary Cress, and Flixweed) were 85%, 100%, and 86%, respectively. The results also showed that the model achieved a precision of 100% in distinguishing saffron and Hoary Cress from the other classes. This study can serve as a foundation for the development of robotic systems aimed at site-specific weed management in saffron fields.
EXTENDED ABSTRACT
Introduction 
Precision agriculture technologies play a crucial role in optimizing crop management by enabling site-specific interventions, particularly in weed control. Traditional weed management methods often result in excessive herbicide use, environmental harm, and injury to the main crop. Automated weed detection using deep learning offers a promising solution by accurately distinguishing between crops and weeds, thereby facilitating targeted removal. Saffron, a high-value crop, faces competition from invasive weeds such as Flixweed and Hoary Cress, which reduce its yield and quality. This study leverages computer vision and deep learning to classify saffron and these two common weeds under natural field conditions. Convolutional neural networks (CNNs) were employed due to their proven effectiveness in image classification tasks. Transfer learning was applied to enhance model performance by utilizing pre-trained weights from ImageNet. The research aims to develop a robust classification model that can support precision agriculture tools, such as robotic weeders, by accurately identifying weeds while preserving the main crop. The success of this approach could significantly reduce herbicide use, lower production costs, and improve saffron yield through automated, site-specific weed management.
Material and Methods 
A dataset of 291 field images of saffron, Flixweed, and Hoary Cress was collected under natural lighting and environmental conditions. Each image was resized to 150 × 150 pixels, and data augmentation techniques (e.g., rotation, flipping, scaling) were applied to artificially expand the dataset and improve model generalization. The study utilized the VGG16 CNN architecture, fine-tuned via transfer learning with ImageNet weights. The model was trained to classify the three plant categories, and its performance was evaluated on held-out test data. Key metrics included accuracy, F1-score, and precision.
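The study's training code is not reproduced in this extended abstract; the sketch below is a minimal Keras illustration of the pipeline described above, assuming images organized into class folders. The augmentation ranges, classifier head sizes, and optimizer settings are illustrative assumptions, not the study's exact configuration.

```python
# Minimal sketch (not the authors' code): VGG16 transfer learning with
# on-the-fly data augmentation for a three-class plant classifier.
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import VGG16

IMG_SIZE = (150, 150)   # input size used in the study
NUM_CLASSES = 3         # saffron, Hoary Cress, Flixweed

# Augmentation (rotation, flipping, zoom) as described above; the exact
# ranges here are assumptions. These layers are active only in training.
augmentation = tf.keras.Sequential([
    layers.RandomFlip("horizontal_and_vertical"),
    layers.RandomRotation(0.2),
    layers.RandomZoom(0.2),
])

# VGG16 convolutional base pre-trained on ImageNet, frozen so that only
# the new classification head is trained (transfer learning).
base = VGG16(weights="imagenet", include_top=False,
             input_shape=IMG_SIZE + (3,))
base.trainable = False

# Hypothetical classifier head; the study's "improved" head may differ.
model = models.Sequential([
    layers.Input(shape=IMG_SIZE + (3,)),
    augmentation,
    layers.Rescaling(1.0 / 255),
    base,
    layers.Flatten(),
    layers.Dense(256, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
```

Freezing the ImageNet-pretrained convolutional base and training only the new head is the standard transfer learning recipe; a subsequent fine-tuning stage could unfreeze the top convolutional block at a reduced learning rate.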
Results and Discussion 
The proposed model achieved an overall accuracy of 91% on unseen test data, with a loss of 0.3759. The F1-scores for saffron, Hoary Cress, and Flixweed were 85%, 100%, and 86%, respectively. Notably, the model demonstrated perfect precision (100%) for both the saffron and Hoary Cress classes, indicating no false positives for either. Flixweed recognition was slightly less precise but still highly effective (86% F1-score). The high classification accuracy suggests that deep learning, combined with transfer learning, is a viable approach for weed detection in precision agriculture. The model's ability to differentiate saffron from invasive weeds under real-world conditions supports its potential integration into automated weeding systems. These results suggest that robotic weeders equipped with such AI models can selectively target weeds while minimizing crop damage, thereby reducing reliance on broad-spectrum herbicides.
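As a minimal sketch of how such per-class metrics are obtained, the snippet below computes precision, recall, and F1-score from true and predicted labels with scikit-learn; the label arrays are hypothetical placeholders, not the study's test set.

```python
# Hypothetical labels for illustration only (not the study's data).
from sklearn.metrics import classification_report

class_names = ["saffron", "Hoary Cress", "Flixweed"]
y_true = [0, 0, 0, 1, 1, 1, 2, 2, 2]   # ground-truth class indices
y_pred = [0, 0, 2, 1, 1, 1, 2, 2, 0]   # model predictions

# Prints per-class precision, recall, F1-score, and overall accuracy.
print(classification_report(y_true, y_pred, target_names=class_names))
```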
Conclusions 
This study demonstrates the effectiveness of deep learning in distinguishing saffron from Flixweed and Hoary Cress under natural field conditions. The improved VGG16 model achieved high accuracy (91%) and perfect precision for the saffron and Hoary Cress classes, validating its potential for real-world agricultural applications. The findings provide a foundation for developing AI-driven weed removal robots, which could enhance precision farming by enabling targeted, sustainable weed management. Future research should focus on optimizing models for real-time processing and integration with robotic systems. Expanding the dataset to include more weed species and varying environmental conditions could further improve robustness. Overall, this work contributes to advancing precision agriculture technologies, offering a scalable solution for automated weed control in saffron fields and similar high-value crops.
Author Contributions
S.I. Saedi: Software, Methodology, Modeling, Writing of the initial draft, Final review, and editing.
H. Makarian: Review and editing of the text, Research, Data collection, Validation.
Data Availability Statement
Not applicable.
Acknowledgements 
Not applicable.
Ethical Considerations
The authors confirm adherence to ethical standards, including the avoidance of data fabrication, falsification, plagiarism, and misconduct.
Conflict of Interest
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
Funding Statement
The author(s) received no specific funding for this research.