Implementing a Compression Technique on the Progressive Contextual Excitation Network for Smart Farming Applications
Language: English; Country: Switzerland; Medium: electronic
Document type: journal articles
Grant support
CTU-TAIWAN TECH-2022-01
Czech Technical University-National Taiwan University of Science and Technology Joint Research Program
PubMed: 36560087
PubMed Central: PMC9781053
DOI: 10.3390/s22249717
PII: s22249717
- Keywords
- deep learning, model compression, progressive contextual excitation, pruning filters
- MeSH
- Automation MeSH
- Farms MeSH
- Physical Phenomena MeSH
- Data Compression * MeSH
- Agriculture * MeSH
- Publication type
- journal articles MeSH
The utilization of computer vision in smart farming is becoming a trend in constructing agricultural automation schemes. Deep learning (DL) is well known as an accurate approach to computer vision tasks such as object detection and image classification. The superiority of a deep learning model for smart farming, the Progressive Contextual Excitation Network (PCENet), was demonstrated in our recent study on classifying cocoa bean images. However, an assessment of its computational time shows that the original model runs at only 0.101 s per image, or 9.9 FPS, on the Jetson Nano as the edge platform. Therefore, this research demonstrates a compression technique that accelerates the PCENet model by pruning filters. In our experiments, the compressed model reaches 16.7 FPS on the Jetson Nano while its accuracy is maintained at 86.1%, compared with 86.8% for the original model. In addition, our approach is more accurate than ResNet18, the state-of-the-art baseline, which reaches only 82.7%. An assessment on the corn leaf disease dataset indicates that the compressed model achieves an accuracy of 97.5%, while the original PCENet achieves 97.7%.
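The compression technique referred to above prunes convolutional filters, in the spirit of Li et al. ("Pruning Filters for Efficient ConvNets", cited below), where filters with the smallest L1 norms are removed. The following is a minimal sketch of that ranking step only; the stand-in Conv2d layer, function names, and keep ratio are hypothetical, since this record does not specify PCENet's layer selection, pruning ratios, or fine-tuning schedule.

```python
# Sketch of L1-norm filter pruning for a single convolutional layer
# (assumption: PyTorch; the layer and keep_ratio below are placeholders,
# not PCENet's actual configuration).
import torch
import torch.nn as nn


def prune_conv_filters(conv: nn.Conv2d, keep_ratio: float = 0.7):
    """Return a smaller Conv2d keeping the filters with the largest L1 norms."""
    # Rank each output filter by the L1 norm of its weights.
    l1 = conv.weight.detach().abs().sum(dim=(1, 2, 3))
    n_keep = max(1, int(conv.out_channels * keep_ratio))
    keep_idx = torch.argsort(l1, descending=True)[:n_keep]

    pruned = nn.Conv2d(
        conv.in_channels, n_keep, conv.kernel_size,
        stride=conv.stride, padding=conv.padding, bias=conv.bias is not None,
    )
    # Copy the surviving filters (and biases) into the new, narrower layer.
    pruned.weight.data = conv.weight.data[keep_idx].clone()
    if conv.bias is not None:
        pruned.bias.data = conv.bias.data[keep_idx].clone()
    return pruned, keep_idx


if __name__ == "__main__":
    conv = nn.Conv2d(64, 128, kernel_size=3, padding=1)
    pruned, kept = prune_conv_filters(conv, keep_ratio=0.5)
    x = torch.randn(1, 64, 56, 56)
    print(pruned(x).shape)  # 64 of the original 128 filters remain
```

After such a step, the input channels of the following convolution (and any batch-normalization parameters) must be pruned with the same indices, and the network is typically fine-tuned to recover accuracy, which is how the reported 16.7 FPS can be reached with only a small accuracy drop.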
References
Bai C.H., Prakosa S.W., Hsieh H.Y., Leu J.S., Fang W.H. Progressive Contextual Excitation for Smart Farming Application. In: Tsapatsoulis N., Panayides A., Theocharides T., Lanitis A., Pattichis C., Vento M., editors. Computer Analysis of Images and Patterns. CAIP 2021. Volume 13052. Springer; Cham, Switzerland: 2021. Lecture Notes in Computer Science. DOI
Ren S., He K., Girshick R., Sun J. Faster R-CNN: Towards real-time object detection with region proposal networks. IEEE Trans. Pattern Anal. Mach. Intell. 2017;39:1137–1149. doi: 10.1109/TPAMI.2016.2577031. PubMed DOI
Sari C.T., Gunduz-Demir C. Unsupervised Feature Extraction via Deep Learning for Histopathological Classification of Colon Tissue Images. IEEE Trans. Med Imaging. 2019;38:1139–1149. doi: 10.1109/TMI.2018.2879369. PubMed DOI
Mulyanto M., Faisal M., Prakosa S.W., Leu J.-S. Effectiveness of Focal Loss for Minority Classification in Network Intrusion Detection Systems. Symmetry. 2020;13:4. doi: 10.3390/sym13010004. DOI
Sun W., Wu T. Learning Layout and Style Reconfigurable GANs for Controllable Image Synthesis. IEEE Trans. Pattern Anal. Mach. Intell. 2022;44:5070–5087. doi: 10.1109/TPAMI.2021.3078577. PubMed DOI
Adhitya Y., Prakosa S.W., Köppen M., Leu J.-S. Feature Extraction for Cocoa Bean Digital Image Classification Prediction for Smart Farming Application. Agronomy. 2020;10:1642. doi: 10.3390/agronomy10111642. DOI
Li H., Kadav A., Durdanovic I., Samet H., Graf H.P. Pruning Filters for Efficient ConvNets. Comput. Res. Repos. 2016. Available online: https://arxiv.org/abs/1608.08710 (accessed on 4 November 2022).
Dong X., Chen S., Pan S.J. Learning to prune deep neural networks via layer-wise optimal brain surgeon. arXiv. 2017. arXiv:1705.07565v2
Howard A., Sandler M., Chu G., Chen L.-C., Chen B., Tan M., Wang W., Zhu Y., Pang R., Vasudevan V., et al. Searching for MobileNetV3; Proceedings of the 2019 IEEE/CVF International Conference on Computer Vision (ICCV); Seoul, Republic of Korea. 27 October–2 November 2019; pp. 1314–1324. DOI
Iandola F.N., Han S., Moskewicz M.W., Ashraf K., Dally W.J., Keutzer K. SqueezeNet: AlexNet-level accuracy with 50x fewer parameters and <0.5MB model size. Comput. Vis. Pattern Recognit. 2016;9349:1–13.
Han S., Mao H., Dally W.J. Deep compression: Compressing deep neural networks with pruning, trained quantization and Huffman coding. Fiber. 2015;56:3–7.
Hinton G.E., Vinyals O., Dean J. Distilling the knowledge in a neural network. arXiv. 2015. arXiv:1503.02531
Prakosa S.W., Leu J.-S., Chen Z.-H. Improving the Accuracy of Pruned Network Using Knowledge Distillation. Pattern Anal. Appl. 2021;24:819–830. Springer; Berlin/Heidelberg, Germany.
Rezk N.G., Hemdan E.E.-D., Attia A.-F., El-Sayed A., El-Rashidy M.A. An efficient IoT based smart farming system using machine learning algorithms. Multimedia Tools Appl. 2020;80:773–797. doi: 10.1007/s11042-020-09740-6. DOI
Udendhran R., Balamurugan M. Towards secure deep learning architecture for smart farming-based applications. Complex Intell. Syst. 2020;7:659–666. doi: 10.1007/s40747-020-00225-5. DOI
Menshchikov A., Shadrin D., Prutyanov V., Lopatkin D., Sosnin S., Tsykunov E., Iakovlev E., Somov A. Real-Time Detection of Hogweed: UAV Platform Empowered by Deep Learning. IEEE Trans. Comput. 2021;70:1175–1188. doi: 10.1109/TC.2021.3059819. DOI
Joshi P., Das D., Udutalapally V., Pradhan M.K., Misra S. RiceBioS: Identification of Biotic Stress in Rice Crops Using Edge-as-a-Service. IEEE Sens. J. 2022;22:4616–4624. doi: 10.1109/JSEN.2022.3143950. DOI
Hughes D.P., Salathe M. An open access repository of images on plant health to enable the development of mobile disease diagnostics. arXiv. 2015. arXiv:1511.08060
Yu H., Liu J., Chen C., Heidari A.A., Zhang Q., Chen H., Mafarja M., Turabieh H. Corn Leaf Diseases Diagnosis Based on K-Means Clustering and Deep Learning. IEEE Access. 2021;9:143824–143835. doi: 10.1109/ACCESS.2021.3120379. DOI
Zeng W., Li H., Hu G., Liang D. Lightweight dense-scale network (LDSNet) for corn leaf disease identification. Comput. Electron. Agric. 2022;197:106943. doi: 10.1016/j.compag.2022.106943. DOI