Training artificial neural networks using self-organizing migrating algorithm for skin segmentation
Language: English; Country: England, Great Britain; Medium: electronic
Document type: journal article
Grant support
SGS No. SP2024/008
VSB-Technical University of Ostrava
CZ.10.03.01/00/22-003/0000048
European Union under the REFRESH-Research Excellence For Region Sustainability and High-tech Industries via the Operational Programme Just Transition
PubMed: 39349534
PubMed Central: PMC11443081
DOI: 10.1038/s41598-024-72884-0
PII: 10.1038/s41598-024-72884-0
Knihovny.cz E-resources
- Keywords
- Artificial neural networks, Computer vision, Optimization algorithm, SOMA, Skin segmentation, Swarm intelligence
- MeSH
- Algorithms * MeSH
- Skin * diagnostic imaging MeSH
- Humans MeSH
- Neural Networks, Computer * MeSH
- Image Processing, Computer-Assisted methods MeSH
- Check Tag
- Humans MeSH
- Publication type
- Journal Article MeSH
This study presents an application of the self-organizing migrating algorithm (SOMA) to training artificial neural networks for skin segmentation. We compare the performance of SOMA with popular gradient-based optimization methods such as ADAM and SGDM, as well as with another evolutionary algorithm, differential evolution (DE). Experiments are conducted on the Skin Segmentation dataset, which consists of 245,057 samples with skin and non-skin labels. The results show that the neural network trained by SOMA achieves the highest accuracy (93.18%), outperforming ADAM (84.87%), SGDM (84.79%), and DE (91.32%). Visual evaluation also shows that the SOMA-trained network segments skin accurately and reliably in most cases. These findings highlight the potential of incorporating evolutionary optimization algorithms such as SOMA into the training of artificial neural networks, significantly improving performance in image segmentation tasks.
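The record itself contains no code, so the following is only a minimal sketch of the kind of approach the abstract describes: a SOMA AllToOne migration loop used as a derivative-free trainer for a small feed-forward network doing binary skin/non-skin classification. The network shape (3-8-1), the control parameters (population size, PathLength, Step, PRT), the function names, and the toy data are all illustrative assumptions, not the authors' implementation or settings.

```python
# Hypothetical sketch (not the paper's code): a SOMA AllToOne migration loop
# training a tiny 3-8-1 feed-forward network for skin / non-skin classification.
import numpy as np

rng = np.random.default_rng(0)
HIDDEN = 8                                   # assumed hidden-layer size

def mlp_forward(w, X):
    """Unpack the flat weight vector w into a 3-HIDDEN-1 MLP and return sigmoid outputs."""
    n_in = X.shape[1]
    i = n_in * HIDDEN
    W1 = w[:i].reshape(n_in, HIDDEN)
    b1 = w[i:i + HIDDEN]
    W2 = w[i + HIDDEN:i + 2 * HIDDEN].reshape(HIDDEN, 1)
    b2 = w[-1]
    h = np.tanh(X @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2))).ravel()

def cost(w, X, y):
    """Fitness to minimize: classification error rate on the training data."""
    return np.mean((mlp_forward(w, X) > 0.5).astype(int) != y)

def soma_all_to_one(X, y, dim, pop_size=20, migrations=30,
                    path_length=3.0, step=0.11, prt=0.1):
    """Assumed SOMA AllToOne: each individual migrates toward the current leader."""
    pop = rng.uniform(-1.0, 1.0, size=(pop_size, dim))
    fit = np.array([cost(ind, X, y) for ind in pop])
    for _ in range(migrations):
        leader_idx = int(np.argmin(fit))
        leader = pop[leader_idx].copy()
        for i in range(pop_size):
            if i == leader_idx:
                continue                      # the leader does not migrate
            best_pos, best_fit = pop[i].copy(), fit[i]
            t = step
            while t <= path_length:
                # PRT vector randomly freezes some dimensions during the jump
                prt_vector = (rng.random(dim) < prt).astype(float)
                candidate = pop[i] + (leader - pop[i]) * t * prt_vector
                c_fit = cost(candidate, X, y)
                if c_fit < best_fit:
                    best_pos, best_fit = candidate, c_fit
                t += step
            pop[i], fit[i] = best_pos, best_fit   # keep the best point on the path
    best = int(np.argmin(fit))
    return pop[best], 1.0 - fit[best]

# Toy stand-in for the UCI Skin Segmentation data: B, G, R values in [0, 1]
# with a placeholder labelling rule (NOT real skin / non-skin labels).
X = rng.integers(0, 256, size=(2000, 3)).astype(float) / 255.0
y = (X[:, 2] > 0.55).astype(int)

dim = 3 * HIDDEN + HIDDEN + HIDDEN + 1       # number of weights in the 3-8-1 network
weights, train_acc = soma_all_to_one(X, y, dim)
print(f"Training accuracy of the SOMA-trained MLP: {train_acc:.3f}")
```

On the real UCI Skin Segmentation data one would instead load the B, G, R columns and the 1/2 class labels from the repository file and map the labels to 0/1; the migration loop itself is unchanged.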
References (from PubMed)
Chen, H., Geng, L., Zhao, H., Zhao, C. & Liu, A. Image recognition algorithm based on artificial intelligence. Neural Comput. Appl. 10.1007/s00521-021-06058-8 (2022). PubMed
Smith, T. B., Vacca, R., Mantegazza, L. & Capua, I. Natural language processing and network analysis provide novel insights on policy and scientific discourse around sustainable development goals. Sci. Rep. 11, 22427. 10.1038/s41598-021-01801-6 (2021). PubMed PMC
Bilal, A. et al. Bc-qnet: A quantum-infused elm model for breast cancer diagnosis. Comput. Biol. Med. 175, 108483. 10.1016/j.compbiomed.2024.108483 (2024). PubMed
Bilal, A. et al. Breast cancer diagnosis using support vector machine optimized by improved quantum inspired grey wolf optimization. Sci. Rep. 14, 10714. 10.1038/s41598-024-61322-w (2024). PubMed PMC
Khan, A. Q. et al. A novel fusion of genetic grey wolf optimization and kernel extreme learning machines for precise diabetic eye disease classification. PLoS ONE 19, 1–45. 10.1371/journal.pone.0303094 (2024). PubMed PMC
Bilal, A., Liu, X., Shafiq, M., Ahmed, Z. & Long, H. Nimeq-sacnet: A novel self-attention precision medicine model for vision-threatening diabetic retinopathy using image data. Comput. Biol. Med. 171, 108099. 10.1016/j.compbiomed.2024.108099 (2024). PubMed
Bilal, A. et al. Advanced ckd detection through optimized metaheuristic modeling in healthcare informatics. Sci. Rep. 14, 12601. 10.1038/s41598-024-63292-5 (2024). PubMed PMC
Del Ser, J. et al. Bio-inspired computation: Where we stand and what’s next. Swarm Evol. Comput. 48, 220–250. 10.1016/j.swevo.2019.04.008 (2019).
Bilal, A., Sun, G., Mazhar, S. & Imran, A. Improved grey wolf optimization-based feature selection and classification using cnn for diabetic retinopathy detection. In Evolutionary Computing and Mobile Sustainable Networks (eds Suma, V. et al.) 1–14 (Springer Singapore, Singapore, 2022). 10.1007/978-981-16-9605-3_1.
Bilal, A., Sun, G., Li, Y., Mazhar, S. & Latif, J. Lung nodules detection using grey wolf optimization by weighted filters and classification using cnn. J. Chin. Inst. Eng. 45, 175–186. 10.1080/02533839.2021.2012525 (2022).
Zhang, J., Sun, G., Sun, Y., Dou, H. & Bilal, A. Hyper-parameter optimization by using the genetic algorithm for upper limb activities recognition based on neural networks. IEEE Sens. J. 21, 1877–1884. 10.1109/JSEN.2020.3018629 (2021).
Zelinka, I. SOMA - self-organizing migrating algorithm. Self-Organ. Migr. Algorithm Methodol. Implement. 10.1007/978-3-319-28161-2_1 (2016).
Li, J., Dong, X., Ruan, S. & Shi, L. A parallel integrated learning technique of improved particle swarm optimization and bp neural network and its application. Sci. Rep. 12, 19325. 10.1038/s41598-022-21463-2 (2022). PubMed PMC
Mahapatra, A. K., Panda, N. & Pattanayak, B. K. Hybrid pso (sgpso) with the incorporation of discretization operator for training rbf neural network and optimal feature selection. Arab. J. Sci. Eng. 48, 9991–10019. 10.1007/s13369-022-07408-x (2023).
Waqas, U., Ahmed, M. F., Rashid, H. M. A. & Al-Atroush, M. E. Optimization of neural-network model using a meta-heuristic algorithm for the estimation of dynamic Poisson’s ratio of selected rock types. Sci. Rep. 13, 11089. 10.1038/s41598-023-38163-0 (2023). PubMed PMC
Agahian, S. & Akan, T. Battle royale optimizer for training multi-layer perceptron. Evol. Syst. 13, 563–575. 10.1007/s12530-021-09401-5 (2022).
Chauhan, D., Yadav, A. & Neri, F. A multi-agent optimization algorithm and its application to training multilayer perceptron models. Evol. Syst. 10.1007/s12530-023-09518-9 (2023).
Chatterjee, R., Mukherjee, R., Roy, P. K. & Pradhan, D. K. Chaotic oppositional-based whale optimization to train a feed forward neural network. Soft Comput. 26, 12421–12443. 10.1007/s00500-022-07141-5 (2022).
Ansari, A., Ahmad, I. S., Bakar, A. A. & Yaakub, M. R. A hybrid metaheuristic method in training artificial neural network for bankruptcy prediction. IEEE Access 8, 176640–176650. 10.1109/ACCESS.2020.3026529 (2020).
Mahdi, Q. A. M. N. et al. Training learning weights of elman neural network using salp swarm optimization algorithm. Procedia Comput. Sci. 225, 1974–1986. 10.1016/j.procs.2023.10.188 (2023). 27th International Conference on Knowledge Based and Intelligent Information and Engineering Systems (KES 2023).
Khan, A. Q., Sun, G., Li, Y., Bilal, A. & Manan, M. A. Optimizing fully convolutional encoder-decoder network for segmentation of diabetic eye disease. Comput. Mater. Continua 77, 2481–2504. 10.32604/cmc.2023.043239 (2023).
Zelinka, I. & Jouni, L. SOMA - self-organizing migrating algorithm. In Mendel 2000, 6th International Conference on Soft Computing, Brno, Czech Republic (2000).
Zelinka, I. SOMA–Self-Organizing Migrating Algorithm 167–217 (Springer Berlin Heidelberg, Berlin, Heidelberg, 2004).
Bhatt, R. & Dhall, A. Skin segmentation. UCI Mach. Learn. Repos. 10.24432/C5T30C (2012).
Storn, R. & Price, K. Differential evolution - a simple and efficient heuristic for global optimization over continuous spaces. J. Global Optim. 11, 341–359. 10.1023/A:1008202821328 (1997).
Qian, N. On the momentum term in gradient descent learning algorithms. Neural Netw. 12, 145–151. 10.1016/S0893-6080(98)00116-6 (1999). PubMed
Kingma, D. P. & Ba, J. Adam: A method for stochastic optimization. arXiv preprint arXiv:1412.6980. 10.48550/arXiv.1412.6980 (2014).
Zhang, Z., Song, Y. & Qi, H. Age progression/regression by conditional adversarial autoencoder. In IEEE Conference on Computer Vision and Pattern Recognition (CVPR). 10.48550/arXiv.1702.08423 (IEEE, 2017).
Derrac, J., García, S., Molina, D. & Herrera, F. A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms. Swarm Evol. Comput. 1, 3–18. 10.1016/j.swevo.2011.02.002 (2011).
Carrasco, J., García, S., Rueda, M., Das, S. & Herrera, F. Recent trends in the use of statistical tests for comparing swarm and evolutionary computing algorithms: Practical guidelines and a critical review. Swarm Evol. Comput. 54, 100665. 10.1016/j.swevo.2020.100665 (2020).