Quasi-reflection learning arithmetic optimization algorithm firefly search for feature selection
Status PubMed-not-MEDLINE
Language English
Country England
Media electronic-ecollection
Document type Journal Article
PubMed
37101631
PubMed Central
PMC10123183
DOI
10.1016/j.heliyon.2023.e15378
PII: S2405-8440(23)02585-9
- Keywords
- Arithmetic optimization algorithm, Feature selection, Firefly algorithm, Metaheuristics, Quasi-reflection-based learning
With the rapid evolution of technology, the quantity of data stored in datasets is expanding quickly, making the extraction of crucial and relevant information from them an increasingly difficult task. Feature selection is a critical preprocessing step in machine learning that removes redundant data from a set. This research presents a novel quasi-reflection learning arithmetic optimization algorithm - firefly search, an enhanced version of the original arithmetic optimization algorithm. A quasi-reflection learning mechanism is employed to enhance population diversity, while the firefly algorithm metaheuristic improves the exploitation abilities of the original arithmetic optimization algorithm. The aim of this wrapper-based method is to solve a given classification problem by selecting an optimal feature subset. The proposed algorithm is tested and compared with various well-known methods, first on ten unconstrained benchmark functions and then on twenty-one standard datasets gathered from the University of California, Irvine Repository and Arizona State University. Additionally, the proposed approach is applied to the Corona disease dataset. The experimental results confirm the improvements of the presented method and their statistical significance.
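The two mechanisms named in the abstract can be sketched briefly. Below is a minimal, illustrative Python sketch — not the paper's implementation; the function names and parameter defaults are assumptions — of quasi-reflection-based learning, which samples a candidate between a solution and the midpoint of the search interval, and of the standard firefly attraction step used to strengthen exploitation.

```python
import math
import random

def quasi_reflect(x, lb, ub):
    """Quasi-reflected point: a uniform sample between the current
    value x and the midpoint of the search interval [lb, ub]."""
    mid = (lb + ub) / 2.0
    lo, hi = (mid, x) if x > mid else (x, mid)
    return random.uniform(lo, hi)

def firefly_move(xi, xj, lb, ub, beta0=1.0, gamma=1.0, alpha=0.1):
    """Standard firefly update: move solution xi toward the brighter
    solution xj with distance-dependent attractiveness, then clip to
    the bounds. The parameter values here are illustrative defaults."""
    moved = []
    for a, b in zip(xi, xj):
        r2 = (a - b) ** 2                      # squared per-dimension distance
        beta = beta0 * math.exp(-gamma * r2)   # attractiveness decays with distance
        step = a + beta * (b - a) + alpha * (random.random() - 0.5)
        moved.append(min(max(step, lb), ub))   # keep within [lb, ub]
    return moved
```

In a wrapper setting such as the one described, each candidate solution would be mapped to a feature subset (e.g. via a transfer function) and scored by a classifier's accuracy, and a quasi-reflected copy would be kept only if it outperforms the original.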
Department of Mathematics Faculty of Science Mansoura University Mansoura 35516 Egypt
Modern College of Business and Science AL Khuwair 133 Muscat Oman