Mohammad-Reza Feizi-Derakhshi1 and Estabraq Abdulredaa Kadhim2

1ComInSyS Lab, Department of Computer Engineering, University of Tabriz, Tabriz, Iran
2Computer Techniques Eng. Dept., Al-Esraa University College, Baghdad, Iraq


Received: February 16, 2022
Accepted: June 17, 2022
Publication Date: September 21, 2022

 Copyright The Author(s). This is an open access article distributed under the terms of the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are cited.



Feature selection is the process of reducing the number of input variables in order to improve a classification model. It can be framed as an optimization problem: finding the subset of input features that yields the best model performance. Because each feature is either selected or not, feature selection is a discrete binary problem. To obtain such binary vectors with Cuckoo Search (CS), the egg values (dimensions) must be bounded by an upper and a lower limit; the nests (solutions) are then generated and updated so that the eggs take only values within these boundaries. Binary Cuckoo Search (BCS) is therefore an effective and promising metaheuristic for this purpose. This paper proposes an improved BCS that combines a Chi-square filter method with a chaotic map for feature selection problems. The Chi-square filter is employed to generate the initial solutions and thereby contributes to enhancing the quality of the final solution, while a sinusoidal chaotic map determines variable values of the step-size (α) parameter during local search. The proposed Chi-BCS is validated on several real-world datasets. Experimental results show that Chi-BCS improves dimensionality reduction (76.69%) and classification accuracy (58.84%) compared with other available methods such as EBCS, ACO, and FSFOA.
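The two mechanisms named in the abstract can be illustrated roughly as follows: a sinusoidal chaotic map that generates a varying step-size (α) sequence, and the sigmoid-based binarization commonly used in BCS to map continuous egg values onto {0, 1}. This is a minimal sketch, not the paper's implementation; the update rule x_{k+1} = a·x_k²·sin(π·x_k), the control parameter a = 2.3, and the seed 0.7 are conventional choices for the sinusoidal map, assumed here rather than taken from the source.

```python
import math
import random

def sinusoidal_map(x, a=2.3):
    """Sinusoidal chaotic map: x_{k+1} = a * x_k^2 * sin(pi * x_k).
    With a = 2.3 and a seed in (0, 1), iterates remain chaotic in (0, 1),
    which makes the sequence usable as a varying step-size alpha."""
    return a * x * x * math.sin(math.pi * x)

def binarize(position):
    """Map a continuous egg value to {0, 1} via the sigmoid transfer
    function, a standard binarization step in Binary Cuckoo Search:
    the bit is 1 with probability S(position)."""
    s = 1.0 / (1.0 + math.exp(-position))
    return 1 if random.random() < s else 0

# Chaotic alpha sequence replacing a fixed step size in the local search.
x = 0.7  # initial seed in (0, 1); assumed value for illustration
alphas = []
for _ in range(5):
    x = sinusoidal_map(x)
    alphas.append(x)
```

In this sketch each iteration of the search would draw its α from the chaotic sequence instead of keeping it constant, and each updated nest dimension would pass through `binarize` to yield a feature-selection bit vector.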

Keywords: Feature Selection, Binary Cuckoo Search, Dimension Reduction, Chaotic Map



