Journal of Applied Science and Engineering

Published by Tamkang University Press




Jingzong Yang

School of Big Data, Baoshan University, Baoshan, Yunnan 678000, P.R. China


Received: September 29, 2023
Accepted: March 4, 2024
Publication Date: May 22, 2024

Copyright © The Author(s). This is an open access article distributed under the terms of the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.


As an emerging detection technology, acoustic active detection has attracted considerable attention in recent years owing to its low cost, non-destructiveness, and ease of signal acquisition. To address the difficulty of identifying pipeline-blockage faults from acoustic measurements, this paper proposes a fault-recognition method that combines Minimum Redundancy Maximum Relevance (MRMR) feature selection with a Multi-Kernel Extreme Learning Machine (MKELM). First, statistical analysis and the multi-resolution wavelet transform are applied to the acoustic pulse-response signals of the blockage to construct a mixed-domain feature set. Next, the MRMR algorithm performs a secondary feature-selection step to eliminate redundant features. Finally, the optimized features are fed into the MKELM classification model for identification. Experimental results demonstrate that mixed-domain features characterize blockage states more effectively than single-domain features, and that the MRMR algorithm significantly reduces feature redundancy. In comparisons across classification models, MKELM achieves higher recognition accuracy than single-kernel models and traditional methods.

Keywords: Pipeline; MRMR; MKELM; Pattern recognition
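The MRMR-plus-kernel-ELM pipeline described in the abstract can be sketched in a few dozen lines. The sketch below is illustrative only: it uses a simple histogram estimator for mutual information, a fixed 50/50 combination of RBF and polynomial kernels, and arbitrary parameters (`C`, `gamma`, bin count) in place of the paper's wavelet-derived features and tuned settings, and it runs on synthetic two-class data rather than real blockage signals.

```python
import numpy as np

def mutual_info(a, b, bins=8):
    """Histogram estimate of mutual information between two 1-D arrays."""
    ca = np.digitize(a, np.histogram_bin_edges(a, bins)[1:-1])
    cb = np.digitize(b, np.histogram_bin_edges(b, bins)[1:-1])
    joint, _, _ = np.histogram2d(ca, cb, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def mrmr_select(X, y, k):
    """Greedy MRMR: maximize relevance to y minus mean redundancy to chosen features."""
    relevance = [mutual_info(X[:, j], y) for j in range(X.shape[1])]
    chosen = [int(np.argmax(relevance))]
    while len(chosen) < k:
        best, best_score = -1, -np.inf
        for j in range(X.shape[1]):
            if j in chosen:
                continue
            redundancy = np.mean([mutual_info(X[:, j], X[:, s]) for s in chosen])
            if relevance[j] - redundancy > best_score:
                best, best_score = j, relevance[j] - redundancy
        chosen.append(best)
    return chosen

def rbf_kernel(A, B, gamma=0.5):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def poly_kernel(A, B, degree=2):
    return (A @ B.T + 1.0) ** degree

class MKELM:
    """Kernel ELM using a fixed convex combination of RBF and polynomial kernels."""
    def __init__(self, C=10.0, gamma=0.5, w_rbf=0.5):
        self.C, self.gamma, self.w_rbf = C, gamma, w_rbf

    def _kernel(self, A, B):
        return (self.w_rbf * rbf_kernel(A, B, self.gamma)
                + (1.0 - self.w_rbf) * poly_kernel(A, B))

    def fit(self, X, y):
        self.X = X
        T = np.eye(y.max() + 1)[y]                    # one-hot targets
        K = self._kernel(X, X)
        # Closed-form kernel-ELM solution: beta = (K + I/C)^-1 T
        self.beta = np.linalg.solve(K + np.eye(len(X)) / self.C, T)
        return self

    def predict(self, Xt):
        return (self._kernel(Xt, self.X) @ self.beta).argmax(axis=1)

# Toy demonstration: two synthetic "blockage state" classes, 6 raw features
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (40, 6)), rng.normal(3.0, 1.0, (40, 6))])
y = np.repeat([0, 1], 40)
idx = mrmr_select(X, y, k=3)                          # secondary feature selection
model = MKELM().fit(X[:, idx], y)
acc = (model.predict(X[:, idx]) == y).mean()
```

Combining kernels lets one term capture local (RBF) and another global (polynomial) structure; the paper learns/chooses these settings rather than fixing them as above.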



