TY - GEN
T1 - Harmony Search Algorithms for Optimizing Extreme Learning Machines
AU - Al-Shamiri, Abobakr Khalil
AU - Sadollah, Ali
AU - Kim, Joong Hoon
N1 - Funding Information:
This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korean government (MSIT) (No. 2019R1A2B5B03069810).
Publisher Copyright:
© 2021, The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
PY - 2021
Y1 - 2021
N2 - Extreme learning machine (ELM) is a non-iterative algorithm for training a single-hidden-layer feedforward neural network (SLFN). ELM has been shown to have good generalization performance and a faster learning speed than conventional gradient-based learning algorithms. However, due to the random determination of the hidden neuron parameters (i.e., input weights and biases), ELM may require a large number of neurons in the hidden layer. In this paper, the original harmony search (HS) and its variants, namely improved harmony search (IHS), global-best harmony search (GHS), and intelligent tuned harmony search (ITHS), are used to optimize the input weights and hidden biases of ELM. The output weights are analytically determined using the Moore–Penrose (MP) generalized inverse. The performance of the hybrid approaches is tested on several benchmark classification problems. The simulation results show that integrating the HS algorithms with ELM yields compact network architectures with good generalization performance.
AB - Extreme learning machine (ELM) is a non-iterative algorithm for training a single-hidden-layer feedforward neural network (SLFN). ELM has been shown to have good generalization performance and a faster learning speed than conventional gradient-based learning algorithms. However, due to the random determination of the hidden neuron parameters (i.e., input weights and biases), ELM may require a large number of neurons in the hidden layer. In this paper, the original harmony search (HS) and its variants, namely improved harmony search (IHS), global-best harmony search (GHS), and intelligent tuned harmony search (ITHS), are used to optimize the input weights and hidden biases of ELM. The output weights are analytically determined using the Moore–Penrose (MP) generalized inverse. The performance of the hybrid approaches is tested on several benchmark classification problems. The simulation results show that integrating the HS algorithms with ELM yields compact network architectures with good generalization performance.
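N1 - Illustration (not part of the cited work): a minimal sketch of the basic ELM training step summarized in the abstract, i.e., random input weights and hidden biases followed by an analytic output-weight solution via the Moore–Penrose generalized inverse. The sigmoid activation, the function name train_elm, and its parameters are assumptions for illustration; the HS-based tuning of the input weights and biases described in the paper is not shown here.

# Minimal ELM training sketch (generic illustration, not the authors' code).
import numpy as np

def train_elm(X, T, n_hidden, rng=np.random.default_rng(0)):
    # Randomly assign input weights and hidden biases
    # (these are the parameters the paper optimizes with HS variants).
    W = rng.uniform(-1.0, 1.0, size=(X.shape[1], n_hidden))
    b = rng.uniform(-1.0, 1.0, size=n_hidden)
    # Hidden-layer output matrix with a sigmoid activation (assumed).
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    # Output weights solved analytically with the Moore–Penrose generalized inverse.
    beta = np.linalg.pinv(H) @ T
    return W, b, beta

# Usage sketch: X is the (samples x features) input matrix and T the
# one-hot (samples x classes) target matrix of a classification problem.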
KW - Classification
KW - Extreme Learning Machine
KW - Harmony Search
UR - http://www.scopus.com/inward/record.url?scp=85097102551&partnerID=8YFLogxK
U2 - 10.1007/978-981-15-8603-3_2
DO - 10.1007/978-981-15-8603-3_2
M3 - Conference contribution
AN - SCOPUS:85097102551
SN - 9789811586026
T3 - Advances in Intelligent Systems and Computing
SP - 11
EP - 20
BT - Proceedings of 6th International Conference on Harmony Search, Soft Computing and Applications - ICHSA 2020
A2 - Nigdeli, Sinan Melih
A2 - Bekdas, Gebrail
A2 - Kim, Joong Hoon
A2 - Yadav, Anupam
PB - Springer Science and Business Media Deutschland GmbH
T2 - 6th International Conference on Harmony Search, Soft Computing and Applications, ICHSA 2020
Y2 - 22 April 2020 through 24 April 2020
ER -