Multi-Swarm Algorithm for Extreme Learning Machine Optimization

Nebojsa Bacanin et al. Sensors (Basel). 2022 May 31;22(11):4204. doi: 10.3390/s22114204.

Abstract

There are many machine learning approaches available and commonly used today; however, the extreme learning machine is regarded as one of the fastest and also relatively efficient models. Its main benefit is its speed, which makes it suitable for integration into products that require rapid decisions. Nevertheless, despite their large potential, extreme learning machines have not yet been exploited enough, according to the recent literature, and they still face several challenges. The most significant downside is that the model's performance depends heavily on the weights and biases assigned to the hidden layer; finding appropriate values for a practical task is an NP-hard continuous optimization problem. The research proposed in this study focuses on determining optimal or near-optimal weights and biases in the hidden layer for specific tasks. To address this task, a multi-swarm hybrid optimization approach is proposed, based on three swarm-intelligence meta-heuristics: the artificial bee colony, the firefly algorithm, and the sine-cosine algorithm. The proposed method is thoroughly validated on seven well-known classification benchmark datasets, and the obtained results are compared to existing cutting-edge approaches from the recent literature. The simulation results indicate that the suggested multi-swarm technique achieves better generalization performance than the other approaches included in the comparative analysis in terms of accuracy, precision, recall, and F1-score. Moreover, to show that combining two algorithms is not as effective as joining three, additional hybrids generated by pairing each two of the methods employed in the proposed multi-swarm approach were also implemented and validated on four challenging datasets. The findings from these experiments also confirm the superior performance of the proposed multi-swarm algorithm. Sample code from the devised ELM tuning framework is available on GitHub.
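The abstract describes the core idea: the ELM's hidden-layer weights and biases are treated as a continuous search space, the output weights are solved in closed form, and each candidate solution is scored by validation accuracy. The sketch below illustrates that tuning loop in plain NumPy; the function names (elm_fitness, tune_elm), the single greedy best-following update rule, and all parameter values are illustrative assumptions and do not reproduce the paper's actual artificial bee colony / firefly / sine-cosine multi-swarm hybrid or its published framework.

```python
import numpy as np

def elm_fitness(X_tr, y_tr, X_val, y_val, W, b):
    """Train ELM output weights analytically for a given hidden layer (W, b)
    and return validation accuracy as the candidate's fitness."""
    H_tr = np.tanh(X_tr @ W + b)              # hidden-layer activations
    T = np.eye(y_tr.max() + 1)[y_tr]          # one-hot class targets
    beta = np.linalg.pinv(H_tr) @ T           # Moore-Penrose least-squares output weights
    H_val = np.tanh(X_val @ W + b)
    preds = (H_val @ beta).argmax(axis=1)
    return (preds == y_val).mean()

def tune_elm(X_tr, y_tr, X_val, y_val, n_hidden=30, pop_size=20, iters=50, seed=0):
    """Population-based search over the hidden weights and biases.
    Each agent is a flattened (W, b) vector; agents drift toward the best
    agent found so far with random perturbation -- a simplified stand-in
    for the paper's multi-swarm update rules."""
    rng = np.random.default_rng(seed)
    n_features = X_tr.shape[1]
    dim = n_features * n_hidden + n_hidden

    def unpack(v):
        W = v[:n_features * n_hidden].reshape(n_features, n_hidden)
        return W, v[n_features * n_hidden:]

    pop = rng.uniform(-1, 1, (pop_size, dim))
    fit = np.array([elm_fitness(X_tr, y_tr, X_val, y_val, *unpack(v)) for v in pop])

    for _ in range(iters):
        best = pop[fit.argmax()]
        for i in range(pop_size):
            step = rng.uniform(0, 1, dim) * (best - pop[i]) + rng.normal(0, 0.1, dim)
            cand = np.clip(pop[i] + step, -1, 1)
            f = elm_fitness(X_tr, y_tr, X_val, y_val, *unpack(cand))
            if f > fit[i]:                    # greedy replacement
                pop[i], fit[i] = cand, f

    return unpack(pop[fit.argmax()]), fit.max()
```

The design choice carried over from the ELM itself is that only the hidden layer is searched: for any candidate (W, b), the output weights follow analytically from the Moore-Penrose pseudo-inverse, so each fitness evaluation remains cheap.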

Keywords: extreme learning machine; hybridization; machine learning; meta-heuristic algorithms; multi-swarm algorithm; swarm intelligence.


Conflict of interest statement

The authors declare no conflict of interest.

Figures

Figure 1. Overview of the proposed ELM-MS-AFS approach.
Figure 2. Distribution of classes in the Diabetes, Disease, Iris, Wine, and Wine Quality datasets before the split.
Figure 3. Distribution of classes in the Satellite and Shuttle datasets with predetermined training and testing subsets.
Figure 4. Convergence speed evaluation on the seven observed datasets for 30, 60, and 90 neurons, for the proposed method vs. other approaches.
Figure 5. Confusion matrices and PR curves generated by ELM-MS-AFS for selected datasets.
Figure 6. Distribution of classes in the NSL-KDD dataset with predetermined training and testing subsets.
Figure 7. Convergence speed evaluation on the four observed datasets for 30, 60, and 90 neurons, for the proposed ELM-MS-AFS vs. other hybrid approaches.
Figure 8. PR curves generated by the hybrid methods for the NSL-KDD dataset.
