Global-local least-squares support vector machine (GLocal-LS-SVM)
- PMID: 37104506
- PMCID: PMC10138269
- DOI: 10.1371/journal.pone.0285131
Abstract
This study introduces the global-local least-squares support vector machine (GLocal-LS-SVM), a novel machine learning algorithm that combines the strengths of localised and global learning. GLocal-LS-SVM addresses the challenges associated with decentralised data sources, large datasets, and input-space-related issues. The algorithm is a double-layer learning approach that employs multiple local LS-SVM models in the first layer and one global LS-SVM model in the second layer. The key idea behind GLocal-LS-SVM is to extract the most informative data points, known as support vectors, from each local region in the input space. Local LS-SVM models are developed for each region to identify the most contributing data points, i.e. those with the highest support values. The local support vectors are then merged at the final layer to form a reduced training set used to train the global model. We evaluated the performance of GLocal-LS-SVM using both synthetic and real-world datasets. Our results demonstrate that GLocal-LS-SVM achieves classification performance comparable or superior to standard LS-SVM and state-of-the-art models. In addition, our experiments show that GLocal-LS-SVM outperforms standard LS-SVM in terms of computational efficiency. For instance, on a training dataset of 9,000 instances, the average training time for GLocal-LS-SVM was only 2% of the time required to train the LS-SVM model while maintaining classification performance. In summary, the GLocal-LS-SVM algorithm offers a promising solution to the challenges associated with decentralised data sources and large datasets while maintaining high classification performance. Furthermore, its computational efficiency makes it a valuable tool for practical applications in various domains.
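To make the two-layer scheme concrete, the sketch below illustrates the flow described in the abstract: fit local LS-SVM models on regions of the input space, keep each region's points with the largest support values, merge them, and train one global LS-SVM on the reduced set. It is a minimal illustration, not the paper's implementation; the use of k-means to form local regions, the RBF kernel, the standard LS-SVM linear system, and the fraction of points kept per region are all assumptions chosen for readability.

```python
# Minimal sketch of the two-layer GLocal-LS-SVM idea (illustrative assumptions:
# k-means regions, RBF kernel, keep_frac of highest-|alpha| points per region).
import numpy as np
from sklearn.cluster import KMeans


def rbf_kernel(A, B, sigma=1.0):
    """RBF kernel matrix between rows of A and rows of B."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-d2 / (2 * sigma**2))


def fit_ls_svm(X, y, gamma=1.0, sigma=1.0):
    """Solve the LS-SVM linear system for labels y in {-1, +1}; returns (alpha, b)."""
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.block([[np.zeros((1, 1)), np.ones((1, n))],
                  [np.ones((n, 1)), K + np.eye(n) / gamma]])
    sol = np.linalg.solve(A, np.concatenate(([0.0], y.astype(float))))
    return sol[1:], sol[0]  # support values alpha, bias b


def predict_ls_svm(X_train, alpha, b, X_test, sigma=1.0):
    """Sign of the LS-SVM decision function on X_test."""
    return np.sign(rbf_kernel(X_test, X_train, sigma) @ alpha + b)


def glocal_ls_svm(X, y, n_regions=5, keep_frac=0.2, gamma=1.0, sigma=1.0):
    """Layer 1: local LS-SVMs per region, keeping the highest-|alpha| points.
    Layer 2: one global LS-SVM trained on the merged reduced set."""
    regions = KMeans(n_clusters=n_regions, n_init=10).fit_predict(X)
    keep = []
    for r in range(n_regions):
        idx = np.where(regions == r)[0]
        alpha, _ = fit_ls_svm(X[idx], y[idx], gamma, sigma)
        k = max(1, int(keep_frac * len(idx)))
        keep.extend(idx[np.argsort(-np.abs(alpha))[:k]])
    keep = np.array(keep)
    alpha, b = fit_ls_svm(X[keep], y[keep], gamma, sigma)
    return X[keep], alpha, b  # reduced training set and global model parameters
```

Because the global model sees only the merged local support vectors, the final linear system is much smaller than the full-data LS-SVM system, which is the source of the training-time savings reported in the abstract.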
Copyright: © 2023 Ahmed Youssef Ali Amer. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Conflict of interest statement
The authors have declared that no competing interests exist.