Global-local least-squares support vector machine (GLocal-LS-SVM)

Ahmed Youssef Ali Amer

PLoS One. 2023 Apr 27;18(4):e0285131. doi: 10.1371/journal.pone.0285131. eCollection 2023.

Abstract

This study introduces the global-local least-squares support vector machine (GLocal-LS-SVM), a novel machine learning algorithm that combines the strengths of localised and global learning. GLocal-LS-SVM addresses the challenges associated with decentralised data sources, large datasets, and input-space-related issues. The algorithm is a double-layer learning approach that employs multiple local LS-SVM models in the first layer and a single global LS-SVM model in the second layer. The key idea behind GLocal-LS-SVM is to extract the most informative data points, known as support vectors, from each local region of the input space. A local LS-SVM model is trained for each region to identify the data points with the highest support values. The local support vectors are then merged at the final layer to form a reduced training set used to train the global model. We evaluated the performance of GLocal-LS-SVM on both synthetic and real-world datasets. Our results demonstrate that GLocal-LS-SVM achieves classification performance comparable or superior to that of standard LS-SVM and state-of-the-art models. In addition, our experiments show that GLocal-LS-SVM outperforms standard LS-SVM in computational efficiency. For instance, on a training dataset of 9,000 instances, the average training time for GLocal-LS-SVM was only 2% of the time required to train the LS-SVM model, while classification performance was maintained. In summary, the GLocal-LS-SVM algorithm offers a promising solution to the challenges associated with decentralised data sources and large datasets while maintaining high classification performance. Furthermore, its computational efficiency makes it a valuable tool for practical applications in various domains.
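The two-layer scheme described above can be sketched in code. The following is a minimal illustration, not the authors' implementation: it assumes an RBF kernel, a simple k-means partitioning of the input space, and selection of the points with the largest |α| support values from each local model — the paper's actual partitioning strategy and selection rule may differ. The LS-SVM itself is solved via its standard dual linear system.

```python
# Hedged sketch of the GLocal-LS-SVM idea (assumptions: RBF kernel,
# k-means regions, top-|alpha| selection). Not the paper's implementation.
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    # Gaussian (RBF) kernel matrix between row sets A and B
    d = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d)

def fit_lssvm(X, y, C=10.0):
    """Solve the LS-SVM dual linear system; returns (alpha, b)."""
    n = len(y)
    K = rbf_kernel(X, X)
    Omega = (y[:, None] * y[None, :]) * K + np.eye(n) / C
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y          # first row/column encode the bias constraint
    A[1:, 0] = y
    A[1:, 1:] = Omega
    rhs = np.concatenate([[0.0], np.ones(n)])
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]

def predict(Xtr, ytr, alpha, b, Xte):
    # Decision function: sign(sum_i alpha_i y_i K(x, x_i) + b)
    return np.sign(rbf_kernel(Xte, Xtr) @ (alpha * ytr) + b)

def glocal_fit(X, y, n_regions=3, keep_frac=0.3, C=10.0):
    # Layer 1: partition the input space, train a local LS-SVM per region,
    # and keep only the points with the largest |alpha| support values.
    rng = np.random.default_rng(0)
    centers = X[rng.choice(len(X), n_regions, replace=False)]
    for _ in range(10):  # a few plain k-means iterations
        lab = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.array([X[lab == k].mean(0) if (lab == k).any()
                            else centers[k] for k in range(n_regions)])
    keep = []
    for k in range(n_regions):
        idx = np.where(lab == k)[0]
        if len(np.unique(y[idx])) < 2:
            keep.extend(idx)  # one-class region: no local model, keep as-is
            continue
        alpha, _ = fit_lssvm(X[idx], y[idx], C)
        top = idx[np.argsort(-np.abs(alpha))[: max(2, int(keep_frac * len(idx)))]]
        keep.extend(top)
    keep = np.array(sorted(keep))
    # Layer 2: one global LS-SVM on the merged local support vectors
    alpha, b = fit_lssvm(X[keep], y[keep], C)
    return X[keep], y[keep], alpha, b
```

The computational saving comes from the second layer: solving the (n+1)-dimensional linear system costs O(n³), so training the global model on the reduced set of merged support vectors is far cheaper than training one LS-SVM on all data, which is consistent with the reported 2% training-time figure.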


Conflict of interest statement

The authors have declared that no competing interests exist.

Figures

Fig 1, Fig 2, Fig 3, Fig 4, Fig 5
