Multi-Armed Bandits in Brain-Computer Interfaces
- PMID: 35874164
- PMCID: PMC9298543
- DOI: 10.3389/fnhum.2022.931085
Abstract
The multi-armed bandit (MAB) problem models a decision-maker that optimizes its actions based on current knowledge and newly acquired information in order to maximize its reward. This type of online decision-making is prominent in many procedures of Brain-Computer Interfaces (BCIs), and MABs have previously been used to investigate, e.g., which mental commands to use to optimize BCI performance. However, MAB optimization in the context of BCIs is still relatively unexplored, even though it has the potential to improve BCI performance during both calibration and real-time operation. This review therefore aims to further introduce the fruitful area of MABs to the BCI community. It provides background on the MAB problem and standard solution methods, together with their interpretation in terms of BCI systems. Moreover, it covers state-of-the-art uses of MABs in BCIs and suggestions for future research.
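To make the online decision loop concrete, below is a minimal Thompson-sampling sketch for a Bernoulli bandit, one standard MAB solution method. It is not taken from the review: each arm is imagined as a candidate mental command, the reward is a simulated binary classification outcome, and the success probabilities, trial count, and NumPy-based simulation are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each arm is a candidate mental command, and the
# reward is 1 if the BCI decoder classifies a trial correctly, else 0.
# The true success probabilities are unknown to the algorithm and are
# assumed here purely for simulation.
true_success_prob = [0.55, 0.70, 0.60]
n_arms = len(true_success_prob)

# Beta(1, 1) priors over each arm's unknown success probability.
successes = np.ones(n_arms)
failures = np.ones(n_arms)

for trial in range(500):
    # Thompson sampling: draw one sample per arm from its posterior
    # and play the arm with the largest sampled success rate.
    samples = rng.beta(successes, failures)
    arm = int(np.argmax(samples))

    # Simulated Bernoulli reward; in a real BCI this would be the
    # observed classification outcome for the chosen command.
    reward = rng.random() < true_success_prob[arm]

    # Conjugate posterior update for the played arm.
    successes[arm] += reward
    failures[arm] += 1 - reward

print("posterior mean success rates:",
      np.round(successes / (successes + failures), 3))
```

Thompson sampling balances exploration and exploitation by sampling from the posterior of each arm's success rate, so arms with uncertain estimates are still tried occasionally while the empirically best arm is played most often.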
Keywords: Brain-Computer Interface (BCI); calibration; multi-armed bandit (MAB); real-time optimization; reinforcement learning.
Copyright © 2022 Heskebeck, Bergeling and Bernhardsson.
Conflict of interest statement
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.