Large-scale foundation models and generative AI for BigData neuroscience
- PMID: 38897235
- PMCID: PMC11649861
- DOI: 10.1016/j.neures.2024.06.003
Abstract
Recent advances in machine learning have led to revolutionary breakthroughs in computer games, image and natural language understanding, and scientific discovery. Foundation models and large language models (LLMs) have recently achieved human-like intelligence thanks to BigData. With the help of self-supervised learning (SSL) and transfer learning, these models may potentially reshape the landscape of neuroscience research and make a significant impact on the future. Here we present a mini-review on recent advances in foundation models and generative AI models as well as their applications in neuroscience, including natural language and speech, semantic memory, brain-machine interfaces (BMIs), and data augmentation. We argue that this paradigm-shifting framework will open new avenues for many neuroscience research directions, and we discuss the accompanying challenges and opportunities.
Keywords: BigData; Brain-machine interface; Embedding; Foundation model; Generative AI; Representation learning; Self-supervised learning; Transfer learning; Transformer.
Copyright © 2024 The Authors. Published by Elsevier B.V. All rights reserved.
Conflict of interest statement
Declaration of Competing Interest The authors declare no competing interests.
