This is a preprint.
Population Transformer: Learning Population-level Representations of Neural Activity
- PMID: 38883237
- PMCID: PMC11177958
Abstract
We present a self-supervised framework that learns population-level codes for arbitrary ensembles of neural recordings at scale. We address key challenges in scaling models with neural time-series data, namely, sparse and variable electrode distribution across subjects and datasets. The Population Transformer (PopT) stacks on top of pretrained temporal embeddings and enhances downstream decoding by enabling learned aggregation of multiple spatially-sparse data channels. The pretrained PopT lowers the amount of data required for downstream decoding experiments, while increasing accuracy, even on held-out subjects and tasks. Compared to end-to-end methods, this approach is computationally lightweight, while achieving similar or better decoding performance. We further show how our framework is generalizable to multiple time-series embeddings and neural data modalities. Beyond decoding, we interpret the pretrained and fine-tuned PopT models to show how they can be used to extract neuroscience insights from large amounts of data. We release our code as well as a pretrained PopT to enable off-the-shelf improvements in multi-channel intracranial data decoding and interpretability. Code is available at https://github.com/czlwang/PopulationTransformer.
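To make the aggregation idea concrete, here is a minimal sketch of a PopT-style module in PyTorch. It is not the authors' implementation (see the linked repository for that); the layer sizes, the linear electrode-position encoding, and the [CLS]-style aggregation token are illustrative assumptions. The sketch shows the core pattern described in the abstract: per-channel embeddings from a frozen pretrained temporal encoder are combined with electrode-location information and aggregated by a transformer into a single population-level code.

```python
# Minimal sketch of a PopT-style aggregator (assumptions: hyperparameters,
# 3-D coordinate positional encoding, and [CLS] aggregation are illustrative).
import torch
import torch.nn as nn


class PopTSketch(nn.Module):
    """Aggregates per-channel temporal embeddings into a population-level code."""

    def __init__(self, embed_dim: int = 256, n_layers: int = 4, n_heads: int = 8):
        super().__init__()
        # Learned token whose output serves as the ensemble-level representation.
        self.cls_token = nn.Parameter(torch.zeros(1, 1, embed_dim))
        # Maps electrode coordinates into the embedding space, so the model can
        # handle sparse, variable channel layouts across subjects and datasets.
        self.pos_proj = nn.Linear(3, embed_dim)
        layer = nn.TransformerEncoderLayer(
            d_model=embed_dim, nhead=n_heads, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)

    def forward(self, channel_embs: torch.Tensor, electrode_xyz: torch.Tensor):
        # channel_embs: (batch, n_channels, embed_dim), e.g. produced by a frozen
        #               pretrained temporal encoder applied to each channel.
        # electrode_xyz: (batch, n_channels, 3) electrode coordinates.
        tokens = channel_embs + self.pos_proj(electrode_xyz)
        cls = self.cls_token.expand(tokens.size(0), -1, -1)
        out = self.encoder(torch.cat([cls, tokens], dim=1))
        # Population-level code (for ensemble-level decoding) and per-channel outputs.
        return out[:, 0], out[:, 1:]
```

In this sketch, a lightweight downstream decoder (e.g., a linear head) would be trained on the population-level output, which is what makes the approach computationally cheaper than end-to-end training while still using all available channels.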
Similar articles
- Neural Data Transformer 2: Multi-context Pretraining for Neural Spiking Activity. bioRxiv [Preprint]. 2023 Sep 22:2023.09.18.558113. doi: 10.1101/2023.09.18.558113. PMID: 37781630. Free PMC article. Preprint.
- Self-supervised learning improves robustness of deep learning lung tumor segmentation models to CT imaging differences. Med Phys. 2025 Mar;52(3):1573-1588. doi: 10.1002/mp.17541. Epub 2024 Dec 5. PMID: 39636237.
- Large-scale benchmarking and boosting transfer learning for medical image analysis. Med Image Anal. 2025 May;102:103487. doi: 10.1016/j.media.2025.103487. Epub 2025 Feb 21. PMID: 40117988.
- AMMU: A survey of transformer-based biomedical pretrained language models. J Biomed Inform. 2022 Feb;126:103982. doi: 10.1016/j.jbi.2021.103982. Epub 2021 Dec 31. PMID: 34974190. Review.
- Knowledge graph embeddings in the biomedical domain: are they useful? A look at link prediction, rule learning, and downstream polypharmacy tasks. Bioinform Adv. 2024 Jul 17;4(1):vbae097. doi: 10.1093/bioadv/vbae097. eCollection 2024. PMID: 39506988. Free PMC article. Review.