stMMR: accurate and robust spatial domain identification from spatially resolved transcriptomics with multimodal feature representation
- PMID: 39607984
- PMCID: PMC11604062
- DOI: 10.1093/gigascience/giae089
Abstract
Background: Deciphering spatial domains using spatially resolved transcriptomics (SRT) is of great value for characterizing and understanding tissue architecture. However, the inherent heterogeneity and varying spatial resolutions present challenges in the joint analysis of multimodal SRT data.
Results: We introduce a multimodal geometric deep learning method, named stMMR, to effectively integrate gene expression, spatial location, and histological information for accurately identifying spatial domains from SRT data. stMMR uses graph convolutional networks and a self-attention module to deeply embed features within each modality and incorporates similarity contrastive learning to integrate features across modalities.
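The two components named above, graph convolution over the spot neighborhood graph and similarity contrastive learning across modalities, can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the layer sizes, the toy chain graph, and the NT-Xent-style loss with matched spots as positives are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def gcn_layer(A, X, W):
    """One graph-convolution step: symmetrically normalized adjacency
    (with self-loops) applied to features, then a linear map and ReLU."""
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    A_norm = (A_hat * d_inv_sqrt[:, None]) * d_inv_sqrt[None, :]  # D^-1/2 A D^-1/2
    return np.maximum(A_norm @ X @ W, 0.0)    # ReLU activation

def contrastive_loss(Z1, Z2, tau=0.5):
    """Similarity contrastive loss between two modality embeddings:
    the same spot in both modalities is a positive pair, all other
    spots are negatives (an NT-Xent-style objective, assumed here)."""
    Z1 = Z1 / (np.linalg.norm(Z1, axis=1, keepdims=True) + 1e-8)
    Z2 = Z2 / (np.linalg.norm(Z2, axis=1, keepdims=True) + 1e-8)
    sim = Z1 @ Z2.T / tau                     # cosine similarity / temperature
    # log-softmax over each row; the "correct class" for spot i is spot i
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_prob)))

# Toy example: 4 spots on a chain graph, two stand-in modalities.
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)     # spatial neighbor graph
X_expr = rng.normal(size=(4, 8))              # stand-in gene expression features
X_hist = rng.normal(size=(4, 8))              # stand-in histology features
W = rng.normal(size=(8, 3))
Z_expr = gcn_layer(A, X_expr, W)
Z_hist = gcn_layer(A, X_hist, W)
loss = contrastive_loss(Z_expr, Z_hist)
print(loss)
```

Minimizing such a loss pulls the expression and histology embeddings of the same spot together while pushing apart embeddings of different spots, which is the general mechanism the abstract refers to as similarity contrastive learning for cross-modal integration.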
Conclusions: Comprehensive benchmark analysis on various types of spatial data shows superior performance of stMMR in multiple analyses, including spatial domain identification, pseudo-spatiotemporal analysis, and domain-specific gene discovery. In chicken heart development, stMMR reconstructed the spatiotemporal lineage structures, indicating an accurate developmental sequence. In breast cancer and lung cancer, stMMR clearly delineated the tumor microenvironment and identified marker genes associated with diagnosis and prognosis. Overall, stMMR is capable of effectively utilizing the multimodal information of various SRT data to explore and characterize tissue architectures of homeostasis, development, and tumor.
Keywords: domain identification; geometric deep learning; multimodal integration; similarity contrastive learning; spatially resolved transcriptomics.
© The Author(s) 2024. Published by Oxford University Press on behalf of GigaScience.
Conflict of interest statement
The authors declare they have no competing interests.
