Integrating Foundation Model Features into Graph Neural Network and Fusing Predictions with Standard Fine-Tuned Models for Histology Image Classification
- PMID: 41463629
- PMCID: PMC12729508
- DOI: 10.3390/bioengineering12121332
Abstract
Histopathological image classification using computational methods such as fine-tuned convolutional neural networks (CNNs) has gained significant attention in recent years. Graph neural networks (GNNs) have also emerged as strong alternatives, often employing CNNs or vision transformers (ViTs) as node feature extractors. However, as these models are usually pre-trained on small-scale natural image datasets, their performance in histopathology tasks can be limited. The introduction of foundation models trained on large-scale histopathological data now enables more effective feature extraction for GNNs. In this work, we integrate recently developed foundation models as feature extractors within a lightweight GNN and compare their performance with standard fine-tuned CNN and ViT models. Furthermore, we explore a prediction fusion approach that combines the outputs of the best-performing GNN and fine-tuned model to evaluate the benefits of complementary representations. Results demonstrate that GNNs utilizing foundation model features outperform those trained with CNN or ViT features and achieve performance comparable to standard fine-tuned CNN and ViT models. The highest overall performance is obtained with the proposed prediction fusion strategy. Evaluated on three publicly available datasets, the best fusion achieved F1-scores of 98.04%, 96.51%, and 98.28%, and balanced accuracies of 98.03%, 96.50%, and 97.50% on PanNuke, BACH, and BreakHis, respectively.
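To make the described pipeline concrete, the sketch below illustrates one plausible reading of the approach: patch-level embeddings from a frozen histopathology foundation model serve as node features for a lightweight graph network, and its output is fused with the prediction of a separately fine-tuned CNN/ViT by averaging class probabilities. All names, dimensions (1024-d node features, 4 classes), the two-layer GCN architecture, and the weighted soft-voting fusion rule are illustrative assumptions, not the authors' exact implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch_geometric.nn import GCNConv, global_mean_pool


class LightweightGNN(nn.Module):
    """Two-layer GCN over patch-graph nodes whose features come from a frozen
    histopathology foundation model (assumed 1024-d embedding per patch)."""

    def __init__(self, in_dim=1024, hidden_dim=256, num_classes=4):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hidden_dim)
        self.conv2 = GCNConv(hidden_dim, hidden_dim)
        self.head = nn.Linear(hidden_dim, num_classes)

    def forward(self, x, edge_index, batch):
        x = F.relu(self.conv1(x, edge_index))
        x = F.relu(self.conv2(x, edge_index))
        x = global_mean_pool(x, batch)   # aggregate nodes into a graph-level embedding
        return self.head(x)              # class logits


def fuse_predictions(gnn_logits, cnn_logits, alpha=0.5):
    """Late fusion of GNN and fine-tuned CNN/ViT outputs by a weighted average
    of softmax probabilities (one possible fusion rule; the paper's exact
    scheme may differ)."""
    p_gnn = F.softmax(gnn_logits, dim=-1)
    p_cnn = F.softmax(cnn_logits, dim=-1)
    return alpha * p_gnn + (1.0 - alpha) * p_cnn


# Toy usage: a single graph of 8 patch nodes connected in a ring,
# with precomputed 1024-d foundation-model features per node.
x = torch.randn(8, 1024)
edge_index = torch.tensor([[0, 1, 2, 3, 4, 5, 6, 7],
                           [1, 2, 3, 4, 5, 6, 7, 0]])
batch = torch.zeros(8, dtype=torch.long)   # all nodes belong to graph 0
gnn_logits = LightweightGNN()(x, edge_index, batch)
cnn_logits = torch.randn(1, 4)             # stand-in for the fine-tuned model's logits
fused = fuse_predictions(gnn_logits, cnn_logits)
predicted_class = fused.argmax(dim=-1)
```

In practice the node features would be extracted once from the foundation model and cached, so the GNN itself stays small and fast to train; the fusion weight alpha would be chosen on a validation split.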
Keywords: computational pathology; deep learning; foundation model; graph neural network; image classification; medical image analysis.
Conflict of interest statement
The authors declare no conflicts of interest related to this work.