A Fully Automated 3D CT U-Net Framework for Segmentation and Measurement of the Masseter Muscle, Innovatively Incorporating a Self-Supervised Algorithm to Effectively Reduce Sample Size: A Validation Study in East Asian Populations
- PMID: 40858739
- DOI: 10.1007/s00266-025-05066-6
Abstract
Objective: The segmentation and volume measurement of the masseter muscle play an important role in radiological evaluation. Manual segmentation is considered the gold standard, but it has limited efficiency. This study aims to develop and evaluate a U-Net-based coarse-to-fine learning framework for automated segmentation and volume measurement of the masseter muscle, providing baseline data on muscle characteristics in 840 healthy East Asian volunteers, while introducing a self-supervised algorithm to reduce the sample size required for deep learning.
Method: A database of 840 individuals (253 males, 587 females) with negative head CT scans was utilized. Following a G*Power sample size calculation, 15 cases were randomly chosen for clinical validation. Masseter segmentation was performed manually in the manual group and automatically in the Auto-Seg group. The primary endpoint was masseter muscle volume; the secondary endpoints were morphological score and runtime, benchmarked against manual segmentation. Reliability tests and paired t tests were used to analyze intra- and inter-group differences. Additionally, automatic volumetric measurements and asymmetry, calculated as (L − R)/(L + R) × 100%, were evaluated, and correlations with clinical parameters were analyzed using Pearson's correlation test.
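For illustration, a minimal sketch (not the authors' implementation) of how muscle volume and the asymmetry index defined above could be computed from binary segmentation masks; the function names, the use of NumPy, and the assumption that voxel spacing is read from the CT header are all illustrative assumptions:

```python
# Minimal sketch, assuming binary NumPy masks and known voxel spacing (mm).
# Not the authors' code; shown only to make the reported quantities concrete.
import numpy as np

def muscle_volume_cm3(mask: np.ndarray, spacing_mm: tuple) -> float:
    """Volume of a binary mask in cm^3, given voxel spacing (dz, dy, dx) in mm."""
    voxel_mm3 = spacing_mm[0] * spacing_mm[1] * spacing_mm[2]
    return float(mask.sum()) * voxel_mm3 / 1000.0  # 1000 mm^3 = 1 cm^3

def asymmetry_percent(vol_left_cm3: float, vol_right_cm3: float) -> float:
    """Asymmetry index (L - R) / (L + R) x 100%, as defined in the Methods."""
    return (vol_left_cm3 - vol_right_cm3) / (vol_left_cm3 + vol_right_cm3) * 100.0
```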
Results: The volume accuracy of automatic segmentation matched that of manual delineation (P > 0.05), demonstrating equivalence. The runtime of manual segmentation (937.3 ± 95.9 s) was significantly longer than that of the algorithm (<1 s, P < 0.001). Among the 840 individuals, masseter asymmetry was 4.6% ± 4.6%, with volumes of 35.5 ± 9.6 cm³ in adult males and 26.6 ± 7.5 cm³ in adult females.
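For context, a hedged sketch of the kind of paired comparison and correlation analysis reported above, using SciPy; the arrays below are synthetic placeholders, not study data:

```python
# Illustrative sketch only: paired t-test (manual vs. automatic volumes) and
# Pearson correlation with a clinical parameter. Data are synthetic placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
manual_vol = rng.normal(30.0, 8.0, size=15)             # placeholder manual volumes (cm^3)
auto_vol = manual_vol + rng.normal(0.0, 0.5, size=15)   # placeholder automatic volumes (cm^3)

t_stat, p_paired = stats.ttest_rel(manual_vol, auto_vol)
print(f"paired t-test: t = {t_stat:.2f}, p = {p_paired:.3f}")

clinical_param = rng.normal(60.0, 10.0, size=15)        # placeholder clinical covariate
r, p_corr = stats.pearsonr(auto_vol, clinical_param)
print(f"Pearson r = {r:.2f}, p = {p_corr:.3f}")
```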
Conclusion: The U-Net-based algorithm demonstrates high concordance with manual segmentation in delineating the masseter muscle, establishing it as a reliable and efficient tool for CT-based assessments in healthy East Asian populations.
Level of Evidence II: This journal requires that authors assign a level of evidence to each article. For a full description of these Evidence-Based Medicine ratings, please refer to the Table of Contents or the online Instructions to Authors at www.springer.com/00266.
Keywords: Automatic segmentation; Convolutional neural networks (ConvNets); Masseter muscle; U-Net.
© 2025. Springer Science+Business Media, LLC, part of Springer Nature and International Society of Aesthetic Plastic Surgery.
Conflict of interest statement
Declarations. Conflict of interest: We declare that we have no financial or personal relationships with other individuals or organizations that could improperly influence our work. Furthermore, we have no professional or other personal interests in any product, service, and/or company that could be construed as influencing the position presented in, or the review of, the manuscript titled "A fully automated 3D CT U-Net framework for segmentation and measurement of the masseter muscle, innovatively incorporating a self-supervised algorithm to effectively reduce sample size: A validation study in East Asian populations." Ethical Approval: This study was approved by the Ethics Committee of The Ninth People's Hospital Affiliated to Shanghai Jiao Tong University (approval number K23687). As a retrospective study, informed consent was not required.