Med Image Anal. 2021 Aug;72:102094. doi: 10.1016/j.media.2021.102094. Epub 2021 Apr 30.

Volumetric white matter tract segmentation with nested self-supervised learning using sequential pretext tasks


Qi Lu et al. Med Image Anal. 2021 Aug.

Abstract

White matter (WM) tract segmentation based on diffusion magnetic resonance imaging (dMRI) provides an important tool for the analysis of brain development, function, and disease. Deep learning based methods for WM tract segmentation have been proposed and greatly improve segmentation accuracy. However, training deep networks usually requires a large number of manual delineations of WM tracts, which are difficult to obtain and unavailable in many scenarios. Therefore, in this work, we explore how to perform deep learning based WM tract segmentation when annotated training data is scarce. To this end, we seek to exploit abundant unannotated dMRI data within a self-supervised learning framework. From the unannotated data, knowledge about image context can be learned with pretext tasks that do not require manual annotations. Specifically, a deep network can be pretrained on a pretext task, and the knowledge learned from the pretext task is then transferred to the subsequent WM tract segmentation task, which has only a small number of annotated scans, via fine-tuning. We explore two designs of pretext tasks that are related to WM tracts. The first pretext task predicts the density map of fiber streamlines, which are representations of generic WM pathways; its training data can be obtained automatically with tractography. The second pretext task learns to mimic the results of registration-based WM tract segmentation, which, although inaccurate, is more relevant to WM tract segmentation and provides a good target for learning context knowledge. We then combine the two pretext tasks into a nested self-supervised learning strategy: the first pretext task provides initial knowledge for the second pretext task, and the knowledge learned from the second pretext task is transferred to the target WM tract segmentation task via fine-tuning.
To evaluate the proposed method, experiments were performed on brain dMRI scans from the Human Connectome Project dataset with various experimental settings. The results show that the proposed method improves the performance of WM tract segmentation when tract annotations are scarce.

Keywords: Deep network; Scarce annotation; Self-supervised learning; White matter tract segmentation.
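The nested training schedule described in the abstract can be summarized as three sequential stages, each initialized from the previous one. Below is a minimal, purely illustrative sketch of that schedule; the `Net` class, its `train` method, and the stage names are hypothetical stand-ins (the paper's actual networks and losses are not shown), used only to make the weight-transfer order explicit.

```python
import copy

class Net:
    """Toy stand-in for the segmentation network: a shared encoder plus a
    task-specific head. Only the encoder state is carried between stages,
    mirroring the pretrain-then-fine-tune recipe in the abstract."""
    def __init__(self, encoder=None):
        # Copy the incoming encoder so earlier stages are left untouched.
        self.encoder = copy.deepcopy(encoder) if encoder else []
        self.head = []

    def train(self, task):
        # "Training" here just records which task shaped each component.
        self.encoder.append(task)
        self.head = [task]
        return self

# Stage 1: pretext task 1 — regress fiber streamline density maps,
# with targets generated automatically by tractography.
net1 = Net().train("density")

# Stage 2: pretext task 2 — mimic registration-based tract segmentation,
# initialized with the encoder knowledge from stage 1.
net2 = Net(net1.encoder).train("registration")

# Stage 3: fine-tune on the target WM tract segmentation task
# using the small annotated set, starting from the stage-2 encoder.
net3 = Net(net2.encoder).train("segmentation")

print(net3.encoder)  # ['density', 'registration', 'segmentation']
```

The point of the nesting is visible in the final encoder state: it has accumulated knowledge from both pretext tasks before ever seeing an annotated tract label, while each stage's head is task-specific and discarded at hand-off.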

Conflict of interest statement

Declaration of Competing Interest The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
