2023 Mar 2;13(1):3541.
doi: 10.1038/s41598-023-30661-5.

Decoding behavior from global cerebrovascular activity using neural networks



Béatrice Berthon et al. Sci Rep.

Abstract

Functional Ultrasound (fUS) provides spatial and temporal frames of the vascular activity in the brain with high resolution and sensitivity in behaving animals. The large amount of resulting data is underused at present due to the lack of appropriate tools to visualize and interpret such signals. Here we show that neural networks can be trained to leverage the richness of information available in fUS datasets to reliably determine behavior, even from a single fUS 2D image after appropriate training. We illustrate the potential of this method with two examples: determining if a rat is moving or static and decoding the animal's sleep/wake state in a neutral environment. We further demonstrate that our method can be transferred to new recordings, possibly in other animals, without additional training, thereby paving the way for real-time decoding of brain activity based on fUS data. Finally, the learned weights of the network in the latent space were analyzed to extract the relative importance of input data to classify behavior, making this a powerful tool for neuroscientific research.


Conflict of interest statement

MT is co-founder and shareholder of Iconeus, an international company commercializing ultrasound neuroimaging scanners. The other authors declare no competing interests.

Figures

Figure 1
State classification pipeline scenarios. (a) In a first scenario, neural networks can be trained to identify, from pixel values in fUS images (2500 pixels, CBV and n∆CBV), the associated behavioral states (distinguishing either 2 locomotion states, moving and static, or 4 sleep/wake states: REM sleep (REMS), non-REM sleep (NREMS), Active Wake (AW) and Quiet Wake (QW)). The classification accuracy was above 85% for fUS images that were not normalized by a common fUS baseline (CBV) and 80% for normalized data (n∆CBV). (b) In a second scenario, the same decoding tasks can be performed using a much reduced information content corresponding to anatomical ROI mean values (50–80 values), obtained through expert atlas registration, with average accuracy dropping by at most 13% for n∆CBV frames and by up to 28% for CBV frames. (c) In a third scenario, the second approach allowed classification of unseen fUS frames from any new recording (including a new animal) using the trained model, provided the test and training recording sections were sufficiently similar. The corresponding bar graphs show average accuracy values, reported here as the percentage of frames adequately classified, averaged across the different model/animal pairs, with corresponding error bars (SD) in each case. The classification accuracy remained high even though it dropped compared to intra-animal accuracy: 78% for movement decoding and above 72% for sleep/wake state decoding (except for QW identification), as shown on the bar graph. Results of the permutation test evaluating the significance of the prediction are indicated in each case as ** (all p-values p < 0.05) or * (more than half of p-values p < 0.05).
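Scenario (b) above, classifying a behavioral state from a few dozen ROI mean values per fUS frame, can be sketched with a small fully connected network. The architecture (one hidden layer of dimension 3, matching the latent space described for Figure 3), the synthetic data, and the hyperparameters below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def train_step(X, Y, W1, b1, W2, b2, lr=0.1):
    # forward pass: one tanh hidden layer, softmax output
    H = np.tanh(X @ W1 + b1)
    P = softmax(H @ W2 + b2)
    # backward pass: cross-entropy gradient through both layers
    dZ2 = (P - Y) / len(X)
    dW2, db2 = H.T @ dZ2, dZ2.sum(0)
    dH = dZ2 @ W2.T * (1 - H**2)
    dW1, db1 = X.T @ dH, dH.sum(0)
    return W1 - lr * dW1, b1 - lr * db1, W2 - lr * dW2, b2 - lr * db2, P

# synthetic stand-in data: 200 "frames" x 50 ROI mean values, 2 states
X = rng.normal(size=(200, 50))
y = (X[:, :5].mean(axis=1) > 0).astype(int)   # state driven by 5 ROIs
Y = np.eye(2)[y]

W1 = rng.normal(scale=0.1, size=(50, 3)); b1 = np.zeros(3)  # 3-d latent
W2 = rng.normal(scale=0.1, size=(3, 2));  b2 = np.zeros(2)
for _ in range(500):
    W1, b1, W2, b2, P = train_step(X, Y, W1, b1, W2, b2)

accuracy = (P.argmax(axis=1) == y).mean()
```

With a separable synthetic signal like this, full-batch gradient descent reaches high training accuracy; the real task's accuracies (the 72–85% figures above) of course come from far noisier physiological data.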
Figure 2
Examples of decoding within a given acquisition. ROI-based networks allowed decoding of the animal’s brain state within one recording with high performance, as shown by the temporal profiles in (a) (moving vs static) and (b) (sleep/wake states). Panel (a) shows the animal’s speed (top profile) and the corresponding classification labels (true state in green on the middle profile). The dashed black lines on the middle profile, corresponding to the network’s predictions, show excellent agreement with the true state. The bottom profile shows the network’s prediction in terms of state (shaded areas are “true” states), uncertainty (black) and errors (red dots), mostly located at state transitions. Excellent agreement between the true and predicted state is also visible for the sleep/wake state decoding in panel (b), except at state transitions.
Figure 3
Visualization of the decoding error and uncertainty in time and space. (A) The architecture of our networks, with a hidden layer of dimension 3, allows the network’s activations in the latent space to be visualized in 3D, shown here for sleep/wake state identification based on pixel values. fUS frames labelled with the same sleep/wake state are grouped together in this space, and errors (red circles) are located at the boundaries between the 4 clusters, suggesting that they occur mainly at state transitions. This is confirmed in (B), which shows consistently higher prediction uncertainty values and error rates near state transitions for the binary locomotion task and for most of the sleep/wake transitions in the state decoding task (n represents the number of such transitions available in the data). The network’s uncertainty rises 1 to 2 s before a state transition, which is consistent with the delays of neurovascular coupling. For NREMS/AW and NREMS/QW, the uncertainty and error rate peak after the actual transition and remain high for several seconds afterwards, which may be attributed to the “progressive” nature of these transitions.
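The per-frame uncertainty traces described here reflect the network's prediction confidence. One plausible way to compute such a measure (an assumption for this sketch; the caption does not specify the formula used) is the entropy of the softmax output, which is near zero for a confidently classified frame and high for an ambiguous frame near a state transition:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - np.max(z))
    return e / e.sum()

def pred_entropy(p):
    # Shannon entropy of a probability vector, in nats
    p = np.clip(p, 1e-12, 1.0)
    return float(-(p * np.log(p)).sum())

# hypothetical 4-state logits: one clear-cut frame, one near a transition
confident = softmax(np.array([4.0, 0.0, 0.0, 0.0]))
ambiguous = softmax(np.array([1.0, 0.9, 0.2, 0.1]))

u_confident = pred_entropy(confident)   # low uncertainty
u_ambiguous = pred_entropy(ambiguous)   # high uncertainty
```

Under this reading, the rise in uncertainty 1 to 2 s before a transition corresponds to the softmax mass spreading across two competing states.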
Figure 4
Importance to the classification of different anatomical ROIs for the detection of the locomotion state (animal static or moving). The network’s learnt weights allowed visualization of spatial maps of relative region importance to the classification. The values on the graphs, corresponding to the relative importance of ROIs in ROI-based classification as computed by the Holdback Input Randomization method, are averaged across the three animals, with data points shown as distinct black markers for each animal. The 2D maps displaying the local relative importance at bregma = − 4.0 mm show, on all three recordings considered (three different animals), high importance of the dentate gyrus (DG) for identifying the moving state, and high importance of the central region of the ventral thalamic nucleus and the posterior amygdala (PA) region for the static state, confirming the ROI-based analysis.
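The input-randomization idea behind these importance maps can be illustrated independently of the paper's network: randomize one input channel (here, one ROI) at a time and measure how much classification accuracy drops, a large drop meaning the channel mattered. The sketch below uses a permutation variant of that idea with a synthetic linear stand-in classifier; the data, model, and the choice of shuffling (rather than the authors' exact randomization scheme) are assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n_frames, n_rois = 500, 10
X = rng.normal(size=(n_frames, n_rois))
true_w = np.zeros(n_rois)
true_w[0] = 3.0                       # only ROI 0 carries the state signal
y = (X @ true_w > 0).astype(int)

def predict(X):
    # stand-in "trained" model: thresholded linear readout
    return (X @ true_w > 0).astype(int)

baseline = (predict(X) == y).mean()   # perfect on this synthetic data

importance = np.empty(n_rois)
for j in range(n_rois):
    Xp = X.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])   # randomize ROI j across frames
    importance[j] = baseline - (predict(Xp) == y).mean()
```

Randomizing the informative ROI collapses accuracy toward chance, while randomizing uninformative ROIs changes nothing, reproducing the kind of ranking shown in the bar graphs.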
Figure 5
Importance to the classification of different anatomical ROIs for sleep/wake states across the brain. Neural networks were trained pixel-wise and ROI-wise on 9 different coronal planes spanning the brain between bregma − 6.5 mm and bregma 2.5 mm. The relative importance of anatomical regions calculated across those 9 planes is presented here for each sleep/wake state. The corresponding pixel-wise importance maps confirm the ROI-wise findings and bring additional local information. For example, the contribution to REMS of the azygos anterior cerebral artery (azac), which does not correspond to a single ROI in our method, is only visible on the pixel-wise maps. In the case of AW, the dentate gyrus (DG) and, to a lesser extent, the whole hippocampus (whole Hpc) are clearly visible on the pixel-wise map. For NREMS, the caudate putamen (CPu), which stands out as the most important ROI, is highlighted on the pixel-wise map, in particular its central area. The map for QW shows highly heterogeneous importance within regions of the periaqueductal gray (PAG), somatosensory cortex (SCx) and superior colliculus (SC). This is in line with the largely different importance calculated for the ROIs of the PAG (dorsal and ventral) and of the colliculus. The region of the piriform cortex also appears highlighted on the pixel-wise map.
