Review
NPJ Digit Med. 2024 Mar 6;7(1):61. doi: 10.1038/s41746-024-01050-7.

Contextualizing remote fall risk: Video data capture and implementing ethical AI


Jason Moore et al. NPJ Digit Med.

Abstract

Wearable inertial measurement units (IMUs) are being used to quantify gait characteristics associated with increased fall risk, but a current limitation is the lack of contextual information that would clarify IMU data. Wearable video-based cameras would provide a comprehensive understanding of an individual's habitual fall risk, adding context to clarify abnormal IMU data. Generally, suggesting the use of wearable cameras to capture real-world video remains taboo, with clinicians and patients apprehensive due to ethical and privacy concerns. This perspective proposes that routine use of wearable cameras could be realized within digital medicine through AI-based computer vision models that obfuscate (blur) sensitive information while preserving helpful contextual information for a comprehensive patient assessment. Specifically, no person sees the raw video data to understand context; rather, AI interprets the raw video data first to blur sensitive objects and uphold privacy. This may be more readily achievable than commonly imagined, as the necessary contemporary resources already exist. Here, to showcase the potential, an exemplar model built from off-the-shelf methods is presented that detects and blurs sensitive objects (e.g., people) with an accuracy of 88%. The benefit of the proposed approach is a more comprehensive understanding of an individual's free-living fall risk (from free-living IMU-based gait) without compromising privacy. More generally, the video and AI approach could be used beyond fall risk to better inform habitual experiences and challenges across a range of clinical cohorts. As medicine becomes more receptive to wearables as a helpful toolbox, camera-based devices should be considered plausible instruments.


Conflict of interest statement

Author A.G. is Deputy Editor of npj Digital Medicine. The remaining authors declare no competing interests.

Figures

Fig. 1. Video-based data capture could be gathered from any location via a wearable camera.
Typically, common wear locations include the chest or waist (1); alternative locations using more routinely worn wearables include the wrist (watch) or face/head via glasses (1). A computer vision (CV) model implements YOLOv8 (2a), drawing upon a well-characterized and comprehensive ground-truth learning dataset (2b) and the necessary libraries (2c, 2d, and 2e) within a suitable analytical environment (2f). The images to the right detail how the raw/original data (top) are anonymized, with only the latter visible as an output; red locks indicate what is analyzed and then deleted, while a green lock indicates the remaining image available for viewing. (A wearable IMU to quantify gait is worn on the lower back, not shown.) The algorithm selectively anonymizes only specific privacy-conscious objects, such as screens, people, and documents, leaving the remaining content unanonymized to allow a better understanding of the environment in edge cases where a frame must be manually investigated.
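The selective-anonymization step described above can be sketched as a region-pixelation routine. This is a minimal illustration, not the authors' implementation: in the pipeline, the bounding boxes would come from a YOLOv8 detector flagging privacy-sensitive objects (people, screens, documents); here, the boxes and the frame are synthetic placeholders so the obfuscation logic stands alone.

```python
import numpy as np

def pixelate_regions(frame, boxes, block=16):
    """Coarsely pixelate each (x1, y1, x2, y2) region of a frame.

    `boxes` would in practice come from an object detector (e.g., YOLOv8);
    here they are supplied directly for illustration.
    """
    out = frame.copy()
    for x1, y1, x2, y2 in boxes:
        roi = out[y1:y2, x1:x2]
        h, w = roi.shape[:2]
        if h == 0 or w == 0:
            continue  # skip degenerate boxes
        # Keep one pixel per block, then tile it back up: the region becomes
        # unrecognizable while the rest of the frame is left untouched.
        coarse = roi[::block, ::block]
        out[y1:y2, x1:x2] = np.repeat(
            np.repeat(coarse, block, axis=0), block, axis=1)[:h, :w]
    return out

# Synthetic 120x160 RGB frame with one hypothetical "sensitive" region.
rng = np.random.default_rng(0)
frame = rng.integers(0, 256, size=(120, 160, 3), dtype=np.uint8)
anon = pixelate_regions(frame, [(40, 30, 120, 90)])
```

In a full pipeline, each video frame would pass through the detector, the flagged regions through a routine like this one, and only the anonymized frame would be retained for viewing, consistent with the deleted-raw-data principle in the figure.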
Fig. 2. Video-based glasses showing the visual perspective (captured context; in this example, ascending indoor stairs) and the IMU device, along with a representation of both worn by a participant.
Fig. 3. Flat-level terrain (left) and participant stair ascent (right).
