Review

A review of existing and potential computer user interfaces for modern radiology

Antoine Iannessi et al. Insights Imaging. 2018 Aug;9(4):599-609. doi: 10.1007/s13244-018-0620-7. Epub 2018 May 16.

Abstract

The digitalization of modern imaging has led radiologists to become very familiar with computers and their user interfaces (UI). New options for display and command offer expanded possibilities, but the mouse and keyboard remain the most commonly used, for usability reasons. In this work, we review and discuss different UIs and their possible applications in radiology. We consider two-dimensional and three-dimensional imaging displays in the context of interventional radiology, and discuss the potential of touchscreens, kinetic sensors, eye detection, and augmented or virtual reality. We show that UI design specifically for radiologists is key to the future use and adoption of such new interfaces. Next-generation UIs must fulfil professional needs while taking contextual constraints into account.

Teaching points:
• The mouse and keyboard remain the most utilized user interfaces for radiologists.
• Touchscreens, holographic displays, kinetic sensors and eye tracking offer new possibilities for interaction.
• 3D and 2D imaging require specific user interfaces.
• Holographic display and augmented reality provide a third dimension to volume imaging.
• Good usability is essential for adoption of new user interfaces by radiologists.

Keywords: Computed tomodensitometry; Computer user interface; Interventional radiology; Virtual reality; Volume rendering.


Conflict of interest statement

Antoine Iannessi is a co-founder of Therapixel SA (therapixel.com). Therapixel is a medical imaging company developing custom user interfaces dedicated to surgeons.

Olivier Clatz is the CEO and a co-founder of Therapixel SA (therapixel.com).

Maki Sugimoto is the COO and a co-founder of Holoeyes Inc. (Holoeyes.jp). Holoeyes is a medical imaging company specializing in virtual reality and 3D imaging user interfaces.

Figures

Fig. 1
The usability of a computer user interface (CUI) in radiology is evaluated by three indicators. The UI is designed to maximize usability in a specified context of use. In medical imaging, the usage context can be defined as a user (the reader of the images) within his or her environment
Fig. 2
The human–machine interaction is constrained by human and contextual factors. Typically, the display is around 60 cm from the radiologist. At this distance, the field of view is around 50 cm (around 21 in.). Considering the maximum angular resolution of the eye, the display can have a maximum pixel pitch of 0.21 mm. This corresponds to a 3-megapixel screen. The human retina contains two types of photoreceptors, rods and cones. The cones are densely packed in a central yellow spot called the “macula” and provide maximum visual acuity. Visual examination of small detail involves focusing light from that detail onto the macula. Peripheral vision and rods are responsible for night vision and motion detection
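As a rough sanity check of the numbers quoted in the Fig. 2 caption, the short Python sketch below recomputes them. The one-arcminute value assumed for the eye's resolving power and the 2048 × 1536 panel geometry are commonly quoted approximations, not figures taken from the paper.

    import math

    # Back-of-the-envelope check of the display figures in Fig. 2.
    viewing_distance_mm = 600.0   # typical reading distance (~60 cm)
    pixel_pitch_mm = 0.21         # maximum pitch quoted in the caption
    cols, rows = 2048, 1536       # assumed 3-megapixel medical panel

    # Angle subtended by one pixel at the viewing distance, in arcminutes.
    pixel_angle_arcmin = math.degrees(math.atan(pixel_pitch_mm / viewing_distance_mm)) * 60
    print(f"one pixel subtends ~{pixel_angle_arcmin:.1f} arcmin (eye limit ~1 arcmin)")

    # Physical size and pixel count of a panel with that pitch.
    width_mm = cols * pixel_pitch_mm
    height_mm = rows * pixel_pitch_mm
    diagonal_in = math.hypot(width_mm, height_mm) / 25.4
    print(f"{width_mm:.0f} x {height_mm:.0f} mm, {diagonal_in:.1f} in diagonal, {cols * rows / 1e6:.1f} MP")

Both results are consistent with the caption: a 0.21-mm pitch sits close to the eye's resolving limit at 60 cm (about 1.2 arcmin per pixel), and a panel with that pitch filling a roughly 50-cm field of view comes out at about 21 in and 3 megapixels.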
Fig. 3
History of common computer user interfaces. The most common user interface (UI) is the graphical UI (GUI) used by operating systems (OS) of popular personal computers in the 1980s. It is designed with a mouse input to point to icons, menus and windows. Recently, new OS with specific interfaces for touchscreens have emerged, known as natural user interfaces (NUI)
Fig. 4
Potential use of manual and gaze input cascaded (MAGIC) pointing for diagnostic radiology. The radiologist is examining lung parenchyma. When he focuses on an anomaly, the eye-tracking device automatically moves the pointer to the vicinity of the region of interest. Most of the mouse movement is eliminated (dotted arrow); mouse input is limited to fine pointing and zooming. This cascade follows the observer's examination, making the interaction more natural
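To make the cascade concrete, here is a minimal Python sketch of the warp logic behind MAGIC pointing. The threshold value, the function name and the set_cursor callback are illustrative placeholders, not part of the paper or of any particular eye-tracker SDK.

    # Minimal sketch of the MAGIC (manual and gaze input cascaded) pointing idea.
    # All names are hypothetical; a real system would read fixations from an
    # eye-tracker SDK and move the cursor through the operating system.

    WARP_THRESHOLD_PX = 120  # warp only when the gaze lands far from the cursor

    def magic_pointing_step(gaze_xy, cursor_xy, set_cursor):
        """Coarse pointing follows the gaze; fine pointing stays with the mouse."""
        gx, gy = gaze_xy
        cx, cy = cursor_xy
        distance = ((gx - cx) ** 2 + (gy - cy) ** 2) ** 0.5
        if distance > WARP_THRESHOLD_PX:
            # Coarse phase: jump the pointer to the vicinity of the fixation,
            # removing the long mouse travel shown as the dotted arrow in Fig. 4.
            set_cursor(gx, gy)
        # Otherwise do nothing: the radiologist refines the position, zooms and
        # clicks with the mouse as usual.

The key design point is the threshold: small corrections stay entirely manual, so the pointer never fights the user's hand, and only large jumps are delegated to the gaze.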
Fig. 5
Tactile version of the Anywhere viewer (Therapixel, France). Cloud computing allows powerful post-processing with online PACS
Fig. 6
Potential use of Surface Studio® (Microsoft, Redmond, WA, USA). The screen is touch-sensitive (a). The stylus would be handy for direct measurement and annotation of the image (b). The wheel could be used to select functions such as windowing and to scroll through images (c). A properly designed software interface could replace traditional mouse-based workstations
Fig. 7
Touchless image viewer for operating rooms, Fluid (Therapixel, Paris, France). The surgeon or the interventional radiologist interacts in sterile conditions with gloved hands. The viewer interface is redesigned without a pointer; the tools are selected with lateral movements
Fig. 8
Current possibilities for 3D volume rendering (VR). VR post-processed from an MRI acquisition (a, b). Hyper-realistic cinematic VR processed from a CT scan acquisition (c, d)
Fig. 9
True 3D viewer (EchoPixel, Mountain View, CA, USA). This is the first holographic imaging viewer approved by the FDA as a tool for diagnosis as well as surgical planning. A stylus can be used to interact with the displayed volume of the colon, providing an accurate three-dimensional representation of the patient's anatomy
Fig. 10
Augmented reality using 3D medical images is employed for planning and guiding surgical procedures. Surgeons wear a head-mounted optical device to create augmented reality (a). They can interact with the volume of the patient’s liver during surgery. Spatial augmented reality is obtained by projecting the volume rendering onto the patient (b, c). This see-through visualization helps in guiding surgery
Fig. 11
Virtual reality headset with medical images. The user wears a head-mounted display that immerses him or her in the simulated environment. A hand-held device allows interaction with the virtual objects (a). The user experiences a first-person view inside the 3D volume of the medical images and can interact with it (a). The volume can be sliced, as in a CT scan, along any reformatted axis (b)
Fig. 12
User-centred design for a diagnostic imaging viewer. To maximize usability, the design of the interface needs to integrate both user and environmental requirements and constraints. Automation and computer-aided detection (CAD) should facilitate use, minimizing the time and effort required for image analysis
