Nat Biomed Eng. 2020 Jun;4(6):624-635.
doi: 10.1038/s41551-020-0534-9. Epub 2020 Apr 6.

A mountable toilet system for personalized health monitoring via the analysis of excreta

Seung-Min Park et al. Nat Biomed Eng. 2020 Jun.

Erratum in

Abstract

Technologies for the longitudinal monitoring of a person's health are poorly integrated with clinical workflows, and have rarely produced actionable biometric data for healthcare providers. Here, we describe easily deployable hardware and software for the long-term analysis of a user's excreta through data collection and models of human health. The 'smart' toilet, which is self-contained and operates autonomously by leveraging pressure and motion sensors, analyses the user's urine using a standard-of-care colorimetric assay that traces red-green-blue values from images of urinalysis strips, calculates the flow rate and volume of urine using computer vision as a uroflowmeter, and classifies stool according to the Bristol stool form scale using deep learning, with performance that is comparable to the performance of trained medical personnel. Each user of the toilet is identified through their fingerprint and the distinctive features of their anoderm, and the data are securely stored and analysed in an encrypted cloud server. The toilet may find uses in the screening, diagnosis and longitudinal monitoring of specific patient populations.
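The abstract's colorimetric urinalysis traces red-green-blue values from images of the test strips under controlled lighting. As an illustrative sketch only (the regions of interest, reference patch and values below are hypothetical, not the system's actual calibration), the following Python snippet normalizes a reagent-pad colour against a white reference patch to cancel illumination differences:

```python
import numpy as np

def mean_rgb(image: np.ndarray, roi: tuple) -> np.ndarray:
    """Mean RGB of a rectangular region (y0, y1, x0, x1) of an H x W x 3 image."""
    y0, y1, x0, x1 = roi
    return image[y0:y1, x0:x1].reshape(-1, 3).mean(axis=0)

def normalized_pad_rgb(image, pad_roi, white_roi):
    """Normalize the reagent-pad colour against a white reference patch to
    cancel illumination differences, then rescale to the 0-255 range."""
    pad = mean_rgb(image, pad_roi)
    white = mean_rgb(image, white_roi)
    return np.clip(pad / white * 255.0, 0.0, 255.0)

# Synthetic frame: a white reference patch and a greenish protein pad.
frame = np.zeros((100, 100, 3), dtype=float)
frame[0:20, 0:20] = [240.0, 240.0, 240.0]    # white reference under the LED strip
frame[40:60, 40:60] = [120.0, 200.0, 130.0]  # reagent pad after wetting
rgb = normalized_pad_rgb(frame, (40, 60, 40, 60), (0, 20, 0, 20))
print(rgb.round(1))
```

Comparing such normalized RGB values against a calibration table per reagent pad is the standard colorimetric read-out for urinalysis strips.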


Figures

Fig. 1 |
Fig. 1 |. Schematic of the toilet system.
A perspective view of a toilet with a mountable device for continuously measuring baselines of human excreta. The toilet system includes (1) a 10-parameter test-strip-based urinalysis with a retractable cartridge; (2) computer-vision uroflowmetry with two high-speed cameras (the blue dotted lines represent the FOV of each camera); (3) stool classification by deep learning (the blue dotted lines represent the FOV of the defecation-monitoring camera); (4) defecation-time measurement by a pressure sensor below the toilet seat (the red arrow represents the force applied to the pressure sensor); (5) two biometric identifications: an analprint scan (the green box represents the template-matching algorithm) and a fingerprint scanner on the flush lever; and (6) wireless transfer of all data to a cloud-based health portal. Right: photographs of the actual system mounted on a toilet.
Fig. 2 |
Fig. 2 |. Computer-vision urinalysis and uroflowmetry of the toilet system.
a, A systematic workflow (from left to right) showing how the toilet system analyses data from urine strips. Urination is detected by the motion sensor. The strip is deployed and retracted by a unipolar stepper motor. Raw data are captured with a charge-coupled device (CCD) camera, with lighting conditions normalized by a white LED strip. b, The RGB-value kinetics of the protein pad, tested using a urine sample spiked with bovine serum albumin. NC denotes the negative control (no protein in solution). c, A 3D representation of strip colour changes with titrated protein levels. DRY denotes the pad's original colour before wetting. d, Schematic of the uroflowmetry implementation in the toilet system. Urination is captured by two high-speed cameras (the red and blue dotted lines represent the FOV of each camera). The background is subtracted from each camera frame to isolate the urine stream. The urination volume and flow rate are estimated from the two corrected video frames. e, An overlay of two graphs: one from the standard uroflowmeter (solid blue line) and one from the computer-vision uroflowmeter (red dots), which also captures the terminal dribbles that the standard uroflowmeter does not detect. f, Using a custom algorithm, the start and end points of a urination event were extracted automatically. As the video frame rate is fixed, the number of frames correlates linearly with the actual recorded time (Pearson's r = 0.96). Some outliers (yellow oval) were due to terminal dribbles that were not detected by the gold-standard method. The blue arrow marks the urination time extracted from the data shown in e. Participant (P) number and age (years) are indicated. CV, computer vision.
g, Total voided volumes, obtained through depth and flow-rate correction across the two synchronized cameras, correlated with the total voided urine volume measured by standard uroflowmetry (Pearson's r = 0.92).
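The uroflowmetry steps in panels d and f can be sketched minimally: background subtraction isolates the stream in each frame, and the fixed frame rate converts a frame count into a urination time. The frame rate, threshold and noise floor below are assumed values for illustration, not the system's actual parameters:

```python
import numpy as np

FPS = 240.0  # assumed high-speed frame rate; the actual camera rate may differ

def stream_mask(frame: np.ndarray, background: np.ndarray, thresh: float = 25.0):
    """Boolean foreground mask for the urine stream via background subtraction."""
    diff = np.abs(frame.astype(float) - background.astype(float))
    return diff.max(axis=-1) > thresh

def urination_time(pixels_per_frame, noise_floor: int = 5) -> float:
    """Event duration in seconds: span from the first to the last frame whose
    foreground pixel count exceeds a noise floor, divided by the frame rate."""
    active = [i for i, n in enumerate(pixels_per_frame) if n >= noise_floor]
    if not active:
        return 0.0
    return (active[-1] - active[0] + 1) / FPS

# Background subtraction on a toy frame with one bright foreground pixel.
bg = np.zeros((4, 4, 3))
fr = bg.copy()
fr[1, 1] = [200.0, 200.0, 200.0]
print(int(stream_mask(fr, bg).sum()))  # one foreground pixel detected

# Per-frame foreground pixel counts for a short synthetic event.
print(urination_time([0, 0, 8, 20, 30, 12, 6, 0, 0]))
```

Because the timestamp comes from the frame index alone, this reproduces the linear frame-count-to-time relationship of panel f; low-amplitude terminal dribbles survive as long as they exceed the noise floor.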
Fig. 3 |
Fig. 3 |. CNN for stool analysis.
a, Comparison of the confusion matrices for the colorectal stool classifications made by two surgeons. The values on the x and y axes indicate the BSFS class. Two confusion matrices were generated by defining the classification of one surgeon as 'ground truth', as the BSFS is based purely on visual assessment. To calculate the percentages for each class, each cell was normalized by its row total. Two multiclass classification metrics were calculated and are shown: the Matthews correlation coefficient (MCC; −1 for complete misclassification, 1 for perfect classification) and the confusion entropy (CEN; 0 for perfect classification, 1 for complete misclassification). b, t-Distributed stochastic neighbour embedding (t-SNE) visualization of the last hidden-layer representations in the CNN for the seven stool classes. The graph describes the CNN's internal representation of the seven stool classes, obtained by applying t-SNE, a method for visualizing high-dimensional data, to the last hidden-layer representations of the board-certified-surgeon-verified photographic test sets (2,362 images). The coloured point clouds represent the different stool categories, showing how the CNN clusters the stools. Inset: example images corresponding to various class points. c, A schematic of the stool taxonomy. BS1 and BS2 of the BSFS are regarded as abnormally hard stools (constipation), whereas BS6 and BS7 are regarded as abnormally liquid/soft stools (diarrhoea). The other types, BS3, BS4 and BS5, are generally considered the normal and most common stool forms in the healthy adult population (normal). d, CNN layout. A deep CNN was used as the classification technique: the final layer of the pretrained Inception v.3 model was retrained with new categories, namely toilet states and stool states.
The training classes comprise ten toilet classes: the seven BSFS classes plus three other toilet states (clean, urine and toilet paper). e, The performance of the CNN was compared with that of medical students. Five medical students produced a single prediction per image (red points). The averages and s.d. of the medical students for each task are represented by the green points, lines and caps. On a completely new stool image set, the CNN achieved accuracy comparable with that of trained medical students. All AUCs were greater than 0.91. The gold standard for BSFS classification was provided by a board-certified general surgeon (subspecialty of coloproctology).
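The transfer-learning recipe in panel d (retraining only the final layer of a pretrained Inception v.3) amounts to freezing the feature extractor and training a new softmax head on its embeddings. A framework-agnostic sketch under that assumption, with toy clustered features standing in for Inception embeddings and illustrative hyperparameters:

```python
import numpy as np

rng = np.random.default_rng(0)

def train_softmax_head(features, labels, n_classes, lr=0.5, epochs=300):
    """Train only a new softmax output layer on frozen CNN features,
    mirroring the 'retrain the final layer' transfer-learning recipe."""
    n, d = features.shape
    W = np.zeros((d, n_classes))
    b = np.zeros(n_classes)
    onehot = np.eye(n_classes)[labels]
    for _ in range(epochs):
        logits = features @ W + b
        logits -= logits.max(axis=1, keepdims=True)   # numerical stability
        probs = np.exp(logits)
        probs /= probs.sum(axis=1, keepdims=True)
        grad = (probs - onehot) / n                    # cross-entropy gradient
        W -= lr * (features.T @ grad)
        b -= lr * grad.sum(axis=0)
    return W, b

# Toy 'features': two separable clusters standing in for Inception embeddings.
feats = np.vstack([rng.normal(0, 0.3, (50, 4)) + [1, 0, 0, 0],
                   rng.normal(0, 0.3, (50, 4)) + [0, 1, 0, 0]])
labels = np.array([0] * 50 + [1] * 50)
W, b = train_softmax_head(feats, labels, n_classes=2)
acc = ((feats @ W + b).argmax(axis=1) == labels).mean()
print(f"training accuracy: {acc:.2f}")
```

In the actual pipeline the ten-way head (seven BSFS classes plus clean, urine and toilet paper) would replace the two toy classes, with Inception v.3's penultimate-layer activations as the frozen features.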
Fig. 4 |
Fig. 4 |. Defecation monitoring module of the toilet system.
a, A systematic workflow for defecation monitoring. As a user sits on the toilet for a defecation event, the pressure sensor below the toilet seat initiates the defecation-monitoring camera. The camera records the toilet bowl until the end of the defecation event. The collected images are then fed into the deep CNN layers for stool classification. b, When the pressure sensor installed on the toilet seat is triggered after the user sits down (t0), a video recording assisted by LED lighting is initiated. The video camera records the entire defecation process (from t0 to tf, plus an additional 30 s as a grace period). Fragmented frames are sent to a cloud system, where a machine learning algorithm performs toilet-state classification and stool classification. An example graph from participant 1 is shown; the participant's total seating time was 129 s. c, Acquired images are classified by the first custom CNN to determine the toilet state: clean, urine, stool or toilet paper. In the graph, the toilet starts in the clean state (0 to 4 s), changes to the stool state (4 s to 80 s) and finally changes to the toilet-paper state (80 s onwards). In this case, the end of defecation was determined by the point at which participant 1 used toilet paper. d, If the toilet is in the stool state, the images are passed to the second custom CNN to determine their BSFS class. The graph indicates that the CNN prediction of BSFS is predominantly BS4, representing normal stool. When the pressure sensor returns a null signal (tf), the fingerprint scanner installed on the tank flush identifies the user and the user information is annotated to the data. e, In addition to the stool classification, other information is obtained, such as the total defecation time (either t2 − t1 or tf − t1) and the time from seating to the first stool drop (t1 − t0). Ten series of defecation events completed by participant 1 are shown.
All of the stool BSFS classifications were benchmarked against both surgeons (bar graphs). f, Eleven participants (five female and six male, aged 19 to 41) generated 55 defecation events. Associated information, such as first stool drop times, defecation durations and total seating times, is shown.
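The timing logic of panels b-e can be sketched as follows, assuming one toilet-state prediction per second and hypothetical state labels; t0 is the seat-sensor trigger (index 0) and tf the seat release (last index):

```python
def event_times(timeline):
    """Derive timing from a per-second sequence of toilet-state predictions.
    t0 (seat-sensor trigger) is index 0 and tf (seat release) the last index;
    t1/t2 are the first/last frames classified as the stool state."""
    stool = [t for t, state in enumerate(timeline) if state == "stool"]
    if not stool:
        return None  # no stool detected during the sitting
    t1, t2 = stool[0], stool[-1]
    return {
        "first_drop_s": t1,             # t1 - t0, with t0 = 0
        "defecation_s": t2 - t1,        # t2 - t1
        "seated_s": len(timeline) - 1,  # tf - t0
    }

# Synthetic 129 s sitting: clean (0-4 s), stool (4-80 s), toilet paper after.
timeline = ["clean"] * 4 + ["stool"] * 76 + ["toilet_paper"] * 49
print(event_times(timeline))
```

The transition into the toilet-paper state plays the role of t2 in panel c: once paper is detected, later frames no longer count towards defecation time.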
Fig. 5 |
Fig. 5 |. Biometric identification using the fingerprint and the anal creases (the distinctive features of the anoderm, or 'analprint').
a, Installation of a fingerprint scanner in the flush lever. We performed 410 fingerprinting trials with 10 participants; the associated histogram and ROC curve are shown (AUC = 0.95). b, Schematic of analprint analysis for biometric identification. Video frames (high-definition resolution) acquired by a camera were analysed and referenced to stored image sets. Three algorithms were used to compare the two sets of images: the MSE measure, SSIM and a CNN. Among the 11 participants, two video clips of the anus per participant were acquired from 7 participants, whereas one video clip per participant was acquired from 4 participants. c, As input, individual frames of the anus from participant 1 were used for identification. Simple box plots (medians, first and third quartiles), overlaid with data points representing individual frame comparisons, are shown for the MSE index and the SSIM calculation. We performed 10,201 comparisons among video-extracted frames for each participant, except for participant 4, for whom we performed 10,908 comparisons. Probability histograms with error bars (standard deviations from averaging individual frames) are shown for the CNN. For each participant, 301 frames were extracted from the video and fed to the CNN; the averages of the CNN predictions from the 301 frames are shown. When combined, all three methods provide precise user recognition with high accuracy.
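A minimal sketch of the two classical frame-comparison measures in panel c, applied to synthetic frames standing in for analprint images. The SSIM here is a simplified single-window version using global statistics; library implementations such as skimage.metrics.structural_similarity use a sliding window instead:

```python
import numpy as np

def mse(a, b):
    """Mean squared error between two equally sized greyscale frames."""
    return float(np.mean((a.astype(float) - b.astype(float)) ** 2))

def ssim_global(a, b, L=255.0):
    """Single-window SSIM from global mean, variance and covariance;
    constants c1, c2 follow the usual (0.01*L)^2 and (0.03*L)^2 convention."""
    a, b = a.astype(float), b.astype(float)
    c1, c2 = (0.01 * L) ** 2, (0.03 * L) ** 2
    mu_a, mu_b = a.mean(), b.mean()
    va, vb = a.var(), b.var()
    cov = ((a - mu_a) * (b - mu_b)).mean()
    return ((2 * mu_a * mu_b + c1) * (2 * cov + c2)) / (
        (mu_a ** 2 + mu_b ** 2 + c1) * (va + vb + c2))

rng = np.random.default_rng(1)
ref = rng.integers(0, 256, (64, 64))                       # stored reference frame
same = np.clip(ref + rng.normal(0, 2, ref.shape), 0, 255)  # same user, camera noise
other = rng.integers(0, 256, (64, 64))                     # different user
print(mse(ref, same) < mse(ref, other))        # matching frame: lower MSE
print(ssim_global(ref, same) > ssim_global(ref, other))    # matching frame: higher SSIM
```

A matching frame scores low MSE and SSIM near 1 against its stored reference; the CNN route in panel c instead learns a per-user classifier, and combining all three scores gives the final identification.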

Comment in

  • A smart toilet for personalized health monitoring.
    Wang XJ, Camilleri M. Wang XJ, et al. Nat Rev Gastroenterol Hepatol. 2020 Aug;17(8):453-454. doi: 10.1038/s41575-020-0320-x. Nat Rev Gastroenterol Hepatol. 2020. PMID: 32483351 No abstract available.
  • A personalized view of excreta.
    Brdjanovic D. Brdjanovic D. Nat Biomed Eng. 2020 Jun;4(6):581-582. doi: 10.1038/s41551-020-0575-0. Nat Biomed Eng. 2020. PMID: 32533122 No abstract available.
