2026 Mar 2;9(1):128. doi: 10.1038/s41746-025-02332-4.

Economic evaluation of a digital symptom checker for endometriosis using a Markov decision process model


Yihan Xu et al. NPJ Digit Med.

Abstract

Digital symptom checkers (SCs) are increasingly used to support early symptom recognition and care-seeking, yet evidence on their cost-effectiveness remains limited. We conducted an economic evaluation of a digital SC for endometriosis, a prevalent but underdiagnosed condition, as a case study. We developed a Markov decision process model to compare the digital SC with the standard of care from a societal perspective. Over a 40-year horizon, the digital SC reduced diagnostic delay by 4.36 years, generated 0.049 quality-adjusted life years (QALYs) per person, saved $5196.22 in costs, and produced an incremental net monetary benefit (INMB) of $10,089.00 at a $100,000/QALY threshold. Probabilistic sensitivity analysis confirmed the robustness of these findings, with an INMB of $12,398.92 (95% CI: $11,893.11–$12,904.72). Scenario analyses showed that the SC remained cost-effective under a wide range of assumptions, with the greatest value realized when sensitivity and specificity were ≥0.7, compliance exceeded 45%, and the time horizon was at least 10 years. This study provides the first economic evaluation of a digital SC for endometriosis and illustrates when and how digital SCs can deliver value to patients and health systems.
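The headline INMB figure can be reproduced from the abstract's own numbers using the standard definition INMB = WTP × ΔQALY − ΔCost. This is a minimal sketch using only values stated above; the small difference from the reported $10,089.00 presumably reflects rounding of the incremental QALYs to 0.049.

```python
# Sketch of the deterministic incremental net monetary benefit (INMB)
# calculation, using values reported in the abstract.
# INMB = WTP * delta_QALY - delta_cost (standard definition).

WTP = 100_000          # willingness-to-pay threshold, $ per QALY
delta_qaly = 0.049     # incremental QALYs per person (digital SC vs. standard care)
delta_cost = -5196.22  # incremental cost, $ (negative = cost saving)

inmb = WTP * delta_qaly - delta_cost
print(f"INMB: ${inmb:,.2f}")  # close to the reported $10,089.00 after rounding
```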


Conflict of interest statement

Competing interests: Y.X., A.M., H.S., A.W., and A.K. are employees of Flo Health. C.P. and L.Z. are contractors for Flo Health. S.T.-R. and J.M. are external consultants for Flo Health. A.M., A.W., H.S., L.Z., and A.K. hold equity interests in Flo Health.

Figures

Fig. 1
Fig. 1. Deterministic sensitivity analysis results showing the impact of key parameter variations on the cost-effectiveness of the Flo Symptom Checker versus standard of care at a willingness-to-pay threshold of $100,000 per QALY.
Parameters at the top of the chart have the greatest influence on model outcomes, while those at the bottom have a smaller effect. Only the top 15 parameters are displayed for clarity.
Fig. 2
Fig. 2. Impact of Flo Symptom Checker accuracy on incremental net monetary benefit (INMB) and diagnostic delay reduction: a two-way sensitivity analysis.
The left panel illustrates the effect on INMB at a willingness-to-pay threshold of $100,000 per QALY. The green zone represents combinations of accuracy thresholds where Flo SC remains cost-effective or reduces time to diagnosis, while the blue zone indicates scenarios where cost-effectiveness is not achieved or leads to further diagnostic delay. The right panel shows the effects on diagnostic delay reduction. Higher specificity is particularly critical for reducing diagnostic delay, as indicated by the green zone, while lower specificity results in minimal or even longer diagnostic delay.
Fig. 3
Fig. 3. Probabilistic sensitivity analysis results, visualized on an incremental cost-effectiveness ratio (ICER) plane for Flo SC versus standard care.
This scatter plot visualizes the results of 1000 Monte Carlo simulations, showing the incremental QALYs and incremental costs associated with Flo SC compared to standard care. The orange ellipse represents the 95% confidence interval of the ICER. The dotted green line represents the $100,000 per QALY willingness-to-pay (WTP) threshold. Most points fall in the fourth quadrant, indicating that Flo SC is generally cost-saving and more effective.
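The quadrant logic behind the ICER-plane reading in Fig. 3 can be sketched as follows. The simulation distributions below are illustrative placeholders, not the paper's actual model parameters; only the classification rule (a draw with positive incremental QALYs and negative incremental cost is dominant, otherwise cost-effectiveness depends on the WTP threshold) is standard.

```python
import random

# Illustrative classification of probabilistic sensitivity analysis draws
# on the ICER plane. Parameter distributions are hypothetical placeholders.
random.seed(0)
WTP = 100_000  # $ per QALY willingness-to-pay threshold

def classify(d_qaly, d_cost):
    """Label one (incremental QALY, incremental cost) draw."""
    if d_qaly > 0 and d_cost < 0:
        return "dominant (quadrant IV)"  # more effective and cost-saving
    return "cost-effective" if WTP * d_qaly - d_cost > 0 else "not cost-effective"

# 1000 illustrative Monte Carlo draws centered near the abstract's point estimates
draws = [(random.gauss(0.05, 0.01), random.gauss(-5200, 1500)) for _ in range(1000)]
dominant = sum(classify(q, c).startswith("dominant") for q, c in draws)
print(f"{dominant / len(draws):.0%} of draws are cost-saving and more effective")
```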
Fig. 4
Fig. 4. Distribution of diagnostic delay: standard of care versus Flo SC.
This histogram compares the distribution of diagnostic delays between standard care (blue) and Flo SC (pink), based on 1000 Monte Carlo simulations. The density curve (scaled) on the right indicates the relative frequency of each delay. The Flo SC distribution is centered around a shorter diagnostic delay (3.04 years), whereas the standard care distribution shows a longer delay (7.41 years).
Fig. 5
Fig. 5. Schematic diagram of the endometriosis care pathway model.

