A cognitive evaluation of four online search engines for answering definitional questions posed by physicians
- PMID: 17990503
Abstract
The Internet is having a profound impact on physicians' medical decision making. One recent survey of 277 physicians showed that 72% regularly used the Internet to research medical information and 51% admitted that information from web sites influenced their clinical decisions. This paper describes the first cognitive evaluation of four state-of-the-art Internet search engines for answering definitional questions (i.e., questions of the form "What is X?") posed by physicians: Google (i.e., Google and Scholar.Google), MedQA, Onelook, and PubMed. Onelook is a portal for online definitions, and MedQA is a question answering system that automatically generates short texts to answer specific biomedical questions. Our evaluation criteria were quality of answer, ease of use, time spent, and number of actions taken. Our results show that MedQA outperforms Onelook and PubMed on most criteria and surpasses Google in time spent and number of actions, two important efficiency criteria, while Google ranks best in quality of answer and ease of use. We conclude that Google is an effective search engine for medical definitions, and that MedQA exceeds the other search engines in that it provides users with direct answers to their questions, whereas users of the other search engines must visit several sites before finding all of the pertinent information.
Similar articles
- Development, implementation, and a cognitive evaluation of a definitional question answering system for physicians. J Biomed Inform. 2007 Jun;40(3):236-51. doi: 10.1016/j.jbi.2007.03.002. Epub 2007 Mar 12. PMID: 17462961
- Speed, accuracy, and confidence in Google, Ovid, PubMed, and UpToDate: results of a randomised trial. Postgrad Med J. 2010 Aug;86(1018):459-65. doi: 10.1136/pgmj.2010.098053. PMID: 20709767. Clinical Trial.
- Beyond information retrieval--medical question answering. AMIA Annu Symp Proc. 2006;2006:469-73. PMID: 17238385. Free PMC article.
- An Improved Forensic Science Information Search. Forensic Sci Rev. 2015 Jan;27(1):41-52. PMID: 26227137. Review.
- Medical literature search dot com. Indian J Dermatol Venereol Leprol. 2011 Mar-Apr;77(2):135-40. doi: 10.4103/0378-6323.77451. PMID: 21393941. Review.
Cited by
- List-wise learning to rank biomedical question-answer pairs with deep ranking recursive autoencoders. PLoS One. 2020 Nov 9;15(11):e0242061. doi: 10.1371/journal.pone.0242061. eCollection 2020. PMID: 33166367. Free PMC article.
- AskHERMES: An online question answering system for complex clinical questions. J Biomed Inform. 2011 Apr;44(2):277-88. doi: 10.1016/j.jbi.2011.01.004. Epub 2011 Jan 21. PMID: 21256977. Free PMC article.
- A Natural Language Processing System That Links Medical Terms in Electronic Health Record Notes to Lay Definitions: System Development Using Physician Reviews. J Med Internet Res. 2018 Jan 22;20(1):e26. doi: 10.2196/jmir.8669. PMID: 29358159. Free PMC article.
- Automatically extracting information needs from complex clinical questions. J Biomed Inform. 2010 Dec;43(6):962-71. doi: 10.1016/j.jbi.2010.07.007. Epub 2010 Jul 27. PMID: 20670693. Free PMC article.
- Using the weighted keyword model to improve information retrieval for answering biomedical questions. Summit Transl Bioinform. 2009 Mar 1;2009:143-7. PMID: 21347188. Free PMC article.