Comparative Study

J Health Serv Res Policy. 1998 Oct;3(4):203-6. doi: 10.1177/135581969800300404.

Can we trust the quality of routine hospital outpatient information in the UK? Validating outpatient data from the patient administration system (PAS)

A Shaw et al.

Abstract

Background: A validation study of routine hospital outpatient data was carried out as part of a broader project focusing on outpatient re-attendance. The aim was to compare two patient administration system (PAS) data items with the same information collected directly from hospital clinicians.

Methods: A total of 140 cases from four specialties at four National Health Service hospitals were randomly selected for comparison. The specific data items compared were the grade of doctor seen and the management decision taken following an outpatient appointment. The proportion of cases in which there was agreement was calculated, together with kappa values and relevant statistics indicating the accuracy of the PAS data when compared with information compiled immediately after the consultation by the relevant clinician.

Results: There was agreement between the clinician's data and the PAS data in 118/140 (84.3%) cases for grade of doctor seen and in 105/139 (76.7%) cases for the management decision. There was complete agreement for both items in 88/139 (62.6%) cases. Kappa values indicated good agreement between the two data sources. However, 'sensitivity' statistics suggested that the likely accuracy of each data item varied.
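The agreement measures used above can be illustrated with a minimal sketch. The `cohens_kappa` function below and the hypothetical rating lists in the usage example are assumptions for illustration only; they are not the study's data, which report raw agreement counts (e.g. 118/140 for grade of doctor seen) rather than the underlying case-level records.

```python
from collections import Counter


def percent_agreement(n_agree: int, n_total: int) -> float:
    """Raw agreement as a percentage, e.g. 118 agreeing cases out of 140."""
    return 100.0 * n_agree / n_total


def cohens_kappa(a: list, b: list) -> float:
    """Cohen's kappa: agreement between two raters, corrected for chance.

    a and b are equal-length lists of categorical ratings (here, one entry
    per case from each data source, e.g. PAS record vs. clinician record).
    """
    assert len(a) == len(b) and len(a) > 0
    n = len(a)
    # Observed agreement: proportion of cases where the sources match.
    p_o = sum(x == y for x, y in zip(a, b)) / n
    # Expected chance agreement from each source's marginal frequencies.
    ca, cb = Counter(a), Counter(b)
    p_e = sum(ca[c] * cb[c] for c in set(a) | set(b)) / (n * n)
    return (p_o - p_e) / (1 - p_e)


# Raw agreement for grade of doctor seen, as reported in the abstract:
print(round(percent_agreement(118, 140), 1))  # prints 84.3

# Hypothetical ratings from two sources (illustrative only):
pas = ["consultant", "consultant", "registrar", "registrar"]
clinician = ["consultant", "consultant", "registrar", "consultant"]
print(cohens_kappa(pas, clinician))
```

Kappa is preferred over raw agreement here because two sources can agree often by chance alone when one category dominates; kappa discounts that baseline.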

Conclusion: Although there was good agreement within individual categories between the two sources, 37% of computerised patient records contained at least one inconsistency in this small study focusing on only two data items. Further systematic evaluation is needed to test the extent to which other items are similarly discrepant.
