Cad Saude Publica. 2004;20 Suppl 1:S34-43. doi: 10.1590/s0102-311x2004000700004. Epub 2004 May 20.

[Consistency between data sources and inter-observer reliability in the Study on Neonatal and Perinatal Morbidity and Mortality and Care in the City of Rio de Janeiro]

[Article in Portuguese]
Mônica Rodrigues Campos et al. Cad Saude Publica. 2004.
Free article

Abstract

The objective of this study was to evaluate data quality in the research project entitled "Study on Neonatal and Perinatal Morbidity and Mortality and Care in the City of Rio de Janeiro" by analyzing the completeness of patient records, inter-observer reliability, and concordance between data sources. The study interviewed a sample of 10,072 post-partum women, corresponding to 10.0% of deliveries in the City of Rio de Janeiro. This article analyzed the concordance between interview data and patient records and the reproducibility of the questionnaire by means of test/retest, using standard and prevalence-adjusted Kappa as well as the intra-class correlation coefficient. Losses totaled 4.5%, and the proportion of unknown data on patient records varied from 3.0% to 90.0%; the lower proportions were concentrated in the neonatal assessment and the higher ones in the data on maternal hospital admission. Concordance between data reported by the mother and data recorded on the patient record was high, with Kappa varying from 0.77 to 0.96. In the test/retest verification, Kappa varied from 0.61 to 0.94. The study thus demonstrated high inter-observer reliability as well as high reliability between the two data sources (interviewee and patient record).
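The abstract names standard Kappa and prevalence-adjusted Kappa as its agreement measures. As a minimal illustrative sketch (not from the article), the snippet below computes both for a hypothetical 2x2 agreement table; all counts and function names are assumptions for illustration only.

# Standard Cohen's Kappa and prevalence-adjusted (PABAK) Kappa for a 2x2 table.
# Counts are hypothetical, not taken from the study.

def cohen_kappa_2x2(a, b, c, d):
    # a = both sources "yes", b = source1 yes / source2 no,
    # c = source1 no / source2 yes, d = both "no"
    n = a + b + c + d
    po = (a + d) / n                                        # observed agreement
    pe = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2     # chance-expected agreement
    return (po - pe) / (1 - pe)

def pabak_2x2(a, b, c, d):
    # Prevalence- and bias-adjusted Kappa reduces to 2*Po - 1 for a 2x2 table
    n = a + b + c + d
    po = (a + d) / n
    return 2 * po - 1

# Hypothetical agreement between maternal interview and patient record
print(cohen_kappa_2x2(80, 5, 7, 108))
print(pabak_2x2(80, 5, 7, 108))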

