
SHEA’s White Paper on Electronic Surveillance Data Requirements

Published online by Cambridge University Press:  05 January 2015

David Birnbaum*
Affiliation:
Applied Epidemiology, British Columbia, Canada

Type: Letters to the Editor

Copyright: © 2015 by The Society for Healthcare Epidemiology of America. All rights reserved

To the Editor—It is extremely disappointing that SHEA’s white paper, in discussing validation, makes no mention of Washington State’s work.1 Last year, in SHEA’s own journal, leaders from several divisions of the American Society for Quality recognized Washington State as the only state validating healthcare-associated infection reporting under a protocol consistent with American (Department of Defense MIL-STD-105 and American National Standards Institute Z1.4) and international (International Organization for Standardization 2859) standards for acceptance sampling.2 Throughout 5 years of continual operation, the Washington State Department of Health’s Healthcare Associated Infections Program annual validation protocol has proven practical for infection control programs in hospitals of all sizes; credible to certified quality professionals, by virtue of respecting their profession’s long-established generic standards; sustainable; and scalable.3,4 A technical reference manual, fully detailing all aspects of theory and practice, has been freely available since 2010.5 Conversely, the other approaches cited by Woeltje et al1 variously fail to document the underlying statistical theory, so that their sample sizes appear arbitrary (and thus lack statistical power details); oversample large hospitals while exempting smaller ones (and thus may neither build overall public confidence nor ensure that all facilities subject to public comparison are on a level playing field); fail to set and enforce a prespecified level of sensitivity and specificity performance (and thus do not provide the quality assurance that validation is understood to deliver in all other industries); or appear to require larger workloads than the method used by Washington State (and thus may not be the most cost-effective). In my own experience, “external” validation of sampled cases requires reviewing each entire clinical and laboratory record, best done on a site visit followed by a discussion of results with local program leadership, rather than relying solely on laboratory information systems or remote access. Furthermore, it is neither logical nor reasonable for electronic surveillance oversight to exempt itself from the generic validation methodologic standards respected in all other industries. Fortuna et al2 suggest that epidemiologists’ naïve and narrow understanding of validation stems from quality assurance being an unfamiliar statistical specialty. Like Washington State’s program, SHEA should collaborate in matters of validation with the certified quality engineers, certified quality managers, and certified quality auditors of the pertinent American Society for Quality divisions (eg, its healthcare, biomedical, statistics, and government divisions).
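To illustrate the acceptance-sampling logic that standards such as ANSI Z1.4 and ISO 2859 formalize, the following sketch computes the probability that a facility’s reporting passes validation under a single-sampling plan. The sample size (n = 50) and acceptance number (c = 1) are hypothetical parameters chosen for illustration only; they are not the Washington State protocol’s actual values.

```python
from math import comb

def acceptance_probability(n: int, c: int, p: float) -> float:
    """Probability a lot (here, a facility's reported cases) is accepted:
    P(discrepancies found in the sample <= c), modeling the number of
    discrepancies as Binomial(n, p) with true error rate p."""
    return sum(comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(c + 1))

# Hypothetical plan: audit n = 50 sampled records, accept the facility's
# reporting if at most c = 1 discrepancy is found. Sweeping p traces the
# plan's operating characteristic curve.
for p in (0.01, 0.05, 0.10):
    print(f"true error rate {p:.0%}: P(accept) = {acceptance_probability(50, 1, p):.3f}")
```

Making such parameters explicit is exactly what distinguishes a standards-based protocol from the ad hoc alternatives criticized above: the operating characteristic curve states, in advance, how likely a given error rate is to slip through.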

Acknowledgments

Financial support. None reported.

Potential conflicts of interest. D.B. reports that he was the Washington State Healthcare Associated Infections Program Manager from 2008 until May 2014.

References

1. Woeltje KF, Lin MY, Klompas M, Wright MO, Zuccotti G, Trick WE. Data requirements for electronic surveillance of healthcare-associated infections. Infect Control Hosp Epidemiol 2014;35:1083–1091.

2. Fortuna JA, Brenneman W, Storli S, Birnbaum D, Brown KL. The current state of validating the accuracy of clinical data reporting: lessons to be learned from quality and process improvement scientists. Infect Control Hosp Epidemiol 2013;34:611–614.

3. Birnbaum D, Fortuna JA. Washington State’s experience applying ISO 2859 validation methods to healthcare associated infections (HAI) reporting. In: Program and abstracts of Council of State & Territorial Epidemiologists Annual Conference; June 9–13, 2013; Pasadena, CA. Abstract 1386.

4. Lempp JM, Cummings MJ, Lovinger PG, Birnbaum DW. Cost of a sustainable annual validation process to ensure credibility of state HAI reporting. In: Program and abstracts of Council of State & Territorial Epidemiologists Annual Conference; June 22–26, 2014; Nashville, TN. Abstract 135.

5. Healthcare Associated Infections Program Group Administrator Instructions for Validation of Surveillance Programs. Document available upon request to Washington State Health Department’s Healthcare Associated Infections Program.