A new study found that blood culture contamination rates may be measured inaccurately at many hospitals because “there is not a standard definition of what constitutes contamination in a blood culture.”
Blood cultures are “widely used by acute care hospitals to find infections as early as possible, identify their cause, and guide appropriate treatment, particularly with antibiotics. Contamination of blood cultures may yield false positives or inaccurate diagnoses of bacterial infections and, in turn, may lead to unnecessary antibiotic exposure and prolonged length of hospitalization.”
U.S. healthcare facilities strive to limit contamination to “less than 3% of all blood cultures done, with an optimal goal of 1% or less.” Researchers surveyed 52 acute care hospitals across 19 states, analyzing “more than 360,000 blood cultures collected over a two-year period,” to determine how often that goal was being achieved.
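For illustration, a contamination rate of this kind is computed as the number of contaminated cultures divided by the total number of blood cultures drawn. The short sketch below checks hypothetical counts against the benchmarks described above; the counts and helper name are assumptions for illustration, not data from the study.

```python
def bcc_rate(contaminated: int, total: int) -> float:
    """Blood culture contamination (BCC) rate as a percentage.

    Hypothetical helper for illustration; counts are not from the study.
    """
    if total <= 0:
        raise ValueError("total blood cultures must be positive")
    return 100.0 * contaminated / total


# Hypothetical two-year counts for a single hospital.
contaminated, total = 450, 18_000
rate = bcc_rate(contaminated, total)

print(f"BCC rate: {rate:.2f}%")
print("Meets <3% benchmark:", rate < 3.0)
print("Meets <=1% optimal goal:", rate <= 1.0)
```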
The researchers found that “of the 52 hospitals surveyed, 65.4% used criteria from the CLSI [Clinical and Laboratory Standards Institute] or the College of American Pathologists (CAP) to define BCC [blood culture contamination], while 17.3% each used locally defined criteria or the CAP/CLSI criteria accompanied by the comprehensive list of nonpathogenic skin surface microorganisms (known as skin commensals) from the National Healthcare Safety Network (NHSN).” Only around “50% of the hospitals targeted a BCC threshold of less than 3%.”
The BCC rates were also “higher when the NHSN commensal list of microorganisms was considered.” This suggests that the lack of a standardized definition across hospitals may lead to inaccurate reporting of BCC rates.
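To make that dependence on the definition concrete, the sketch below classifies the same set of hypothetical positive cultures under a narrower, locally defined commensal list and under a broader list loosely modeled on the idea of the NHSN skin-commensal list. The organism names, counts, and both lists are illustrative assumptions, not the actual CAP/CLSI or NHSN criteria.

```python
# Hypothetical positive isolates drawn from 10,000 total blood cultures.
positives = {
    "Staphylococcus epidermidis": 180,
    "Cutibacterium acnes": 40,
    "Micrococcus luteus": 25,
    "Bacillus spp.": 15,
    "Escherichia coli": 300,  # treated as a true pathogen, never counted as contamination
}

# Illustrative definitions only; NOT the real CAP/CLSI or NHSN lists.
local_commensals = {"Staphylococcus epidermidis", "Cutibacterium acnes"}
broader_commensals = local_commensals | {"Micrococcus luteus", "Bacillus spp."}

total_cultures = 10_000


def bcc_rate(commensal_list: set[str]) -> float:
    """BCC rate (%) when only organisms on the given list count as contaminants."""
    contaminated = sum(n for org, n in positives.items() if org in commensal_list)
    return 100.0 * contaminated / total_cultures


print(f"Narrow local criteria:  {bcc_rate(local_commensals):.2f}%")    # 2.20%
print(f"Broader NHSN-style list: {bcc_rate(broader_commensals):.2f}%")  # 2.60%
```

The same underlying cultures produce a higher reported rate under the broader list, which is the pattern the survey observed when the NHSN commensal list was applied.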