Assess the efficacy of staged interventions aimed at reducing inappropriate Clostridioides difficile testing and hospital-onset C. difficile infection (HO-CDI) rates.
Design:
Interrupted time series.
Setting:
Community-based.
Methods/Interventions:
National Healthcare Safety Network (NHSN) C. difficile metrics from January 2019 to November 2022 were analyzed across three interventions at a community-based healthcare system. The interventions were: (1) an electronic medical record (EMR)-based hard stop requiring confirmation of ≥3 loose or liquid stools over 24 h, (2) infectious diseases (ID) review and approval of testing ordered >3 days after hospital admission, and (3) infection control practitioner (ICP) review combined with a switch to a reverse two-tiered clinical testing algorithm.
Results:
After all interventions, the number of C. difficile tests per 1,000 patient-days (PD) and HO-CDI cases per 10,000 PD decreased from 20.53 to 6.92 and from 9.80 to 0.20, respectively. The EMR hard stop resulted in a 28% reduction in the CDI testing rate (adjusted incidence rate ratio [aIRR]: 0.72; 95% confidence interval [CI], 0.53–0.96), and ID review resulted in a 42% reduction in the CDI testing rate (aIRR: 0.58; 95% CI, 0.42–0.79). Changing to the reverse testing algorithm reduced the reported HO-CDI rate by 95% (cIRR: 0.05; 95% CI, 0.01–0.40).
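For readers unfamiliar with how incidence rate ratios of this kind are typically derived, the sketch below shows one common way to fit a segmented Poisson regression for an interrupted time series. It uses a small, entirely hypothetical monthly extract (test counts, patient-days, and step indicators for two interventions) with illustrative variable names; it is not the authors' exact model specification.

import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical monthly extract: C. difficile test counts, patient-days,
# and step indicators marking when each staged intervention was in place.
df = pd.DataFrame({
    "tests":        [410, 395, 388, 300, 288, 279, 210, 198, 190],
    "patient_days": [19800, 20150, 19900, 20500, 20300, 19700, 20100, 20400, 19950],
    "month":        list(range(9)),                 # time since start of series
    "post_int1":    [0, 0, 0, 1, 1, 1, 1, 1, 1],    # EMR hard stop in place
    "post_int2":    [0, 0, 0, 0, 0, 0, 1, 1, 1],    # ID review in place
})

# Poisson model of monthly test counts with patient-days as the exposure offset;
# exp(coefficient) of each step term is the rate ratio for that intervention.
model = smf.glm(
    "tests ~ month + post_int1 + post_int2",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["patient_days"]),
).fit()

print(np.exp(model.params))      # rate ratios
print(np.exp(model.conf_int()))  # 95% confidence intervals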
Conclusions:
Staged interventions aimed at improving diagnostic stewardship were effective in reducing overall CDI testing in a community healthcare system.
In this manuscript, we highlight current literature on environmental hygiene techniques to combat reservoirs of antibiotic resistant organisms in the healthcare environment. We discuss several topics for each strategy, including mechanism of action, assessment of effectiveness based on studies, cost, and real-world translatability. The techniques and topics summarized here are not inclusive of all available environmental hygiene techniques but highlight some of the more popular and investigated strategies. We focus on the following: Ultraviolet radiation, hydrogen peroxide vapor, copper-coated surfaces, phages, interventions involving sinks, and educational initiatives.
Of 313 patients whose outpatient parenteral antimicrobial therapy was managed by an ID physician, only 39 (12.5%; 95% CI, 8.8%–16.1%) had clinical decisions influenced by erythrocyte sedimentation rate (ESR), C-reactive protein (CRP), or both. ESR/CRP ordering was associated with $530 in excess cost per treatment course (average duration 5.1 weeks), representing a diagnostic stewardship opportunity.
Patients discharged from emergency departments (ED) with antibiotics for common infections often receive unnecessarily prolonged durations, representing a target for transition of care (TOC) antimicrobial stewardship intervention.
Methods:
This study aimed to evaluate the effectiveness of TOC pharmacists’ review on decreasing the duration of oral antibiotics prescribed at discharge from the ED at an academic medical center. Pharmacist interventions were guided by antibiotic duration-of-therapy guidance for respiratory, urinary, and skin infections that was developed and implemented by the antimicrobial stewardship program. Pharmacist interventions from January 27, 2023, to December 29, 2023, were analyzed to quantify the total number of antibiotic days saved and the percentage of provider acceptance.
Results:
The ED TOC pharmacists reviewed a total of 157 oral antibiotic prescriptions; 86.6% of the reviews required pharmacist intervention. The most common indications for the discharge antibiotics were urinary tract infections (50.0%) and skin infections (23.4%). A total of 155 antibiotic days were saved, with a provider acceptance rate of 76.5%. In 21% of cases, providers did not count the antibiotic doses administered in the ED, contributing to unnecessarily prolonged durations. Within 30 days of the index ED discharge, 10.2% of patients re-presented to the ED and 6.4% were hospitalized.
Conclusion:
The transitions of care pharmacist-led intervention was successful in optimizing the duration of discharge oral antibiotics in the ED utilizing prospective audit and feedback based on institutional guidance. The ED represents a high-yield setting for TOC-directed antimicrobial stewardship.
The high cost of antimicrobials presents critical challenges for healthcare providers managing infections amidst the growing threat of antimicrobial resistance (AMR). High costs hinder access to necessary treatments, disproportionately affecting disadvantaged populations and exacerbating health disparities. High drug prices necessitate the use of less effective or more toxic alternatives, leading to suboptimal outcomes and prolonged hospitalizations. This, in turn, increases healthcare costs and undermines efforts to combat AMR. Equitable policies, national formularies, and cost caps for essential antimicrobials can ensure universal access to life-saving treatments and enable antimicrobial stewardship programs to ensure the best possible outcomes.
Rapid blood culture identification is most effective with antimicrobial stewardship feedback, which is limited during non-business hours. We implemented overnight review of Blood Culture Identification 2 panel results by intensive care unit pharmacists and demonstrated reduced time to evaluation (3.6 vs 9.3 hours, P < .01).
To evaluate the impact of implementing a clinical care guideline for uncomplicated gram-negative bloodstream infections (GN-BSI) within a health system.
Design:
Retrospective, quasi-experimental study.
Setting:
A large academic safety-net institution.
Participants:
Adults (≥18 years) with GN-BSI, defined by at least one positive blood culture for specific gram-negative organisms. Patients with polymicrobial cultures or contaminants were excluded.
Interventions:
Implementation of a GN-BSI clinical care guideline based on a 2021 consensus statement, emphasizing 7-day antibiotic courses, use of highly bioavailable oral antibiotics, and minimizing repeat blood cultures.
Results:
The study included 147 patients pre-intervention and 169 post-intervention. Interrupted time series analysis showed a reduction in the median duration of therapy (–2.3 days, P = .0016), with a sustained decline (slope change –0.2103, P = .005) post-intervention. More patients received 7 days of therapy (12.9% vs 58%, P < .01), oral antibiotic transitions increased (57.8% vs 72.2%, P < .05), and guideline-concordant oral antibiotic selection was high. Repeat blood cultures decreased (50.3% vs 30.2%, P < .01) without an increase in recurrent bacteremia. No significant differences were observed in 90-day length of stay, rehospitalization, recurrence, or mortality.
Conclusions:
Guideline implementation was associated with shorter antibiotic therapy durations, increased use of guideline-concordant oral antibiotics, and fewer repeat blood cultures without compromising patient outcomes. These findings support the effectiveness of institutional guidelines in standardizing care, optimizing resource utilization, and promoting evidence-based practices in infectious disease management.
Evaluate prescribing practices and risk factors for treatment failure in obese patients treated for purulent cellulitis with oral antibiotics in the outpatient setting.
Design:
Retrospective, multicenter, observational cohort.
Setting:
Emergency departments, primary care, and urgent care sites throughout Michigan.
Patients:
Adult patients with a body mass index of ≥30 kg/m² who received ≥5 days of oral antibiotics for purulent cellulitis were included. Key exclusion criteria were chronic infections, antibiotic treatment within the past 30 days, and suspected polymicrobial infections.
Methods:
Obese patients receiving oral antibiotics for purulent cellulitis between February 1, 2020, and August 31, 2023, were assessed. The primary objective was to describe outpatient prescribing trends. Secondary objectives included comparing patient risk factors for treatment failure and safety outcomes between patients experiencing treatment success and those experiencing treatment failure.
Results:
Two hundred patients were included (Treatment success, n = 100; Treatment failure, n = 100). Patients received 11 antibiotic regimens with 26 dosing variations; 45.5% were inappropriately dosed. Sixty-seven percent of patients received MRSA-active therapy. Treatment failure was similar between those appropriately dosed (46.4%) versus under-dosed (54.4%) (P = 0.256), those receiving 5–7 days of therapy (47.1%) versus 10–14 days (54.4%) (P = 0.311), and those receiving MRSA-active therapy (52.2%) versus no MRSA therapy (45.5%) (P = 0.367). Patients treated with clindamycin were more likely to experience treatment failure (73.7% vs 47.5%, P = 0.030).
Conclusions:
Nearly half of antimicrobial regimens prescribed for outpatient treatment of cellulitis in patients with obesity were suboptimally prescribed. Opportunities exist to optimize agent selection, dosing, and duration of therapy in this population.
The overuse and inappropriate use of antimicrobials have led to environmental waste and drug shortages. This challenges the ecological and economical sustainability of our healthcare system and worsens antimicrobial resistance.
Antimicrobial stewardship programs (ASP) commonly consider the cost of drug acquisition but may fail to recognize the hidden costs of multi-dose intravenous regimens, including additional nursing administration time, tubing and fluids, and potentially increased hospital length of stay. They also rarely consider the environmental impact of medical waste creation and disposal, which contributes to the global antimicrobial resistance crisis. These costs are harder to calculate but crucial to a comprehensive assessment of a medication’s total impact. In this invited commentary, we provide an example of a stewardship evaluation at our institution focused on changing from meropenem (MER) to ertapenem (ETP) for infections caused by extended-spectrum beta-lactamase-producing organisms. We found that, despite an increase in acquisition costs, changing from MER to ETP is associated with overall savings and decreased waste production. A secondary analysis suggests that length of stay may also be improved with this substitution.
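To illustrate the kind of total-cost accounting described here, the short sketch below compares a thrice-daily and a once-daily intravenous regimen. Every figure in it (drug acquisition cost, nursing time and rate, consumables) is a hypothetical placeholder rather than the institution's actual data, and the function name is invented for illustration.

# Hypothetical total-cost comparison of a multi-dose vs a once-daily IV regimen.
# All dollar figures and times are illustrative placeholders.

def daily_cost(doses_per_day, drug_cost_per_dose, nursing_min_per_dose,
               nursing_rate_per_hr=50.0, consumables_per_dose=4.0):
    """Total daily cost: drug + nursing administration time + tubing/fluids."""
    nursing_cost = doses_per_day * (nursing_min_per_dose / 60) * nursing_rate_per_hr
    return doses_per_day * (drug_cost_per_dose + consumables_per_dose) + nursing_cost

mer_q8h = daily_cost(doses_per_day=3, drug_cost_per_dose=10.0, nursing_min_per_dose=20)
etp_q24h = daily_cost(doses_per_day=1, drug_cost_per_dose=45.0, nursing_min_per_dose=20)

print(f"MER q8h:  ${mer_q8h:.2f}/day")   # $92.00/day
print(f"ETP q24h: ${etp_q24h:.2f}/day")  # $65.67/day (higher acquisition, lower total)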
We present a holistic approach to antimicrobial stewardship that considers the total cost of an antimicrobial. By broadening their view to include hidden costs and secondary effects, ASPs can further demonstrate their value to the healthcare system, reduce resistance, and improve their environmental impact.
At Saint George Hospital University Medical Center in Beirut, Lebanon, we (1) determine annual blood culture (BC) contamination (BCC) and utilization (BCU) rates versus international benchmarks, (2) identify blood culture contaminants, (3) describe bloodstream infection episodes in patients with and without COVID-19 after the pandemic onset, and (4) examine epidemiologic trends in BCC and BCU.
Design:
Retrospective observational study.
Setting:
Private tertiary referral center, from January 1, 2010, to December 31, 2022.
Methods:
We define a contaminated BC as the growth of a typical contaminant/skin-flora organism in 1–2 of 4 BC bottles. We calculate the BCC rate as the percentage of contaminated BCs among all BCs drawn during the period, and the BCU rate as the number of BCs per 1,000 patient-days (PD).
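As a worked example of these definitions, using purely hypothetical counts rather than the hospital's data:

# Hypothetical one-year figures, used only to illustrate the rate definitions above.
total_bc = 1200          # blood cultures drawn during the period
contaminated_bc = 84     # cultures growing skin flora in 1-2 of 4 bottles
patient_days = 11600     # patient-days over the same period

bcc_rate = 100 * contaminated_bc / total_bc   # contamination rate, %
bcu_rate = 1000 * total_bc / patient_days     # utilization rate per 1,000 PD

print(f"BCC rate: {bcc_rate:.1f}%")                        # 7.0%
print(f"BCU rate: {bcu_rate:.1f} per 1,000 patient-days")  # 103.4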
Results:
The average BCU rate of 85.9/1000 PD in 2010–2019 increased to 106.6/1000 PD in 2020–2022. On average, patients with COVID-19 had a higher BCU rate of 185.9/1000 PD, corresponding to an additional 100 blood cultures/1000 PD. The average BCC rate was 7%, ranging from 6% in 2010–2019 to 8% in 2020–2022. We observed the highest BCC rate of 9% in patients with COVID-19, likely due to the higher BCU. The most frequently isolated contaminants were coagulase-negative Staphylococcus (96%), of which 65% were Staphylococcus epidermidis.
Conclusion:
We observed a multifactorial, persistently elevated BCC rate over 13 years that was unaffected by strict infection control practices. We believe that further research targeting a standardized, low BCU, rather than the largely inevitable BCC, is essential, along with advocacy for diagnostic stewardship in low- and middle-income countries, especially where lack of appropriate resource allocation and awareness is problematic.
A β-lactam plus a macrolide or a respiratory fluoroquinolone alone is recommended as standard empiric antibacterial therapy for adults hospitalized with non-severe community-acquired pneumonia (CAP) per Infectious Diseases Society of America guidelines. However, the evidence in support of adding empiric atypical antibacterial therapy, and specifically the addition of a macrolide, is conflicting and should be balanced against additional factors: the necessity of covering atypical organisms, the benefits of macrolide-associated immunomodulation, the harms associated with antibiotic use, and selection for antibiotic-resistant organisms. In this review, we examine the role of atypical coverage in standard treatment regimens for patients admitted with non-severe CAP, focusing specifically on the addition of macrolides to β-lactams. We conclude that a subset of patients should not be given atypical coverage as part of their regimen.
The COVID-19 pandemic highlighted gaps in infection control knowledge and practice across health settings nationwide. The Centers for Disease Control and Prevention, with funding through the American Rescue Plan, developed Project Firstline, a national collaborative aiming to reach all aspects of the healthcare frontline. The American Medical Association recruited eight physicians and one medical student to join its director of infectious diseases in developing educational programs targeting knowledge gaps. The team has identified five critical areas requiring national attention.
To determine and compare the intraoperative durability of 4 major surgical glove brands.
Design, Setting, and Participants:
This study was a randomized, open-label clinical trial in which surgical gloves from 4 manufacturers were randomized across 5 surgical subspecialty study groups: (1) orthopedic surgery, (2) neurosurgery, (3) colorectal surgery, (4) trauma or acute general surgery, and (5) cardiac and plastic surgeries. The study was divided into 10 periods with a cross-over design and was conducted at a tertiary care academic medical center. Participants were licensed and certified physicians, physicians-in-training, scrub nurses, or technicians working within the sterile field.
Interventions:
Participants from each study group were randomly assigned to 1 of 4 surgical glove manufacturer types and subsequently rotated through the other 3 glove brands such that each participant acted as their own control in the sequential cross-over design.
Main Outcomes and Measures:
The primary outcome was to determine and compare the intraoperative failure rate of Biogel® Sterile Surgical undergloves against sterile surgical undergloves from 3 other manufacturers, both as a combined competitor group and individually.
Results:
There were no differences between brands with respect to the primary outcome of underglove intraoperative failures. Brand 1 wearers were slightly more likely to detect glove failures when they occurred.
Conclusion:
The durability of surgical gloves intraoperatively is similar across 4 major glove manufacturers. Detection of intraoperative failures is infrequent, though specific glove characteristics may promote enhanced detection. Recognition of glove perforations intraoperatively is important in the maintenance of a maximally sterile field.
Prior studies evaluating the impact of discontinuation of contact precautions (DcCP) on methicillin-resistant Staphylococcus aureus (MRSA) outcomes have characterized all healthcare-associated infections (HAIs) rather than those likely preventable by contact precautions. We aimed to analyze the impact of DcCP on the rate of MRSA HAI including transmission events identified through whole genome sequencing (WGS) surveillance.
Design:
Quasi experimental interrupted time series.
Setting:
Acute care medical center.
Participants:
Inpatients.
Methods:
The effect of DcCP (discontinuing the use of gowns and gloves for encounters with patients with MRSA carriage) was evaluated using time series analysis of MRSA HAI rates from January 2019 through December 2022, and WGS-defined attributable transmission events were compared before and after DcCP in December 2020.
Results:
The MRSA HAI rate was 4.22/10,000 patient days before and 2.98/10,000 patient days after DcCP (incidence rate ratio [IRR] 0.71 [95% confidence interval 0.56–0.89]), with a significant immediate decrease (P = .001). There were 7 WGS-defined attributable transmission events before and 11 events after DcCP (incidence rate ratio 0.90 [95% confidence interval 0.30–2.55]).
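For context, a crude before/after incidence rate ratio of this kind can be computed directly from event counts and patient-day denominators. The sketch below uses entirely hypothetical numbers and a simple Wald (log-scale) confidence interval, which is one common approximation and not necessarily the method used in this study.

import math

# Hypothetical counts and denominators, for illustration only.
events_before, pd_before = 9, 150_000     # transmission events / patient-days
events_after, pd_after = 12, 240_000

irr = (events_after / pd_after) / (events_before / pd_before)

# Wald 95% CI on the log scale: SE(log IRR) = sqrt(1/a + 1/b)
se = math.sqrt(1 / events_after + 1 / events_before)
lo = math.exp(math.log(irr) - 1.96 * se)
hi = math.exp(math.log(irr) + 1.96 * se)

print(f"IRR {irr:.2f} (95% CI {lo:.2f}-{hi:.2f})")   # IRR 0.83 (95% CI 0.35-1.98)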
Conclusions:
DcCP did not result in an increase in MRSA HAIs or in WGS-defined attributable transmission events. Comprehensive analyses of the effect of transmission prevention measures should include outcomes specifically measuring transmission-associated HAI.
This study aimed to assess the actual burden of antibiotic use among end-of-life (EOL) patients in South Korea and to compare trends between cancer and non-cancer decedents.
Design:
Population-based mortality follow-back study.
Setting:
Data from the Korean National Health Insurance Database, covering the period from January 1, 2006, to December 31, 2018, provided for research by the National Health Insurance Service (NHIS), were used.
Participants:
All decedents from 2006 to 2018 were included and categorized as cancer decedents or non-cancer decedents.
Methods:
Annual antibiotic consumption rates and prescription rates were calculated, and Poisson regression was used to estimate their trends.
Results:
Overall antibiotic consumption rates decreased slightly among decedents in their final month, with a less pronounced annual rate of decrease among cancer decedents than among non-cancer decedents (0.4% vs 2.3% per year, P < .001). Over the study period, although narrow-spectrum antibiotics were used less, utilization and prescription of broad-spectrum antibiotics steadily increased, and prescription rates were higher in cancer decedents than in non-cancer decedents. Specifically, carbapenem prescription rates increased from 5.6% to 18.5% (RR 1.087, 95% CI 1.085–1.088, P < .001) in cancer decedents and from 2.9% to 13.2% (RR 1.115, 95% CI 1.113–1.116, P < .001) in non-cancer decedents.
Conclusions:
Our findings show that patients at the EOL, especially those with cancer, are increasingly and heavily exposed to broad-spectrum antibiotics. Antimicrobial stewardship measures are required in this population.
The past 10 years have brought paradigm-shifting changes to clinical microbiology. This paper explores the top 10 transformative innovations across the diagnostic spectrum, including not only state-of-the-art technologies but also preanalytic and postanalytic advances. Clinical decision support tools have reshaped testing practices, curbing unnecessary tests. Innovations like broad-range polymerase chain reaction and metagenomic sequencing, whole genome sequencing, multiplex molecular panels, rapid phenotypic susceptibility testing, and matrix-assisted laser desorption ionization time-of-flight mass spectrometry have all expanded our diagnostic armamentarium. Rapid home-based testing has made diagnostic testing more accessible than ever. Enhancements to clinician-laboratory interfaces allow for automated stewardship interventions and education. Laboratory restructuring and consolidation efforts are reshaping the field of microbiology, presenting both opportunities and challenges for the future of clinical microbiology laboratories. Here, we review key innovations of the last decade.
Through the Centers for Medicare and Medicaid Services Promoting Interoperability Program, more hospitals will be reporting to the National Healthcare Safety Network Antimicrobial Use (AU) Option. We highlight the next steps and opportunities for measurement of AU to optimize prescribing.