Objective:
To understand healthcare workers’ (HCWs) beliefs and practices toward blood culture (BCx) use.
Design:
Cross-sectional electronic survey and semi-structured interviews.
Setting:
Academic hospitals in the United States.
Participants:
HCWs involved in BCx ordering and collection in adult intensive care units (ICU) and wards.
Methods:
We administered an anonymous electronic survey to HCWs and conducted semi-structured interviews with unit staff and quality improvement (QI) leaders in these institutions to understand their perspectives regarding BCx stewardship between February and November 2023.
Results:
Of 314 HCWs who responded to the survey, most were physicians (67.4%) and most were involved in BCx ordering (82.3%). Most survey respondents reported that clinicians had a low threshold to culture patients for fever (84.4%) and agreed they could safely reduce the number of BCx obtained in their units (65%). However, only half believed BCx was overused. Although most made BCx decisions as a team (74.1%), fewer than half reported that these team discussions occurred daily (42.4%). A third of respondents reported not usually collecting the correct volume per BCx bottle, half were unaware of the improved sensitivity of 2 BCx sets, and most were unsure of the nationally recommended BCx contamination threshold (87.5%). Knowledge regarding the utility of BCx for common infections was limited.
Conclusions:
HCWs’ understanding of best collection practices and yield of BCx was limited.
In this survey of 31 hospitals, large metropolitan facilities had 9.5-fold higher odds of reporting preparedness for special pathogens, and hospitals with special pathogens teams had 14.3-fold higher odds. In the postpandemic world, healthcare institutions must invest in special pathogen responses to maximize patient safety.
Cover crop residue retention on the soil surface can suppress weeds and improve organic no-till soybean (Glycine max) yield and profitability compared to a tilled system. Appropriate cereal rye (Secale cereale) fall planting dates and spring termination methods are critical to achieving these benefits. A plot-scale agronomic experiment was carried out from September 2018 to October 2021 in Kutztown, PA, USA to evaluate the influence of cereal rye planting date (September or October) and mechanical termination method [no-till (I & J roller-crimper, Dawn ZRX roller, and mow-ted) and tilled (plow-cultivate)] on cover crop regrowth density, weed biomass, soybean yield, and economic returns. In one of three years, the September rye planting accumulated more cover crop biomass than the October planting, but rye regrowth after roller-crimping was also greater with the September planting. Cover crop planting date had no effect on total weed biomass and had varying effects on soybean grain yield and economic returns. The Dawn ZRX roller outperformed the I & J roller-crimper in terminating cover crops, while the I & J roller-crimper provided more uniform weed suppression and led to greater soybean yields over the three years. Organic no-till strategies eliminated the need for tillage, reduced variable costs by 14% relative to plow-cultivated plots, and generated ~19% greater net revenue across the study period (no-till vs tillage = US$845 vs US$711 ha⁻¹). Terminating cereal rye with roller-crimping technology can be a positive investment in an organic soybean production system.
Background: Central-line–associated bloodstream infection (CLABSI) rates increased nationally during COVID-19, the drivers of which are still being characterized in the literature. CLABSI rates doubled during the SARS-CoV-2 omicron-variant surge at our rural academic medical center. We sought to identify potential drivers of CLABSIs by comparing period- and patient-specific characteristics of this COVID-19 surge to a historical control period. Methods: We defined the study period as the time of highest COVID-19 burden at our hospital (July 2021–June 2022) and the control period as the previous 2 years (July 2019–June 2021). We compared NHSN CLABSI standardized infection ratios (SIRs), central-line standardized utilization ratios (SURs), completion of practice evaluation tools (PETs) for monitoring of central-line bundle compliance, and proportions of traveling nurses. We performed chart reviews to determine patient-specific characteristics of NHSN CLABSIs during these periods, including demographics, comorbidities, central-line characteristics and care, and microbiology. Results: The CLABSI SIR was significantly higher during the study period than the control period (0.89 vs 0.52; P = .03); the SUR was significantly higher during the study period (1.08 vs 1.02; P < .01); the PET completion per 100 central-line days was significantly lower during the study period (23.0 vs 31.5; P < .01); and the proportion of traveling nurses was significantly higher during the study period (0.20 vs 0.08; P < .01) (Fig. 1). Patients with NHSN CLABSIs during the study period were more likely to have a history of COVID-19 (27% vs 3%; P = .01) and were more likely to receive a higher level of care (60% vs 27%; P = .02). During the study period, more patients had multilumen catheters (87% vs 61%; P = .04). The type of catheter, catheter care (ie, dressing changes and chlorhexidine bathing), catheter duration before CLABSI, and associated microbiology were similar between the study and control periods (Table 1). Conclusions: During the SARS-CoV-2 omicron-variant surge, the increase in CLABSIs at our hospital was significantly associated with increased central-line utilization, decreased PET completion, and increased proportion of traveling nurses. Critical illness and multilumen catheters were significant patient-specific factors that differed between CLABSIs from the study and control periods. We did not observe differences in catheter type, duration, or catheter care. Our study highlights key modifiable risk factors for CLABSI reduction. These findings may be surrogates for other difficult-to-measure challenges related to the culture of safety during a global pandemic, such as staff education related to infection prevention and daily review of central-line necessity.
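As a rough illustration of the ratio metrics reported above, the sketch below computes a standardized infection or utilization ratio and the PET completion rate; all counts are hypothetical, not the hospital’s actual data.

```python
# Minimal sketch of NHSN-style ratio metrics; counts are hypothetical.

def standardized_ratio(observed: float, predicted: float) -> float:
    """SIR or SUR: observed events divided by the NHSN risk-adjusted
    prediction (a ratio of 1.0 means observed matches prediction)."""
    return observed / predicted

# Hypothetical: 16 observed CLABSIs vs 18 predicted -> SIR = 0.89
print(round(standardized_ratio(16, 18), 2))

# PET completions per 100 central-line days (hypothetical counts)
pets_completed, central_line_days = 2300, 10000
print(100 * pets_completed / central_line_days)  # -> 23.0
```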
Objective:
To determine antibiotic prescribing appropriateness for respiratory tract diagnoses (RTDs) by season.
Design:
Retrospective cohort study.
Setting:
Primary care practices in a university health system.
Patients:
Patients who were seen at an office visit with a diagnostic code for an RTD.
Methods:
Office visits for the entire cohort were categorized based on ICD-10 codes by the likelihood that an antibiotic was indicated (tier 1: always indicated; tier 2: sometimes indicated; tier 3: rarely indicated). Medical records were reviewed for 1,200 randomly selected office visits to determine appropriateness. Based on this reference standard, metrics and prescriber characteristics associated with inappropriate antibiotic prescribing were determined. Characteristics of antibiotic prescribing were compared between winter and summer months.
Results:
A significantly greater proportion of RTD visits had an antibiotic prescribed in winter [20,558/51,090 (40.2%)] than in summer months [11,728/38,537 (30.4%)] [standardized difference (SD) = 0.21]. A significantly greater proportion of winter than summer visits was associated with tier 2 RTDs (29.4% vs 23.4%, SD = 0.14), and a smaller proportion with tier 3 RTDs (68.4% vs 74.4%, SD = 0.13). A greater proportion of visits in winter than in summer months had an antibiotic prescribed for tier 2 RTDs (80.2% vs 74.2%, SD = 0.14) and tier 3 RTDs (22.9% vs 16.2%, SD = 0.17). The proportion of inappropriate antibiotic prescribing was higher in winter than in summer months (72.4% vs 62.0%, P < .01).
Conclusions:
Increases in antibiotic prescribing for RTD visits from summer to winter were likely driven by shifts in diagnoses as well as increases in prescribing for certain diagnoses. At least some of this increased prescribing was inappropriate.
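The standardized differences reported above follow the usual two-proportion formula; a minimal sketch using the winter and summer counts from the results reproduces the reported 0.21.

```python
# Standardized difference for two proportions:
# (p1 - p2) / sqrt((p1*(1-p1) + p2*(1-p2)) / 2)
from math import sqrt

def std_diff(p1: float, p2: float) -> float:
    return (p1 - p2) / sqrt((p1 * (1 - p1) + p2 * (1 - p2)) / 2)

winter = 20558 / 51090  # 40.2% of winter RTD visits with an antibiotic
summer = 11728 / 38537  # 30.4% of summer RTD visits
print(round(std_diff(winter, summer), 2))  # -> 0.21, as reported
```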
For 147 hospital-onset bloodstream infections, we assessed the sensitivity, specificity, positive predictive value, and negative predictive value of the National Healthcare Safety Network surveillance definitions of central-line–associated bloodstream infections against the gold standard of physician review, examining the drivers of discrepancies and related implications for reporting and infection prevention.
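For readers unfamiliar with the four measures named above, a minimal sketch computes them from a hypothetical 2×2 table (NHSN CLABSI determination vs physician review as the gold standard); the counts are illustrative only.

```python
# Test-performance measures from a hypothetical 2x2 table.
def test_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    return {
        "sensitivity": tp / (tp + fn),  # true CLABSIs flagged by NHSN
        "specificity": tn / (tn + fp),  # non-CLABSIs correctly excluded
        "ppv": tp / (tp + fp),          # NHSN-positive calls that are true
        "npv": tn / (tn + fn),          # NHSN-negative calls that are true
    }

# Hypothetical counts for the 147 reviewed bloodstream infections
print(test_metrics(tp=30, fp=10, fn=7, tn=100))
```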
Objective:
To estimate the association between in situ steroids and spine surgical-site infections (SSIs), assessing spinal instrumentation as an effect modifier and adjusting for confounders.
Design:
Case–control study.
Setting:
Rural academic medical center.
Participants:
We identified 1,058 adults who underwent posterior fusion and laminectomy procedures, as defined by the National Healthcare Safety Network, without a pre-existing SSI between January 2020 and December 2021. We identified 26 SSIs as cases and randomly selected 104 controls from the remaining patients without SSI.
Methods:
The primary exposure was the intraoperative administration of methylprednisolone in situ (ie, either in the wound bed or as an epidural injection). The primary outcome was a clinical diagnosis of SSI within 6 months of a patient’s first spine surgery at our facility. We quantified the association between the exposure and outcome using logistic regression, using a product term to assess for effect modification by spinal instrumentation and the change-in-estimate approach to select significant confounders.
Results:
Adjusting for Charlson comorbidity index and malignancy, in situ steroids were significantly associated with spine SSI relative to no in situ steroids for instrumented procedures (adjusted odds ratio [aOR], 9.93; 95% confidence interval [CI], 1.54–64.0), but they were not associated with spine SSIs among noninstrumented procedures (aOR, 0.86; 95% CI, 0.15–4.93).
Conclusions:
In situ steroids were significantly associated with spine SSI among instrumented procedures. The benefits of in situ steroids for pain management following spine surgery should be weighed against the risk of SSI, especially for instrumented procedures.
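The effect modification described above means the steroid–SSI association differs by instrumentation stratum. A minimal sketch of stratum-specific odds ratios with hypothetical 2×2 counts (chosen only to echo the reported pattern; the study itself used logistic regression with a product term and confounder adjustment):

```python
# Crude odds ratio within each instrumentation stratum; counts hypothetical.
def odds_ratio(exp_cases: int, exp_ctrls: int,
               unexp_cases: int, unexp_ctrls: int) -> float:
    return (exp_cases * unexp_ctrls) / (exp_ctrls * unexp_cases)

print(odds_ratio(12, 10, 4, 33))  # instrumented: OR ~9.9
print(odds_ratio(3, 25, 5, 36))   # noninstrumented: OR ~0.86
```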
In this survey of 41 hospitals, 18 (72%) of 25 respondents reporting utilization of National Healthcare Safety Network resources demonstrated accurate central-line–associated bloodstream infection reporting compared to 6 (38%) of 16 without utilization (adjusted odds ratio, 5.37; 95% confidence interval, 1.16–24.8). Adherence to standard definitions is essential for consistent reporting across healthcare facilities.
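The counts in the sentence above are sufficient to reproduce the crude association (the published 5.37 is adjusted); a minimal sketch using SciPy’s Fisher exact test:

```python
from scipy.stats import fisher_exact

# Accurate CLABSI reporting by NHSN-resource utilization (from the text)
table = [[18, 25 - 18],  # resources used: accurate vs not
         [6, 16 - 6]]    # not used: accurate vs not
oddsratio, pvalue = fisher_exact(table)
print(round(oddsratio, 2), round(pvalue, 3))  # crude OR ~4.29
```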
Implementation science offers a compelling value proposition to translational science. As such, many translational science stakeholders are seeking to recruit, teach, and train an implementation science workforce. The type of workforce that will make implementation happen consists of both implementation researchers and practitioners, yet little guidance exists on how to train such a workforce. We—members of the Advancing Dissemination and Implementation Sciences in CTSAs Working Group—present the Teaching For Implementation Framework to address this gap. We describe the differences between implementation researchers and practitioners and demonstrate what and how to teach them individually and in co-learning opportunities. We briefly comment on educational infrastructures and resources that will be helpful in furthering this type of approach.
Objective:
To determine metrics and provider characteristics associated with inappropriate antibiotic prescribing for respiratory tract diagnoses (RTDs).
Design:
Retrospective cohort study.
Setting:
Primary care practices in a university health system.
Participants:
Patients seen by an attending physician or advanced practice provider (APP) at their primary care office visit with International Classification of Disease, Tenth Revision, Clinical Modification (ICD-10-CM)–coded RTDs.
Methods:
Medical records were reviewed for 1,200 randomly selected office visits in which an antibiotic was prescribed to determine appropriateness. Based on this gold standard, metrics and provider characteristics associated with inappropriate antibiotic prescribing were determined.
Results:
Overall, 69% of antibiotics were inappropriate. Metrics utilizing prespecified RTDs most strongly associated with inappropriate prescribing were (1) proportion prescribing for RTDs for which antibiotics are almost never required (eg, bronchitis) and (2) proportion prescribing for any RTD. Provider characteristics associated with inappropriate antibiotic prescribing were APP versus physician (72% vs 58%; P = .02), family medicine versus internal medicine (76% vs 63%; P = .01), board certification 1997 or later versus board certification before 1997 (75% vs 63%; P = .02), nonteaching versus teaching practice (73% vs 51%; P < .01), and nonurban vs urban practice (77% vs 57%; P < .01).
Conclusions:
Metrics utilizing proportion prescribing for RTDs for which antibiotics are almost never required and proportion prescribing for any RTD were most strongly associated with inappropriate prescribing. APPs and clinicians with family medicine training, with board certification 1997 or later, and who worked in nonteaching or nonurban practices had higher proportions of inappropriate prescribing. These findings could inform design of interventions to improve prescribing and could represent an efficient way to track inappropriate prescribing.
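Comparisons of prescribing proportions like those above are typically tested with a chi-square test on the underlying counts; a minimal sketch with hypothetical denominators (the abstract reports only percentages and P values):

```python
from scipy.stats import chi2_contingency

# Hypothetical counts: inappropriate vs appropriate prescriptions
app = [144, 56]         # APPs: 72% of 200 prescriptions inappropriate
physician = [290, 210]  # physicians: 58% of 500 prescriptions
chi2, p, dof, expected = chi2_contingency([app, physician])
print(round(p, 4))
```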
The first quarto of Hamlet is radically different from the second quarto and Folio versions of the play, and about half their length. But despite its uneven verbal texture and simpler characterisation, the first quarto presents its own workable alternatives to the longer texts, reordering and combining key plot elements, and even including a unique scene between Horatio and the Queen. This new critical edition is designed for students, scholars and actors who are intrigued by the first printed text of Shakespeare's Hamlet. Although the first quarto has been reprinted many times, there is no other modernised edition in print. Irace's introduction outlines views of its origins, its special features, and its surprisingly rich performance history; her textual notes point out differences between the first quarto and the longer second quarto and Folio versions and offer alternatives which actors or directors might choose for specific productions.
OBJECTIVE
To determine the impact of total household decolonization with intranasal mupirocin and chlorhexidine gluconate body wash on recurrent methicillin-resistant Staphylococcus aureus (MRSA) infection among subjects with MRSA skin and soft-tissue infection.
DESIGN
Three-arm nonmasked randomized controlled trial.
SETTING
Five academic medical centers in Southeastern Pennsylvania.
PARTICIPANTS
Adults and children presenting to ambulatory care settings with community-onset MRSA skin and soft-tissue infection (ie, index cases) and their household members.
INTERVENTION
Enrolled households were randomized to 1 of 3 intervention groups: (1) education on routine hygiene measures, (2) education plus decolonization without reminders (intranasal mupirocin ointment twice daily for 7 days and chlorhexidine gluconate on the first and last day), or (3) education plus decolonization with reminders, where subjects received daily telephone call or text message reminders.
MAIN OUTCOME MEASURES
Owing to small numbers of recurrent infections, this analysis focused on time to clearance of colonization in the index case.
RESULTS
Of 223 households, 73 were randomized to education only, 76 to decolonization without reminders, and 74 to decolonization with reminders. There was no significant difference in time to clearance of colonization between the education-only and decolonization groups (log-rank P=.768). In secondary analyses, compliance with decolonization was associated with decreased time to clearance (P=.018).
CONCLUSIONS
Total household decolonization did not result in decreased time to clearance of MRSA colonization among adults and children with MRSA skin and soft-tissue infection. However, subjects who were compliant with the protocol had more rapid clearance of colonization.
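The log-rank comparison above can be sketched with the lifelines package; the durations and event indicators below are synthetic stand-ins for time to clearance in each arm, not study data.

```python
import numpy as np
from lifelines.statistics import logrank_test

rng = np.random.default_rng(0)
education = rng.exponential(scale=60, size=73)     # days to clearance
decolonized = rng.exponential(scale=55, size=150)  # pooled decolonization arms
result = logrank_test(education, decolonized,
                      event_observed_A=np.ones_like(education),
                      event_observed_B=np.ones_like(decolonized))
print(round(result.p_value, 3))
```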
OBJECTIVE
To identify risk factors for recurrent methicillin-resistant Staphylococcus aureus (MRSA) colonization.
DESIGN
Prospective cohort study conducted from January 1, 2010, through December 31, 2012.
SETTING
Five adult and pediatric academic medical centers.
PARTICIPANTS
Subjects (ie, index cases) who presented with acute community-onset MRSA skin and soft-tissue infection.
METHODS
Index cases and all household members performed self-sampling for MRSA colonization every 2 weeks for 6 months. Clearance of colonization was defined as 2 consecutive sampling periods with negative surveillance cultures. Recurrent colonization was defined as any positive MRSA surveillance culture after clearance. Index cases with recurrent MRSA colonization were compared with those without recurrence on the basis of antibiotic exposure, household demographic characteristics, and presence of MRSA colonization in household members.
RESULTS
The study cohort comprised 195 index cases; recurrent MRSA colonization occurred in 85 (43.6%). Median time to recurrence was 53 days (interquartile range, 36–84 days). Treatment with clindamycin was associated with a lower risk of recurrence (odds ratio, 0.52; 95% CI, 0.29–0.93). A higher percentage of household members younger than 18 years was associated with an increased risk of recurrence (odds ratio, 1.01; 95% CI, 1.00–1.02). The association between MRSA colonization in household members and recurrent colonization in index cases did not reach statistical significance in primary analyses.
CONCLUSION
A large proportion of patients initially presenting with MRSA skin and soft-tissue infection will have recurrent colonization after clearance. The reduced rate of recurrent colonization associated with clindamycin may indicate a unique role for this antibiotic in the treatment of such infection.
Infect. Control Hosp. Epidemiol. 2015;36(7):786–793
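Odds ratios with 95% CIs of the form reported above are commonly computed on the log scale (Woolf method); a minimal sketch with hypothetical 2×2 counts:

```python
from math import exp, log, sqrt

def or_with_ci(a: int, b: int, c: int, d: int):
    """a/b: recurrence yes/no among the exposed; c/d: among the unexposed."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return or_, exp(log(or_) - 1.96 * se), exp(log(or_) + 1.96 * se)

print(or_with_ci(22, 48, 63, 62))  # hypothetical counts -> OR ~0.45
```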
Integrative, multilevel approaches investigating neurobiological systems relevant to threat detection promise to advance understanding of the pathophysiology of major depressive disorder (MDD). In this study, we considered key neuronal and hormonal systems in adolescents with MDD and healthy controls (HC). The goals of this study were to identify group differences and to examine the association of neuronal and hormonal systems. MDD and HC adolescents (N = 79) aged 12–19 years were enrolled. Key brain measures included amygdala volume and amygdala activation to an emotion face-viewing task. Key hormone measures included cortisol levels during a social stress task and during the brain scan. MDD and HC adolescents showed group differences in amygdala functioning and patterns of cortisol levels. Amygdala activation in response to emotional stimuli was positively associated with cortisol responses. In addition, amygdala volume was correlated with cortisol responses, but the pattern differed in depressed versus healthy adolescents, most notably for unmedicated MDD adolescents. The findings highlight the value of multilevel assessment strategies for enhancing understanding of the pathophysiology of adolescent MDD, particularly regarding how closely related biological threat systems function together while undergoing significant developmental shifts.
Objective.
To develop and field test an implementation assessment tool for assessing the progress of hospital units in implementing improvements for the prevention of ventilator-associated pneumonia (VAP) in a two-state collaborative, including data on actions implemented by participating teams and contextual factors that may influence their efforts. Using the data collected, we aimed to learn how implementation actions could be improved and to analyze effects of implementation progress on outcome measures.
Design.
We developed the tool as an interview protocol that included quantitative and qualitative items addressing Comprehensive Unit-based Safety Program (CUSP) actions and clinical interventions, for use in guiding data collection via telephone interviews.
Setting.
We conducted interviews with leaders of improvement teams from units participating in the two-state VAP prevention initiative.
Methods.
We collected data from 43 hospital units as they implemented actions for the VAP initiative and performed descriptive analyses of the data, with comparisons across the 2 states.
Results.
Early in the VAP prevention initiative, most units had made only moderate progress overall in using many of the CUSP actions known to support their improvement processes. For contextual factors, a relatively small number of barriers were found to have important negative effects on implementation progress (in particular, barriers related to workload and time issues). We modified coaching provided to the unit teams to reinforce training in weak spots that the interviews identified.
Conclusion.
These assessments provided important new knowledge regarding the implementation science of quality improvement projects, including feedback during implementation, and gave a better understanding of which factors most affect implementation.