Incorporating paleontological data into phylogenetic inference can greatly enrich our understanding of evolutionary relationships by providing insights into the diversity and morphological evolution of a clade over geological timescales. Phylogenetic analysis of fossil data has been significantly aided by the introduction of the fossilized birth–death (FBD) process, a model that accounts for fossil sampling through time. A decade on from the first implementation of the FBD model, we explore its use in more than 170 empirical studies, summarizing insights gained through its application. We identify a number of challenges in applying the model in practice: it requires a working knowledge of paleontological data and their complex properties, Bayesian phylogenetics, and the mechanics of evolutionary models. To address some of these difficulties, we provide an introduction to the Bayesian phylogenetic framework, discuss important aspects of paleontological data, and finally describe the assumptions of the models used in paleobiology. We also present a number of exemplar empirical studies that have used the FBD model in different ways. Through this review, we aim to provide clarity on how paleontological data can best be used in phylogenetic inference. We hope to encourage communication between model developers and empirical researchers, with the ultimate goal of developing models that better reflect the data we have and the processes that generated them.
Background: Efgartigimod, a human immunoglobulin G (IgG)1 antibody Fc fragment, blocks the neonatal Fc receptor, decreasing IgG recycling and reducing pathogenic IgG autoantibody levels. ADHERE assessed the efficacy and safety of efgartigimod PH20 subcutaneous (SC; co-formulated with recombinant human hyaluronidase PH20) in chronic inflammatory demyelinating polyneuropathy (CIDP). Methods: ADHERE enrolled participants with CIDP (treatment naive or on standard treatments withdrawn during run-in period) and consisted of open-label Stage A (efgartigimod PH20 SC once weekly [QW]), and randomized (1:1) Stage B (efgartigimod or placebo QW). Primary outcomes were clinical improvement (assessed with aINCAT, I-RODS, or mean grip strength; Stage A) and time to first aINCAT score deterioration (relapse; Stage B). Secondary outcomes included treatment-emergent adverse events (TEAEs) incidence. Results: 322 participants entered Stage A. 214 (66.5%) were considered responders, randomized, and treated in Stage B. Efgartigimod significantly reduced the risk of relapse (HR: 0.394; 95% CI: 0.25–0.61) versus placebo (p=0.000039). Reduced risk of relapse occurred in participants receiving corticosteroids, intravenous or SC immunoglobulin, or no treatment before study entry. Most TEAEs were mild to moderate; 3 deaths occurred, none related to efgartigimod. Conclusions: Participants treated with efgartigimod PH20 SC maintained a clinical response and remained relapse-free longer than those treated with placebo.
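The relapse-risk comparison above is reported as a hazard ratio (HR 0.394). Below is a minimal, hypothetical sketch of how such a time-to-relapse hazard ratio is typically estimated with a Cox proportional-hazards model; the DataFrame, column names, and values are illustrative assumptions, not the ADHERE data or analysis.

```python
# Hypothetical sketch of a Cox proportional-hazards fit for time to relapse;
# the data and column names below are illustrative, not the ADHERE dataset.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "weeks_to_relapse": [4, 12, 30, 8, 24, 48, 6, 36, 10, 20],  # time to first aINCAT deterioration or censoring
    "relapsed":         [1, 1, 0, 1, 0, 0, 1, 0, 1, 1],          # 1 = relapse observed, 0 = censored
    "efgartigimod":     [1, 1, 1, 0, 1, 1, 0, 0, 0, 0],          # 1 = efgartigimod arm, 0 = placebo
})

cph = CoxPHFitter()
cph.fit(df, duration_col="weeks_to_relapse", event_col="relapsed")
cph.print_summary()  # exp(coef) for "efgartigimod" is the hazard ratio

# A hazard ratio of 0.394, as reported, corresponds to a roughly 61% lower
# instantaneous risk of relapse versus placebo (1 - 0.394 = 0.606).
```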
Evidence from previous research suggests that frame-of-reference (FOR) training is effective at improving assessor ratings in many organizational settings. Yet no research has presented a thorough examination of systematic sources of variance (assessor-related effects, evaluation settings, and measurement design features) that might influence training effectiveness. Using a factorial ANOVA and variance components analyses on a database of four studies of frame-of-reference assessor training, we found that (a) training is most effective at identifying low levels of performance and (b) the setting of the training makes little difference with respect to training effectiveness. We also show evidence of the importance of rater training as a key determinant of the quality of performance ratings in general. Implications for FOR training theory and practice are discussed.
The U.S. Department of Agriculture–Agricultural Research Service (USDA-ARS) has been a leader in weed science research covering topics ranging from the development and use of integrated weed management (IWM) tactics to basic mechanistic studies, including biotic resistance of desirable plant communities and herbicide resistance. ARS weed scientists have worked in agricultural and natural ecosystems, including agronomic and horticultural crops, pastures, forests, wild lands, aquatic habitats, wetlands, and riparian areas. Through strong partnerships with academia, state agencies, private industry, and numerous federal programs, ARS weed scientists have made contributions to discoveries in the newest fields of robotics and genetics, as well as the traditional and fundamental subjects of weed–crop competition and physiology and integration of weed control tactics and practices. Weed science at ARS is often overshadowed by other research topics; thus, few are aware of the long history of ARS weed science and its important contributions. This review is the result of a symposium held at the Weed Science Society of America’s 62nd Annual Meeting in 2022 that included 10 separate presentations in a virtual Weed Science Webinar Series. The overarching themes of management tactics (IWM, biological control, and automation), basic mechanisms (competition, invasive plant genetics, and herbicide resistance), and ecosystem impacts (invasive plant spread, climate change, conservation, and restoration) represent core ARS weed science research that is dynamic and efficacious and has been a significant component of the agency’s national and international efforts. This review highlights current studies and future directions that exemplify the science and collaborative relationships both within and outside ARS. Given the constraints of weeds and invasive plants on all aspects of food, feed, and fiber systems, there is an acknowledged need to face new challenges, including agriculture and natural resources sustainability, economic resilience and reliability, and societal health and well-being.
This study examined struggles to establish autonomy and relatedness with peers in adolescence and early adulthood as predictors of advanced epigenetic aging assessed at age 30. Participants (N = 154; 67 male and 87 female) were observed repeatedly, along with close friends and romantic partners, from ages 13 through 29. Observed difficulty establishing close friendships characterized by mutual autonomy and relatedness from ages 13 to 18, an interview-assessed attachment state of mind lacking autonomy and valuing of attachment at 24, and self-reported difficulties in social integration across adolescence and adulthood were all linked to greater epigenetic age at 30, after accounting for chronological age, gender, race, and income. Analyses assessing the unique and combined effects of these factors, along with lifetime history of cigarette smoking, indicated that each of these factors, except for adult social integration, contributed uniquely to explaining epigenetic age acceleration. Results are interpreted as evidence that the adolescent preoccupation with peer relationships may be highly functional given the relevance of such relationships to long-term physical outcomes.
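The phrase "greater epigenetic age at 30, after accounting for chronological age" refers to what is commonly operationalized as epigenetic age acceleration: the residual from regressing clock-estimated epigenetic age on chronological age. The sketch below illustrates that convention on made-up data; it is not this study's covariate set or analysis.

```python
# Hedged sketch: "epigenetic age acceleration" is often taken as the residual
# from regressing clock-estimated epigenetic age on chronological age.
# The data below are simulated; they are not this study's measurements or covariates.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 154
chron_age = 30.0 + rng.normal(0, 0.5, n)       # participants assessed around age 30
epi_age = chron_age + rng.normal(0, 3.0, n)    # hypothetical epigenetic-clock estimates

df = pd.DataFrame({"epi_age": epi_age, "chron_age": chron_age})
df["age_acceleration"] = smf.ols("epi_age ~ chron_age", data=df).fit().resid

# Positive residuals indicate being epigenetically "older" than expected for one's
# chronological age; these values would then be modeled against relationship predictors.
print(df["age_acceleration"].describe())
```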
Introduction. While many individuals quit smoking during pregnancy, most relapse within one year postpartum. Research into methods to decrease smoking relapse postpartum has been hampered by difficulties with recruitment. Method. We conducted individual interviews with pregnant women (N = 22) who were interested in quitting smoking while pregnant about their attitudes regarding smoking and quitting during pregnancy, clinical trial participation, and smoking cessation medication use. Results. Participants were aware of the risks of smoking while pregnant. Many wanted to quit smoking before delivery. Few used empirically supported treatments to quit. While research was viewed positively, interest in taking on new commitments postpartum and taking a medication to prevent relapse was low. Medication concerns were evident among most participants, especially among those planning to breastfeed. Further, several women noted medication was unnecessary, as they did not believe they would relapse postpartum. Financial incentives, childcare, and fewer and/or remote visits were identified as facilitators to participating in research. However, these factors did not outweigh women’s concerns about medication use and time commitments. Conclusions. Women are aware that quitting smoking during pregnancy and remaining smoke-free postpartum are important. However, beliefs that personal relapse risk is low and that medications are dangerous reduced enthusiasm for taking medication for postpartum relapse prevention. Future medication trials should educate women about the high likelihood of relapse, prepare to answer detailed questions about risks of cessation medications, and connect with participants’ clinicians. For new mothers, studies conducted remotely with few scheduled appointments would reduce barriers to participation.
Performance characteristics of SARS-CoV-2 nucleic acid detection assays are understudied within contexts of low pre-test probability, including screening asymptomatic persons without epidemiological links to confirmed cases, or asymptomatic surveillance testing. SARS-CoV-2 detection without symptoms may represent presymptomatic or asymptomatic infection, resolved infection with persistent RNA shedding, or a false-positive test. This study assessed the positive predictive value of SARS-CoV-2 real-time reverse transcription polymerase chain reaction (rRT-PCR) assays by retesting positive specimens from 5 pre-test probability groups ranging from high to low with an alternate assay.
Methods:
In total, 122 rRT-PCR positive specimens collected from unique patients between March and July 2020 were retested using a laboratory-developed nested RT-PCR assay targeting the RNA-dependent RNA polymerase (RdRp) gene followed by Sanger sequencing.
Results:
Significantly fewer (15.6%) positive results in the lowest pre-test probability group (facilities with institution-wide screening having ≤3 positive asymptomatic cases) were reproduced with the nested RdRp gene RT-PCR assay than in each of the 4 groups with higher pre-test probability (individual group range, 50.0%–85.0%).
Conclusions:
Large-scale SARS-CoV-2 screening testing initiatives among low pre-test probability populations should be evaluated thoroughly prior to implementation given the risk of false-positive results and consequent potential for harm at the individual and population level.
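The conclusion above rests on how positive predictive value (PPV) falls with pre-test probability. A short sketch of the standard Bayes relation makes this explicit; the sensitivity and specificity values are assumed for illustration, not estimates from this study.

```python
# Illustrative sketch of why PPV falls at low pre-test probability.
# Sensitivity and specificity below are assumptions for illustration,
# not estimates from this study.
def ppv(prevalence: float, sensitivity: float, specificity: float) -> float:
    """Positive predictive value via Bayes' theorem."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

for prev in (0.10, 0.01, 0.001):  # high to very low pre-test probability
    print(f"pre-test probability {prev:.3f}: PPV = {ppv(prev, 0.95, 0.999):.2f}")
# Even with 99.9% specificity, PPV drops sharply as pre-test probability approaches zero,
# which is why positives from low-prevalence screening warrant confirmatory testing.
```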
Substantial progress has been made in the standardization of nomenclature for paediatric and congenital cardiac care. In 1936, Maude Abbott published her Atlas of Congenital Cardiac Disease, which was the first formal attempt to classify congenital heart disease. The International Paediatric and Congenital Cardiac Code (IPCCC) is now utilized worldwide and has most recently become the paediatric and congenital cardiac component of the Eleventh Revision of the International Classification of Diseases (ICD-11). The most recent publication of the IPCCC was in 2017. This manuscript provides an updated 2021 version of the IPCCC.
The International Society for Nomenclature of Paediatric and Congenital Heart Disease (ISNPCHD), in collaboration with the World Health Organization (WHO), developed the paediatric and congenital cardiac nomenclature that is now within the eleventh version of the International Classification of Diseases (ICD-11). This unification of IPCCC and ICD-11 is the IPCCC ICD-11 Nomenclature and is the first time that the clinical nomenclature for paediatric and congenital cardiac care and the administrative nomenclature for paediatric and congenital cardiac care are harmonized. The resultant congenital cardiac component of ICD-11 was increased from 29 congenital cardiac codes in ICD-9 and 73 congenital cardiac codes in ICD-10 to 318 codes submitted by ISNPCHD through 2018 for incorporation into ICD-11. After these 318 terms were incorporated into ICD-11 in 2018, the WHO ICD-11 team added an additional 49 terms, some of which are acceptable legacy terms from ICD-10, while others provide greater granularity than the ISNPCHD thought was originally acceptable. Thus, the total number of paediatric and congenital cardiac terms in ICD-11 is 367. In this manuscript, we describe and review the terminology, hierarchy, and definitions of the IPCCC ICD-11 Nomenclature. This article, therefore, presents a global system of nomenclature for paediatric and congenital cardiac care that unifies clinical and administrative nomenclature.
The members of ISNPCHD realize that the nomenclature published in this manuscript will continue to evolve. The version of the IPCCC that was published in 2017 has evolved and changed, and it is now replaced by this 2021 version. In the future, ISNPCHD will again publish updated versions of IPCCC, as IPCCC continues to evolve.
Colleges and universities around the world engaged diverse strategies during the COVID-19 pandemic. Baylor University, a community of ~22,700 individuals, was one of the institutions that resumed and sustained operations. The key strategy was establishment of multidisciplinary teams to develop mitigation strategies and priority areas for action. This population-based team approach, along with implementation of a “Swiss Cheese” risk mitigation model, allowed small clusters to be rapidly addressed through testing, surveillance, tracing, isolation, and quarantine. These efforts were supported by health protocols including face coverings, social distancing, and compliance monitoring. As a result, activities were sustained from August 1 to December 8, 2020. There were 62,970 COVID-19 tests conducted, with 1435 people testing positive, for a positivity rate of 2.28%. A total of 1670 COVID-19 cases were identified, 235 of them through self-reports. The mean number of tests per week was 3500, with approximately 80 of these positive (11/d). More than 60 student tracers were trained, with over 120 personnel available to contact trace, at a ratio of 1 per 400 university members. The successes and lessons learned provide a framework and pathway for similar institutions to mitigate the ongoing impacts of COVID-19 and sustain operations during a global pandemic.
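The headline figures above follow directly from the reported counts. Below is a quick sketch reproducing the arithmetic; the counts are taken from the abstract, and the week count is an approximation for August 1 to December 8, 2020.

```python
# Quick arithmetic check of the reported figures (counts taken from the abstract;
# the week count is an approximation for August 1 - December 8, 2020).
total_tests = 62_970
positive_tests = 1_435
self_reported_cases = 235
weeks = 18

print(f"positivity rate: {positive_tests / total_tests:.2%}")          # ~2.28%
print(f"total cases: {positive_tests + self_reported_cases}")          # 1670
print(f"tests per week: {total_tests / weeks:.0f}")                    # ~3500
print(f"positives per week: {positive_tests / weeks:.0f} (~{positive_tests / weeks / 7:.0f}/day)")
```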
Energy deficit is common during prolonged periods of strenuous physical activity and limited sleep, but the extent to which appetite suppression contributes is unclear. The aim of this randomised crossover study was to determine the effects of energy balance on appetite and physiological mediators of appetite during a 72-h period of high physical activity energy expenditure (about 9·6 MJ/d (2300 kcal/d)) and limited sleep designed to simulate military operations (SUSOPS). Ten men consumed an energy-balanced diet while sedentary for 1 d (REST) followed by energy-balanced (BAL) and energy-deficient (DEF) controlled diets during SUSOPS. Appetite ratings, gastric emptying time (GET) and appetite-mediating hormone concentrations were measured. Energy balance was positive during BAL (18 (sd 20) %) and negative during DEF (–43 (sd 9) %). Relative to REST, hunger, desire to eat and prospective consumption ratings were all higher during DEF (26 (sd 40) %, 56 (sd 71) %, 28 (sd 34) %, respectively) and lower during BAL (–55 (sd 25) %, −52 (sd 27) %, −54 (sd 21) %, respectively; P_condition < 0·05). Fullness ratings did not differ from REST during DEF, but were 65 (sd 61) % higher during BAL (P_condition < 0·05). Regression analyses predicted hunger and prospective consumption would be reduced and fullness increased if energy balance was maintained during SUSOPS, and energy deficits of ≥25 % would be required to elicit increases in appetite. Between-condition differences in GET and appetite-mediating hormones identified slowed gastric emptying, increased anorexigenic hormone concentrations and decreased fasting acylated ghrelin concentrations as potential mechanisms of appetite suppression. Findings suggest that physiological responses that suppress appetite may deter energy balance from being achieved during prolonged periods of strenuous activity and limited sleep.
Clinical intuition suggests that personality disorders hinder the treatment of depression, but research findings are mixed. One reason for this might be the way in which current assessment measures conflate general aspects of personality disorders, such as overall severity, with specific aspects, such as stylistic tendencies. The goal of this study was to clarify the unique contributions of the general and specific aspects of personality disorders to depression outcomes.
Methods
Patients admitted to the Menninger Clinic, Houston, between 2012 and 2015 (N = 2352) were followed over a 6–8-week course of multimodal inpatient treatment. Personality disorder symptoms were assessed with the Structured Clinical Interview for Diagnostic and Statistical Manual of Mental Disorders, 4th edition Axis II Personality Screening Questionnaire at admission, and depression severity was assessed using the Patient Health Questionnaire-9 every fortnight. General and specific personality disorder factors estimated with a confirmatory bifactor model were used to predict latent growth curves of depression scores in a structural equation model.
Results
The general factor predicted higher initial depression scores but not different rates of change. By contrast, the specific borderline factor predicted slower rates of decline in depression scores, while the specific antisocial factor predicted a U-shaped pattern of change.
Conclusions
Personality disorder symptoms are best represented by a general factor that reflects overall personality disorder severity, and specific factors that reflect unique personality styles. The general factor predicts overall depression severity, while specific factors predict poorer prognosis, which may be masked in prior studies that do not separate the two.
Hookworms are some of the most widespread of the soil-transmitted helminths (STH), with an estimated 438.9 million people infected. Until relatively recently, Ancylostoma ceylanicum was regarded as a rare cause of hookworm infection in humans, with little public health relevance. However, recent advances in molecular diagnostics have revealed a much higher prevalence of this zoonotic hookworm than previously thought, particularly in Asia. This study examined the prevalence of STH and A. ceylanicum in the municipalities of Palapag and Laoang in the Philippines utilizing real-time polymerase chain reaction (PCR) on stool samples previously collected as part of a cross-sectional survey of schistosomiasis japonica. Prevalence of hookworm in humans was high, with 52.8% (n = 228/432) of individuals positive for any hookworm, 34.5% (n = 149/432) infected with Necator americanus, and 29.6% (n = 128/432) with Ancylostoma spp.; of these, 34 were PCR-positive for A. ceylanicum. Among the dogs sampled (n = 33), 12 were PCR-positive for A. ceylanicum. This is the first study to utilize molecular diagnostics to identify A. ceylanicum in the Philippines, with both humans and dogs infected. Control and elimination of this zoonotic hookworm will require a multifaceted approach including chemotherapy of humans, identification of animal reservoirs, improvements in health infrastructure, and health education to help prevent infection.
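The prevalence percentages above follow from the reported counts. The sketch below reproduces them and adds a normal-approximation 95% confidence interval for illustration; the intervals are not reported in the study.

```python
# Reproducing the reported prevalences from the counts in the abstract, with an
# illustrative normal-approximation 95% CI (the intervals are not from the study).
from math import sqrt

def prevalence_ci(positives: int, n: int, z: float = 1.96):
    p = positives / n
    se = sqrt(p * (1 - p) / n)
    return p, p - z * se, p + z * se

for label, k in (("any hookworm", 228), ("N. americanus", 149), ("Ancylostoma spp.", 128)):
    p, lo, hi = prevalence_ci(k, 432)
    print(f"{label}: {p:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```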
The otoliths (ear stones) of fishes are commonly used to describe the age and growth of marine and freshwater fishes. These non-skeletal structures are fortuitously useful because they are composed mostly of inorganic carbonate that is inert through the life of the fish. This conserved record functions like an environmental chronometer, and bomb-produced radiocarbon (14C)—a 14C signal created by atmospheric testing of thermonuclear devices—can be used as a time-specific marker in validating fish age. However, the hydrogeology of nearshore marine environments can complicate 14C levels, as was the case with gray snapper (Lutjanus griseus) along the Gulf of Mexico coast of Florida. Radiocarbon in these nearshore waters is influenced by freshwater input from the karst topography of the Upper Floridan Aquifer—estuarine waters that are 14C-depleted from surface and groundwater inputs. Some gray snapper likely recruited to this kind of environment where 14C levels were depleted in the earliest otolith growth, although age was validated for individuals that were not exposed to 14C-depleted waters to an age of at least 25 years, with support for a 30-year lifespan.
Although the Peritraumatic Distress Inventory (PDI) and Peritraumatic Dissociative Experiences Questionnaire (PDEQ) are both useful for identifying adults at risk of developing acute and chronic post-traumatic stress disorder (PTSD), they have not been validated in school-aged children. The present study aims to assess the psychometric properties of the PDI and PDEQ in a sample of French-speaking school children.
Methods
One hundred and thirty-three school-aged victims of road traffic accidents were consecutively enrolled into this study via the emergency room. Mean (SD) age was 11.7 (2.2) years, and 56.4% (n=75) were male. The 13-item self-report PDI (range 0-52) and the 10-item self-report PDEQ (range 10-50) were administered within one week of the accident. Symptoms of PTSD were assessed 1 and 6 months later using the 20-item self-report Child Post-Traumatic Stress Reaction Index (CPTS-RI) (range 0-80).
Results
Mean (SD) PDI and PDEQ scores were 19.1 (10.1) and 21.1 (7.6), respectively, while mean (SD) CPTS-RI scores at 1 and 6 months were 22.6 (12.4) and 20.6 (13.5), respectively. Cronbach's alpha coefficients were 0.80 and 0.77 for the PDI and PDEQ, respectively. The 1-month test-retest correlation coefficient (n=33) was 0.77 for both measures. The PDI demonstrated a 2-factor structure while the PDEQ displayed a 1-factor structure. As with adults, the two measures were inter-correlated (r=0.52) and correlated with subsequent PTSD symptoms (r=0.21-0.56; p < 0.05).
Conclusions
The PDI and PDEQ are reliable and valid in school-aged children, and predict PTSD symptoms.
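The reliability figures reported above (Cronbach's alpha of 0.80 and 0.77) use the standard alpha formula. Below is a minimal sketch of that computation on a made-up item-score matrix; the simulated data are not the PDI/PDEQ responses from this study.

```python
# Minimal sketch of Cronbach's alpha on a made-up item-score matrix
# (rows = respondents, columns = items); not the PDI/PDEQ data from this study.
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(1)
latent = rng.normal(size=(100, 1))                       # shared "distress" factor
items = latent + rng.normal(scale=2.0, size=(100, 13))   # 13 noisy, correlated items
print(f"alpha = {cronbach_alpha(items):.2f}")            # roughly 0.7-0.8 for this simulation
```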
It remains unknown whether peritraumatic reactions predict PTSD symptoms in younger populations. We aimed to prospectively investigate the power of self-reported peritraumatic distress and dissociation to predict the development of PTSD symptoms at 1 month in school-aged children.
Methods
A sample of 103 school-aged children (8-15 years old) admitted to an emergency department after a road traffic accident was consecutively enrolled. Peritraumatic distress was assessed using the Peritraumatic Distress Inventory (PDI; range 0-52), and peritraumatic dissociation was assessed using the Peritraumatic Dissociative Experiences Questionnaire (PDEQ; range 10-50). PTSD symptoms were measured at 1 month with both the child version of the Clinician-Administered PTSD Scale (CAPS-CA; range 0-136) and the Child Post-traumatic Stress Reaction Index (CPTS-RI; range 0-80).
Results
Mean (SD) participant age was 11.7 (2.2) years, and 53.4% (n=55) were male. At baseline, mean PDI and PDEQ scores were 21.4 (SD=7.8) and 19.2 (SD=10.2), respectively. At 1 month, mean self-reported (CPTS-RI) and interviewer-based (CAPS-CA) PTSD symptom scores were 23.2 (SD=12.1) and 19 (SD=16.9), respectively. According to the CAPS-CA, 5 children (4.9%) suffered from full PTSD. Bivariate analyses demonstrated a significant association between peritraumatic variables (PDI and PDEQ) and both CAPS-CA and CPTS-RI scores (r=0.22-0.57; all p < 0.05). However, in a multivariate analysis, the PDI was the only significant predictor of acute PTSD symptoms (Beta=0.33, p < 0.05).
Conclusion
As has been found in adults, peritraumatic distress is a robust predictor of who will develop PTSD symptoms among school-aged children.
Agitation is a common presentation among patients hospitalized for an acute exacerbation of schizophrenia. Rapid and effective control of agitation is an important early treatment goal.
Objectives:
The aim of this post-hoc analysis was to evaluate the efficacy of lurasidone in reducing agitation in patients with an acute exacerbation of schizophrenia.
Methods:
The analysis was performed on pooled data from 5 six-week, placebo-controlled trials in the subgroup of patients with an acute exacerbation of schizophrenia who met (n=773) or did not meet (n=754) criteria for agitation (PANSS-Excited Component [EC] score ≥14 at baseline; Citrome, J Clin Psych 2007;68:1876-1885). Patients were randomized to fixed once-daily doses of lurasidone (40-160 mg).
Results:
The mean baseline PANSS-EC scores were similar for lurasidone vs. placebo in the high (16.7 vs. 16.8) and low (10.9 vs. 10.7) agitation subgroups. In the high agitation subgroup, treatment with lurasidone (vs. placebo) was associated with significantly greater improvement in PANSS-EC scores at days 3/4 (-2.0 vs. -1.3; p<0.001) and day 7 (-2.6 vs. -1.8; p<0.001). At week 6 endpoint, improvement on lurasidone vs. placebo was greater in the high vs. low agitation groups on the PANSS-EC score (effect size, 0.43 vs. 0.31), and comparable on the PANSS total score (effect size, 0.57 vs. 0.58).
Conclusions:
In this pooled post-hoc analysis, treatment with lurasidone significantly reduced agitation by day 3/4 in patients hospitalized with an acute exacerbation of schizophrenia. The magnitude of improvement at week 6 was similar in both the high and low agitation groups.
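The week-6 comparisons above are expressed as effect sizes. Below is a minimal sketch of the usual between-group Cohen's d (mean difference divided by the pooled standard deviation); the simulated change scores are hypothetical, not the pooled trial data.

```python
# Minimal sketch of a between-group effect size (Cohen's d) from change scores;
# the simulated arrays are hypothetical, not the pooled trial data.
import numpy as np

def cohens_d(a: np.ndarray, b: np.ndarray) -> float:
    """Mean difference divided by the pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    return (a.mean() - b.mean()) / np.sqrt(pooled_var)

rng = np.random.default_rng(2)
lurasidone_change = rng.normal(-2.6, 3.0, size=200)  # hypothetical PANSS-EC change scores
placebo_change = rng.normal(-1.8, 3.0, size=200)
print(f"d = {cohens_d(placebo_change, lurasidone_change):.2f}")
```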
Basic self disturbances (BSD), including changes in the 'pre-reflexive' sense of self and the loss of first-person perspective, are characteristic of schizophrenia spectrum disorders and highly prevalent in subjects at 'ultra high risk' for psychosis (UHR). The current literature indicates that cortical midline structures (CMS) may be implicated in the neurobiological substrates of the 'basic self' in healthy controls.
Objectives
Neuroanatomical investigation of BSD in a UHR sample
Aims
To test the hypotheses: (i) UHR subjects have higher Examination of Anomalous Self Experience (EASE) scores compared with controls; (ii) UHR subjects have neuroanatomical alterations in the CMS compared with controls; (iii) within UHR subjects, EASE scores are directly related to structural CMS alterations.
Methods
32 UHR subjects (27 antipsychotic-naïve) and 17 healthy controls (HC) were assessed with the 57-item semi-structured EASE interview. Voxel-based morphometry (VBM) was conducted in the same subjects, with a priori regions of interest (ROIs) defined in the CMS (anterior/posterior cingulate and medial prefrontal cortex).
Results
Despite high variability in the UHR group, the overall EASE score was higher (t-test, p < 0.01; Cohen's d = 2.91) in the UHR group (mean=30.15, SD=16.46) than in the HC group (mean=1.79, SD=2.83). UHR subjects had gray matter reductions in the CMS compared with HC (p < 0.05, FWE-corrected). Across the whole sample, lower gray matter volume in the anterior cingulate was correlated with higher EASE scores (p < 0.05).
Conclusions
This study provides preliminary evidence that gray matter reductions in the CMS are correlated with BSD in UHR people.
Yukon Territory (YT) is a remote region in northern Canada with ongoing spread of tuberculosis (TB). To explore the utility of whole genome sequencing (WGS) for TB surveillance and monitoring in a setting with detailed contact tracing and interview data, we used a mixed-methods approach. Our analysis included all culture-confirmed cases in YT (2005–2014) and incorporated data from 24-locus Mycobacterial Interspersed Repetitive Units-Variable Number of Tandem Repeats (MIRU-VNTR) genotyping, WGS and contact tracing. We compared field-based (contact investigation (CI) data + MIRU-VNTR) and genomics-based (WGS + MIRU-VNTR + basic case data) investigations to identify the most likely source of each person's TB and assessed the knowledge, attitudes and practices of programme personnel around genotyping and genomics using online, multiple-choice surveys (n = 4) and an in-person group interview (n = 5). Field- and genomics-based approaches agreed for 26 of 32 (81%) cases on the likely location of TB acquisition. There was less agreement in the identification of specific source cases (13/22 or 59% of cases). Single-locus MIRU-VNTR variants and limited genetic diversity complicated the analysis. Qualitative data indicated that participants viewed genomic epidemiology as a useful tool to streamline investigations, particularly in differentiating latent TB reactivation from recent transmission. Based on this, genomic data could be used to enhance CIs, focus resources, target interventions and aid in TB programme evaluation.
To assess the impact of a newly developed Central-Line Insertion Site Assessment (CLISA) score on the incidence of local inflammation or infection for CLABSI prevention.
Design:
A pre- and postintervention, quasi-experimental quality improvement study.
Setting and participants:
Adult inpatients with central venous catheters (CVCs) hospitalized in an intensive care unit or oncology ward at a large academic medical center.
Methods:
We evaluated the impact of the CLISA score on the incidence of insertion-site inflammation or infection (CLISA score of 2 or 3) in the baseline period (June 2014–January 2015) and the intervention period (April 2015–October 2017) using interrupted time-series and generalized linear mixed-effects multivariable analyses. These analyses were run separately for days to line removal after identification of a CLISA score of 2 or 3. CLISA score interrater reliability and photo quiz results were evaluated.
Results:
Among 6,957 CVCs assessed 40,846 times, the percentage of lines with a CLISA score of 2 or 3 decreased by 78.2% from the baseline to the intervention period (from 22.0% to 4.7%), with a significant immediate decrease in the time-series analysis (P < .001). According to the multivariable regression, the intervention was associated with a lower percentage of lines with a CLISA score of 2 or 3, after adjusting for age, gender, CVC body location, and hospital unit (odds ratio, 0.15; 95% confidence interval, 0.06–0.34; P < .001). According to the multivariate regression, removal of lines with a CLISA score of 2 or 3 occurred 3.19 days faster after the intervention (P < .001). Also, line dwell time decreased 37.1%, from a mean of 14 days (standard deviation [SD], 10.6) to 8.8 days (SD, 9.0) (P < .001). Device utilization ratios decreased 9%, from 0.64 (SD, 0.08) to 0.58 (SD, 0.06) (P = .039).
Conclusions:
The CLISA score creates a common language for assessing line infection risk and successfully promotes high compliance with best practices in timely line removal.
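The Methods of the CLISA study above describe an interrupted time-series analysis across baseline and intervention periods. Below is a hedged sketch of a standard segmented-regression setup for such a design; the monthly data, variable names, and model are illustrative assumptions, not the study's actual analysis.

```python
# Hedged sketch of a segmented (interrupted time-series) regression; the monthly
# data, variable names, and model are illustrative, not the study's actual analysis.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
months = np.arange(24)                          # e.g., months 0-7 baseline, 8-23 intervention
post = (months >= 8).astype(int)                # indicator for the intervention period
time_since = np.where(post == 1, months - 8, 0)
rate = 22 - 0.2 * months - 12 * post - 0.3 * time_since + rng.normal(0, 1.5, 24)

df = pd.DataFrame({"rate": rate, "month": months, "post": post, "time_since": time_since})
# "post" captures the immediate level change; "time_since" captures the slope change.
fit = smf.ols("rate ~ month + post + time_since", data=df).fit()
print(fit.summary().tables[1])
```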
Consanguineous marriages potentially play an important role in the transmission of β-thalassaemia in many communities. This study aimed to determine the rate and socio-demographic associations of consanguineous marriages and to assess the influence on the prevalence of β-thalassaemia in Sri Lanka. Three marriage registrars from each district of Sri Lanka were randomly selected to prospectively collect data on all couples who registered their marriage during a 6-month period starting 1st July 2009. Separately, the parents of patients with β-thalassaemia were interviewed to identify consanguinity. A total of 5255 marriages were recorded from 22 districts. The average age at marriage was 27.3 (±6.1) years for males and 24.1 (±5.7) years for females. A majority (71%) of marriages were ‘love’ marriages, except in the Moor community where 84% were ‘arranged’ marriages. Overall, the national consanguinity rate was 7.4%. It was significantly higher among ethnic Tamils (22.4%) compared with Sinhalese (3.8%) and Moors (3.2%) (p < 0.001). Consanguinity rates were also higher in ‘arranged’ as opposed to ‘love’ marriages (11.7% vs 5.6%, p < 0.001). In patients with β-thalassaemia, the overall consanguinity rate was 14.5%; it was highest among Tamils (44%) and lowest among Sinhalese (12%). Parental consanguinity among patients with β-thalassaemia was double the national average. Although consanguinity is not the major factor in the transmission of the disease in the country, emphasis should be given to this significant practice when conducting β-thalassaemia prevention and awareness campaigns, especially in high-prevalence communities.