Hypertensive heart disease and hypertrophic cardiomyopathy both lead to left ventricular hypertrophy despite differing in aetiology. Elucidating the correct aetiology of the presenting hypertrophy can be challenging for clinicians, especially in patients with overlapping risk factors. Furthermore, drugs typically used to treat hypertensive heart disease may be contraindicated in hypertrophic cardiomyopathy, making the correct diagnosis imperative. In this review, we discuss characteristics of both hypertensive heart disease and hypertrophic cardiomyopathy that may enable clinicians to discriminate between the two as causes of left ventricular hypertrophy. We summarise the current literature, which is primarily focused on adult populations, on discriminative techniques available via diagnostic modalities such as electrocardiography, echocardiography, and cardiac MRI, noting strategies yet to be applied in paediatric populations. Finally, we review pharmacotherapy strategies for each disease with regard to pathophysiology.
Trace amine-associated receptor 1 (TAAR1) agonists offer a new approach, but there is uncertainty regarding their effects, exact mechanism of action and potential role in treating psychosis.
Aims
To evaluate the available evidence on TAAR1 agonists in psychosis, using triangulation of the output of living systematic reviews (LSRs) of animal and human studies, and provide recommendations for future research prioritisation.
Method
This study is part of GALENOS (Global Alliance for Living Evidence on aNxiety, depressiOn and pSychosis). In the triangulation process, a multidisciplinary group of experts, including those with lived experience, met and appraised the first co-produced living systematic reviews from GALENOS, on TAAR1 agonists.
Results
The animal data suggested a potential antipsychotic effect, as TAAR1 agonists reduced locomotor activity induced by pro-psychotic drug treatment. Human studies showed few differences for ulotaront and ralmitaront compared with placebo in improving overall symptoms in adults with acute schizophrenia (four studies, n = 1291 participants, standardised mean difference (SMD) 0.15, 95% CI −0.05 to 0.34). Large placebo responses were seen in ulotaront phase 3 trials. Ralmitaront was less efficacious than risperidone (one study, n = 156 participants, SMD = −0.53, 95% CI −0.86 to −0.20). The side-effect profile of TAAR1 agonists was favourable compared with existing antipsychotics. Priorities for future studies included (a) using different animal models of psychosis with greater translational validity; (b) animal and human studies with wider outcomes, including cognitive and affective symptoms; and (c) mechanistic studies and investigations of other potential applications, such as adjunctive treatments and long-term outcomes. Recommendations for future iterations of the LSRs included (a) meta-analysis of individual human participant data; (b) including studies that used different methodologies; and (c) assessing other disorders and symptoms.
Conclusions
This co-produced, international triangulation examined the available evidence and developed recommendations for future research and clinical applications for TAAR1 agonists in psychosis. Broader challenges included difficulties in assessing the risk of bias, reproducibility, translation and interpretability of animal models to clinical outcomes, and a lack of individual and clinical characteristics in the human data. The research will inform a separate, independent prioritisation process, led by lived experience experts, to prioritise directions for future research.
The March 2, 2022, United Nations Environment Assembly Resolution 5/14: “End plastic pollution: Toward an international legally binding instrument by 2024” provides an important path for addressing global plastic pollution, from monomer design and production through the value chain to the final fate of plastic products, including resource recovery. Of the goals set for this effort, simplifying the polymer and additive universe is among the most significant. One primary obstacle to resource recovery from plastic waste is polymer variability, which renders post-use plastic inherently waste-like. While simplification will not address microplastics and leaching of chemicals during use, these measures simplify the plastic universe and mitigate leakage, which is critical to ensuring circular plastic use. This study provides a pathway for simplifying formulations through the elimination of problematic additives, revealing paths toward simplifying and reducing the variability in polymers, waste streams and pollution, while preserving critical uses. This study focuses on phenolic antioxidants to support this concept; however, these principles can be applied to other additive classes. The results show extensive duplication of chemical species under different trade names and the appearance of only minor changes to species with the intention of evergreening patents for improved marketability.
Background: Nursing home (NH) residents are at high risk of COVID-19 from exposure to infected staff and other residents. Understanding SARS-CoV-2 viral RNA kinetics in residents and staff can guide testing, isolation, and return to work recommendations. We sought to determine the duration of antigen test and polymerase chain reaction (PCR) positivity in a cohort of NH residents and staff. Methods: We prospectively collected data on SARS-CoV-2 viral kinetics from April 2023 through November 2023. Staff and residents could enroll prospectively or upon a positive test (identified through routine clinical testing, screening, or outbreak response testing). Participating facilities performed routine clinical testing; asymptomatic testing of contacts was performed within 48 hours if an outbreak or known exposure occurred and upon (re-) admission. Enrolled participants who tested positive for SARS-CoV-2 were re-tested daily for 14 days with both nasal antigen and nasal PCR tests. All PCR tests were run by a central lab with the same assay. We conducted a Kaplan-Meier survival analysis on time to first negative test restricted to participants who initially tested positive (day zero) and had at least one test ≥10 days after initially testing positive with the same test type; a participant could contribute to both antigen and PCR survival curves. We compared survival curves for staff and residents using the log-rank test. Results: Twenty-four nursing homes in eight states participated; 587 participants (275 residents, 312 staff) enrolled in the evaluation; participants were tested only through routine clinical or outbreak response testing. Seventy-two participants tested antigen-positive; of these, 63 tested PCR-positive. Residents were antigen- and PCR-positive longer than staff (Figure 1), but this finding was statistically significant only for duration of PCR positivity (p=0.006).
Five days after the first positive test, 56% of 50 residents and 59% of 22 staff remained antigen-positive; 91% of 44 residents and 79% of 19 staff were PCR-positive. Ten days after the first positive test, 22% of 50 residents and 5% of 22 staff remained antigen-positive; 61% of 44 residents and 21% of 19 staff remained PCR-positive. Conclusions: Most NH residents and staff with SARS-CoV-2 remained antigen- or PCR-positive 5 days after the initial positive test; however, differences between staff and resident test positivity were noted at 10 days. These data can inform recommendations for testing, duration of NH resident isolation, and return to work guidance for staff. Additional viral culture data may strengthen these conclusions.
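The time-to-first-negative analysis described above rests on the Kaplan-Meier estimator. A minimal sketch follows; the cohort below is synthetic and purely illustrative (it is not the study's data), with day 14 as the end of follow-up.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimate of P(still test-positive) over time.

    times:  day of first negative test, or of the last test if the
            participant never tested negative within follow-up
    events: 1 if a negative test was observed, 0 if censored
    """
    pairs = sorted(zip(times, events))
    n_at_risk = len(pairs)
    surv = 1.0
    curve = [(0, 1.0)]
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        d = sum(1 for tt, e in pairs if tt == t and e == 1)  # negatives at t
        c = sum(1 for tt, e in pairs if tt == t)             # all leaving risk set at t
        if d > 0:
            surv *= 1 - d / n_at_risk
            curve.append((t, surv))
        n_at_risk -= c
        i += c
    return curve

# Synthetic cohort: days to first negative test; 0 = still positive at day 14
times  = [5, 6, 7, 8, 10, 12, 14, 14, 14, 14]
events = [1, 1, 1, 1, 1,  1,  0,  0,  0,  0]
curve = kaplan_meier(times, events)
```

With no censoring before the last event time, the curve reduces to the empirical proportion still positive; censored participants simply stop contributing to the risk set. The study's comparison of staff and resident curves would then apply a log-rank test to two such curves.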
Disclosure: Stefan Gravenstein: Received consulting and speaker fees from most vaccine manufacturers (Sanofi, Seqirus, Moderna, Merck, Janssen, Pfizer, Novavax, GSK), and has or expects to receive grant funding from several (Sanofi, Seqirus, Moderna, Pfizer, GSK). Lona Mody: NIH, VA, CDC, Kahn Foundation; Honoraria: UpToDate; Contracted Research: Nano-Vibronix
Social connection is associated with better health, including reduced risk of dementia. Personality traits are also linked to cognitive outcomes; neuroticism is associated with increased risk of dementia. Personality traits and social connection are also associated with each other. Taken together, evidence suggests the potential impacts of neuroticism and social connection on cognitive outcomes may be linked. However, very few studies have simultaneously examined the relationships between personality, social connection and health.
Research objective:
We tested the association between neuroticism and cognitive measures while exploring the potential mediating roles of aspects of social connection (loneliness and social isolation).
Method:
We conducted a cross-sectional study with a secondary analysis of the Canadian Longitudinal Study on Aging (CLSA) Comprehensive Cohort, a sample of Canadians aged 45 to 85 years at baseline. We used only self-reported data collected at the first follow-up, between 2015 and 2018 (n = 27,765). We used structural equation modelling to assess the association between neuroticism (exposure) and six cognitive measures (Rey Auditory Verbal Learning Test immediate recall and delayed recall, Animal Fluency Test, Mental Alternation Test, Controlled Oral Word Association Test and Stroop Test interference ratio), with direct and indirect effects (through social isolation and loneliness). We included age, education and hearing in the models and stratified all analyses by sex, females (n = 14,133) and males (n = 13,632).
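The study fits full structural equation models; the core mediation logic they decompose (a direct effect of the exposure on the outcome plus an indirect effect through a mediator) can be sketched with a minimal product-of-coefficients calculation. The data below are made-up numbers for illustration, not CLSA data, and this sketch handles a single mediator with no covariates.

```python
def slope(x, y):
    # Simple OLS slope of y on x
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx

def ols2(x1, x2, y):
    # Coefficients of y ~ x1 + x2 via the centered normal equations
    n = len(y)
    m1, m2, my = sum(x1) / n, sum(x2) / n, sum(y) / n
    s11 = sum((a - m1) ** 2 for a in x1)
    s22 = sum((a - m2) ** 2 for a in x2)
    s12 = sum((a - m1) * (b - m2) for a, b in zip(x1, x2))
    s1y = sum((a - m1) * (b - my) for a, b in zip(x1, y))
    s2y = sum((a - m2) * (b - my) for a, b in zip(x2, y))
    det = s11 * s22 - s12 ** 2
    return (s1y * s22 - s2y * s12) / det, (s2y * s11 - s1y * s12) / det

def mediation(x, m, y):
    """Product-of-coefficients mediation: X -> M -> Y.

    a = slope of M on X; b = slope of Y on M adjusting for X.
    Returns (indirect effect a*b, direct effect of X in Y ~ M + X).
    """
    a = slope(x, m)
    b_m, b_x = ols2(m, x, y)
    return a * b_m, b_x

# Illustrative data in which Y depends on X only through M
x = [0, 1, 2, 3, 4]
m = [0, 3, 4, 5, 8]
y = [0, 9, 12, 15, 24]
indirect, direct = mediation(x, m, y)
```

Here the outcome is an exact multiple of the mediator, so the direct effect comes out as zero and the entire association is indirect, which is the pattern the abstract describes as "consistent evidence of indirect effects."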
Preliminary results of the ongoing study:
We found positive, statistically significant associations between neuroticism and social isolation (p<0.05) and loneliness (p<0.05), for both males and females. We also found inverse, statistically significant associations between neuroticism and all cognitive measures (p<0.05), except the Stroop Test interference ratio. In these models, there was consistent evidence of indirect effects (through social isolation and loneliness) and, in some cases, evidence of direct effects. We found sex differences in the model results.
Conclusion:
Our findings suggest that the association between neuroticism and cognitive outcomes may be mediated by aspects of social connection and differ by sex. Understanding if and how modifiable risk factors mediate the association between personality and cognitive outcomes would help develop and target intervention strategies that improve social connection and brain health.
Depression has a well-established negative effect on cognitive functioning. Variations in the apolipoprotein E (APOE) and brain-derived neurotrophic factor (BDNF) genes likely contribute to this relationship. APOE4 and the BDNF Val66Met polymorphism are independently associated with late-life depression and cognitive dysfunction. The current study investigated the moderating effects of APOE4 and BDNFMet (i.e., the presence of the BDNF Val66Met polymorphism) on the relationship between depression and cognitive functioning in older adults.
Participants and Methods:
The sample included 103 older adults drawn from two clinical trials who were recruited from the VA Palo Alto Health Care System (VAPAHCS) and the Stanford/VA Alzheimer’s Disease Center. Depression was diagnosed using the Mini Neuropsychiatric Interview for the Diagnostic and Statistical Manual of Mental Disorders-IV (DSM-IV). The presence of APOE4 and BDNFMet alleles was dichotomized (i.e., yes/no) and determined from blood obtained via venipuncture. A comprehensive neuropsychological battery was used to assess attention (RAVLT Trial 1, WAIS-IV DSF), processing speed (TMTA, SDMT, Stroop Word, Stroop Color), working memory (WAIS-IV DSB, DSS), visuospatial functioning (JLO), language (VNT), memory (RAVLT Delayed Recall, WMS-IV Logical Memory II), and executive function (TMTB, Stroop Color-Word). Separate moderation analyses were conducted with depression as the predictor and APOE4 or BDNFMet status as the moderator using the SPSS PROCESS macro v4.0. Age was a covariate for models with processing speed, memory, language, and executive function as outcome variables.
Results:
Participants were largely male (93%) and White (75%). Ten percent met criteria for depression, 26% were APOE4 carriers, and 32% were BDNFMet carriers. The overall model examining depression, APOE4, and memory was significant (p < .01, R2 = .14). Depression was associated with lower memory performance (p < .05); however, APOE4 was not a significant moderator (p > .05). Similarly, the overall model examining depression, APOE4, and language was also significant (p < .05, R2 = .10). While the direct effects of depression and APOE4 on language were nonsignificant (p > .05), there was a significant two-way interaction between APOE4 and depression (p = .03). The overall model with depression, BDNFMet, and memory was significant (p < .001, R2 = .18). While neither depression nor BDNFMet had significant direct effects on memory (p > .05), a two-way interaction emerged between depression and BDNFMet (p = .05). Simple slopes analyses were used to further investigate significant interactions. Depression, APOE4, and BDNFMet did not significantly impact attention, processing speed, working memory, visuospatial functioning, or executive function, and no significant interactions were noted among variables. BDNFMet had no direct impact on language.
Conclusions:
APOE4 and BDNFMet were found to differentially moderate the relationship between depression and cognition. Specifically, APOE4 carriers with depression had worse language performance than those who were healthy, depressed only, or APOE4 carriers without depression. BDNFMet carriers with depression performed worse on measures of memory than those who were healthy, depressed only, or BDNFMet carriers without depression. The treatment of depression in APOE4 and BDNFMet carriers may reduce associated cognitive impairments. Limitations and future implications are also discussed.
Methamphetamine and cannabis are two widely used substances with possibly opposing effects on aspects of central nervous system functioning. Use of these substances is prevalent among people with HIV (PWH), though their combined effects on HIV-associated neurocognitive impairment (NCI) are unknown. Adverse effects of methamphetamine use on cognition are well documented. Cannabis may disturb cognition acutely, though its longer-term effects in PWH are not well understood. Our prior analysis of people without HIV (PWoH) found that cotemporaneous cannabis use was associated with better neurocognitive outcomes among methamphetamine users. The aim of this study was to assess how lifetime cannabis and methamphetamine use disorder relate to neurocognitive outcomes in PWH.
Participants and Methods:
HIV-positive participants (n=472) were on average 45.6±11.5 years of age, male (86.4%), White (60.6%), and educated 13.9±2.5 years. Most participants were on ART (81.9%) and virally suppressed (70%). Participants were stratified by lifetime methamphetamine (M-/M+) and cannabis (C-/C+) DSM-IV abuse/dependence disorder into four groups: M-C- (n=187), M-C+ (n=68), M+C- (n=82), and M+C+ (n=135), and completed a comprehensive neurobehavioral assessment. Demographically corrected T-scores and deficit scores were used for analyses. Group differences in global and domain NC performances (i.e., T-scores) were examined using multiple linear regression, holding constant covariates that were associated with study groups and/or cognition. Specifically, M+ participants displayed higher rates of Hepatitis C infection (p=.004), higher current depressive symptom scores (p<.001), and higher rates of detectable plasma HIV RNA (p=.014). Multiple logistic regression was used to test for group differences in probability of neurocognitive impairment (i.e., deficit scores>0.5), including the same covariates. Pooling data with a sample of HIV-negative participants (n=423), we used generalized linear mixed effect models to examine how neurocognitive performance and impairment profiles varied by methamphetamine and/or cannabis use group, HIV disease characteristics, and their interactions.
Results:
Compared to M+C+, M+C- performed worse on measures of executive functions (β=-3.17), learning (β=-3.95), memory (β=-5.58), and working memory (β=-4.05) and were more likely to be classified as impaired in the learning (OR=2.93), memory (OR=5.24), and working memory (OR=2.48) domains. M-C- performed better than M+C+ on measures of learning (β=3.46) and memory (β=5.19), but worse than M-C+ on measures of executive functions (β=-3.90), learning (β=-3.32), memory (β=-3.38), and working memory (β=-3.38). Generalized linear mixed effect models indicate that detectable plasma HIV RNA (β=-1.85) and low nadir CD4 T-cell counts (nadir CD4<200; β=-1.07) were associated with worse neurocognitive performance, and these effects did not differ in size or direction by substance use group.
Conclusions:
In PWH, lifetime methamphetamine use disorder and both current and legacy markers of HIV disease severity are associated with worse neurocognitive outcomes. Cannabis use disorder does not appear to exacerbate methamphetamine-related deficits in PWH. Instead, results are consistent with findings from preclinical studies that cannabis use may protect against methamphetamine’s deleterious effects. Profile analysis models showed that participants with a history of cannabis use disorder display better overall neurocognitive performance than comparison (M-C-) participants. Mechanisms underlying a potential protective effect of cannabis may be elucidated by examining the temporal relationship between cannabis and methamphetamine consumption and neurocognitive performance.
Methamphetamine and cannabis are two widely used, and frequently co-used, substances with possibly opposing effects on the central nervous system. Evidence of neurocognitive deficits related to use is robust for methamphetamine and mixed for cannabis. Findings regarding their combined use are inconclusive. We aimed to compare neurocognitive performance in people with lifetime cannabis or methamphetamine use disorder diagnoses, or both, relative to people without substance use disorders.
Method:
423 participants (71.9% male, aged 44.6 ± 14.2 years), stratified by presence or absence of lifetime methamphetamine (M−/M+) and/or cannabis (C−/C+) DSM-IV abuse/dependence, completed a comprehensive neuropsychological, substance use, and psychiatric assessment. Neurocognitive domain T-scores and impairment rates were examined using multiple linear and binomial regression, respectively, controlling for covariates that may impact cognition.
Results:
Globally, M+C+ performed worse than M−C− but better than M+C−. M+C+ outperformed M+C− on measures of verbal fluency, information processing speed, learning, memory, and working memory. M−C+ did not display lower performance than M−C− globally or on any domain measures, and M−C+ even performed better than M−C− on measures of learning, memory, and working memory.
Conclusions:
Our findings are consistent with prior work showing that methamphetamine use confers risk for worse neurocognitive outcomes, and that cannabis use does not appear to exacerbate and may even reduce this risk. People with a history of cannabis use disorders performed similarly to our non-substance-using comparison group and outperformed them in some domains. These findings warrant further investigation as to whether cannabis use may ameliorate methamphetamine neurotoxicity.
Background: Long-term care facility (LTCF) employees pose potential risk for COVID-19 outbreaks. The association between employee infection prevention (IP) adherence and facility COVID-19 outbreaks remains a knowledge gap. Methods: From April through December 2020, prior to COVID-19 vaccination, we tested asymptomatic Veterans’ Affairs (VA) community living center (CLC) residents twice weekly and employees monthly, which increased to weekly with known exposure, for SARS-CoV-2 via nasopharyngeal PCR. Employees voluntarily completed multiple-choice questionnaires assessing self-reported IP adherence at and outside work. Surveys were longitudinally administered in April, June, July, and October 2020. Changes in paired employee responses for each period were analyzed using the McNemar test. We obtained COVID-19 community rates from surrounding Davidson and Rutherford counties from the Tennessee Department of Health public data set. CLC resident COVID-19 cases were obtained from VA IP data. Incidence rate and number of positive tests were calculated. Results: Between April and December 2020, 444 employees completed at least 1 survey; 177 completed surveys in both April and June, 179 completed surveys in both June and July, and 140 completed surveys in both July and October (Fig. 1). Across periods, employee surveys demonstrated an increase in masking at work and outside work between April and June (63% to 95% [P < .01] and 36% to 63% [P < .01], respectively), and June to July (95% to 99% [P < .05] and 71% to 84% [P < .01], respectively) that were both maintained between July and October (Fig. 2). Distancing at work and limiting social contacts outside work significantly decreased from April to June but increased in subsequent periods, although not significantly. COVID-19 community incidence peaked in July and again in December, but CLC resident COVID-19 cases peaked in August, declined, and remained low through December (Fig. 3).
Discussion: Mask wearing at work, which was mandatory, increased, as did voluntary employee masking outside work. CLC COVID-19 cases mirrored community increases in July and August; however, community cases increased again later in 2020 while CLC cases remained low. Employees reporting distancing at work and limiting social contacts outside work decreased preceding the initial rise in CLC cases but increased and remained high after July. Conclusions: These data from the pre–COVID-19 vaccination era suggest that widespread, increased support for and emphasis on LTCF IP adherence, especially masking, may have effectively prevented COVID-19 outbreaks in the vulnerable LTCF population.
Bloodstream infections (BSIs) are a frequent cause of morbidity in patients with acute myeloid leukemia (AML), due in part to the presence of central venous access devices (CVADs) required to deliver therapy.
Objective:
To determine the differential risk of bacterial BSI during neutropenia by CVAD type in pediatric patients with AML.
Methods:
We performed a secondary analysis in a cohort of 560 pediatric patients (1,828 chemotherapy courses) receiving frontline AML chemotherapy at 17 US centers. The exposure was CVAD type at course start: tunneled externalized catheter (TEC), peripherally inserted central catheter (PICC), or totally implanted catheter (TIC). The primary outcome was course-specific incident bacterial BSI; secondary outcomes included mucosal barrier injury (MBI)-BSI and non-MBI BSI. Poisson regression was used to compute adjusted rate ratios comparing BSI occurrence during neutropenia by line type, controlling for demographic, clinical, and hospital-level characteristics.
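The adjusted rate ratios in this study come from Poisson regression with covariates; the quantity being generalised is the incidence rate ratio of BSIs per neutropenic day. A minimal unadjusted sketch with a Wald 95% CI follows; the event counts below are hypothetical, not the study's data.

```python
import math

def incidence_rate_ratio(events_a, days_a, events_b, days_b, z=1.96):
    """Unadjusted incidence rate ratio of group A vs group B.

    Assumes Poisson-distributed event counts; the standard error of
    the log IRR is sqrt(1/events_a + 1/events_b), giving a Wald CI
    on the log scale that is exponentiated back.
    """
    irr = (events_a / days_a) / (events_b / days_b)
    se = math.sqrt(1 / events_a + 1 / events_b)
    lo = math.exp(math.log(irr) - z * se)
    hi = math.exp(math.log(irr) + z * se)
    return irr, lo, hi

# Hypothetical counts: 20 BSIs over 2,000 neutropenic days (PICC)
# vs 10 BSIs over 1,000 neutropenic days (TEC)
irr, lo, hi = incidence_rate_ratio(20, 2000, 10, 1000)
```

With equal rates in both groups the IRR is 1.0 and the CI straddles 1, mirroring the null PICC-vs-TEC result reported below; the regression model adds covariate adjustment on top of this same rate comparison.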
Results:
The rate of BSI did not differ by CVAD type: 11 BSIs per 1,000 neutropenic days for TECs, 13.7 for PICCs, and 10.7 for TICs. After adjustment, there was no statistically significant association between CVAD type and BSI: PICC incident rate ratio [IRR] = 1.00 (95% confidence interval [CI], 0.75–1.32) and TIC IRR = 0.83 (95% CI, 0.49–1.41) compared to TEC. When MBI and non-MBI were examined separately, results were similar.
Conclusions:
In this large, multicenter cohort of pediatric AML patients, we found no difference in the rate of BSI during neutropenia by CVAD type. This may be due to a risk-profile for BSI that is unique to AML patients.
Objective:
To understand how the built environment can affect safety and efficiency outcomes during doffing of personal protective equipment (PPE) in the context of coronavirus disease 2019 (COVID-19) patient care.
Study design:
We conducted (1) field observations and surveys administered to healthcare workers (HCWs) performing PPE doffing, (2) focus groups with HCWs and infection prevention experts, and (3) a design charrette with healthcare design experts.
Settings:
This study was conducted in 4 inpatient units treating patients with COVID-19, in 3 hospitals of a single healthcare system.
Participants:
The study included 24 nurses, 2 physicians, 1 respiratory therapist, and 2 infection preventionists.
Results:
The doffing task sequence and the layout of doffing spaces varied considerably across sites, with field observations showing most doffing tasks occurring around the patient room door and PPE support stations. Behaviors perceived as most risky included touching contaminated items and inadequate hand hygiene. Doffing space layout and types of PPE storage and work surfaces were often associated with inadequate cleaning and improper storage of PPE. Focus groups and the design charrette provided insights on how design affording standardization, accessibility, and flexibility can support PPE doffing safety and efficiency in this context.
Conclusions:
There is a need to define, organize and standardize PPE doffing spaces in healthcare settings and to understand the environmental implications of COVID-19–specific issues related to supply shortage and staff workload. Low-effort and low-cost design adaptations of the layout and design of PPE doffing spaces may improve HCW safety and efficiency in existing healthcare facilities.
The rapid growth in web-based grocery food purchasing has outpaced federal regulatory attention to the online provision of nutrition and allergen information historically required on food product labels. We sought to characterise the extent to which, and how consistently, online retailers disclose required and regulated information, and to identify the legal authorities for the federal government to require online food retailers to disclose such information.
Design:
We performed a limited scan of ten products across nine national online retailers and conducted legal research using LexisNexis to analyse federal regulatory agencies’ authorities.
Setting:
USA.
Participants:
N/A.
Results:
The scan of products revealed that required information (Nutrition Facts Panels, ingredient lists, common food allergens and per cent juice for fruit drinks) was present, conspicuous and legible for an average of only 36·5 % of the products surveyed, ranging from 11·4 % for potential allergens to 54·2 % for ingredients lists. More commonly, voluntary nutrition-related claims were prominently and conspicuously displayed (63·5 % across retailers and products). Our legal examination found that the Food and Drug Administration, Federal Trade Commission and United States Department of Agriculture have existing regulatory authority over labelling, online sales and advertising, and Supplemental Nutrition Assistance Programme retailers that can be utilised to address deficiencies in the provision of required information in the online food retail environment.
Conclusions:
Information regularly provided to consumers in conventional settings is not being uniformly provided online. Congress or the federal agencies can require online food retailers to disclose required nutrition and allergen information to support health, nutrition, equity and informed consumer decision-making.
Jennifer Morgan describes how civil society worked to secure a fair and ambitious multilateral agreement as seen from her perspective as Global Director of the Climate Program at the World Resources Institute at the time. In her view, civil society had four different roles and functions: civil society (1) gathered together idea generators, analysts, and researchers ahead of time, (2) provided informal intelligence and diplomatic service combined with high-trust networks, (3) brought the voices of people into the negotiations, and (4) explained COP 21’s complexities and alerted the outside world. Morgan concludes that no single function or role of civil society made the difference in Paris, but the combination of them all did so. She highlights the “bursting of the UNFCCC bubble” as a main function and assesses that the 1.5°C goal came into the Paris Agreement as a result of people’s voices being heard and listened to. Without the NGO contribution, it would have been much more difficult, perhaps impossible, to sow the seeds of change. Morgan suggests that civil society can use its principled and objective-based approach to its advantage and listen to allies in their movement.
Recent cannabis exposure has been associated with lower rates of neurocognitive impairment in people with HIV (PWH). Cannabis’s anti-inflammatory properties may underlie this relationship by reducing chronic neuroinflammation in PWH. This study examined relations between cannabis use and inflammatory biomarkers in cerebrospinal fluid (CSF) and plasma, and cognitive correlates of these biomarkers within a community-based sample of PWH.
Methods:
263 individuals were categorized into four groups: HIV− non-cannabis users (n = 65), HIV+ non-cannabis users (n = 105), HIV+ moderate cannabis users (n = 62), and HIV+ daily cannabis users (n = 31). Differences in pro-inflammatory biomarkers (IL-6, MCP-1/CCL2, IP-10/CXCL10, sCD14, sTNFR-II, TNF-α) by study group were determined by Kruskal–Wallis tests. Multivariable linear regressions examined relationships between biomarkers and seven cognitive domains, adjusting for age, sex/gender, race, education, and current CD4 count.
Results:
HIV+ daily cannabis users showed lower MCP-1 and IP-10 levels in CSF compared to HIV+ non-cannabis users (p = .015; p = .039) and were similar to HIV− non-cannabis users. Plasma biomarkers showed no differences by cannabis use. Among PWH, lower CSF MCP-1 and lower CSF IP-10 were associated with better learning performance (all ps < .05).
Conclusions:
Current daily cannabis use was associated with lower levels of pro-inflammatory chemokines implicated in HIV pathogenesis and these chemokines were linked to the cognitive domain of learning which is commonly impaired in PWH. Cannabinoid-related reductions of MCP-1 and IP-10, if confirmed, suggest a role for medicinal cannabis in the mitigation of persistent inflammation and cognitive impacts of HIV.
To develop an international template to support patient submissions in Health Technology Assessments (HTAs). This was to be based on the experience and feedback from the implementation and use of the Scottish Medicines Consortium's (SMC) Summary Information for Patient Groups (SIP).
Methods
To gather feedback on the SMC experience, web-based surveys were conducted with pharmaceutical companies and patient groups familiar with the SMC SIP. Semistructured interviews with representatives from HTA bodies were undertaken, along with patient group discussions with those less familiar with the SIP, to explore issues around the approach. These qualitative data informed the development of an international SIP template.
Results
Survey data indicated that 82 percent (18 of 22) of pharmaceutical company respondents felt that the SIP was worthwhile; 88 percent (15 of 17) of patient group respondents found the SIP helpful. Both groups highlighted the need for additional support and guidance around plain language summaries. Further suggestions included provision of a glossary of terms and cost-effectiveness information. Patient group interviews supported the survey findings and led to the development of a new template. HTA bodies raised potential challenges around buy-in, timing, and bias connected to the SIP approach.
Conclusions
The international SIP template is another approach to support deliberative processes in HTA. Although challenges remain around writing summaries for lay audiences, along with feasibility considerations for HTA bodies, the SIP approach should support more meaningful patient involvement in HTAs.
A total of 5478 fishes were sampled between 2009 and 2020 to assess length–weight, length–length and weight–weight relationships in 39 marine species from 10 families caught in Seychelles waters by the artisanal fishery. Two types of length (total length TL, fork length FL) and three types of weight (whole weight WT, gutted weight GW and gilled-gutted weight GGW) were measured. The parameters of the relationships were estimated using the log-transformed allometric model with bias correction. Our results include length–weight, length–length and weight–weight relationships for 39, 20 and 18 species, respectively. Our length–weight data and resulting relationships were compared against the FishBase database for 36 species: they fell within the Bayesian 95% confidence intervals of the available relationships for 33 species, and above them for Gnathanodon speciosus, Lutjanus gibbus and Variola louti. Finally, for five abundant and widely dispersed species we tested for spatial differences in morphometric relationships between the Mahé Plateau and three southern atoll groups. Significant differences were found for only two species, and their magnitude was small. We thus argue that regression relationships based on pooled data can be used for most types of population and community analyses. The availability of these morphometric relationships will support accurate size-based analyses of Seychelles fisheries survey data, enhance understanding of the ecology of the reef-associated fish component of marine ecosystems and food webs, and improve fisheries research and management.
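The "log-transformed allometric model with bias correction" is the standard length–weight fit W = a·L^b, estimated by ordinary least squares on log-transformed data and corrected for back-transformation bias. A minimal sketch (a hypothetical helper, not the authors' code; exp(s²/2) is the common Baskerville-type correction, which may differ in detail from the one the authors applied):

```python
import math

def fit_length_weight(lengths, weights):
    """Fit W = a * L**b by OLS on log-log data, applying the usual
    back-transformation bias correction exp(s2 / 2) to the intercept."""
    x = [math.log(v) for v in lengths]
    y = [math.log(v) for v in weights]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    log_a = my - b * mx
    resid = [yi - (log_a + b * xi) for xi, yi in zip(x, y)]
    s2 = sum(r * r for r in resid) / (n - 2)  # residual variance on log scale
    return math.exp(log_a) * math.exp(s2 / 2), b  # (a, b)
```

For exactly allometric data the correction vanishes; on real samples it removes the downward bias introduced by exponentiating the fitted log-weight.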
Optical tracking systems typically trade off astrometric precision against field of view. In this work, we showcase a networked approach to optical tracking, using very wide field-of-view imagers of relatively low astrometric precision, applied to the OSIRIS-REx slingshot manoeuvre around Earth on 22 September 2017. As part of a trajectory designed to deliver OSIRIS-REx to NEO 101955 Bennu, this flyby event was viewed from 13 remote sensors spread across Australia and New Zealand to enable triangulation of the observations. Each observatory in this portable network was constructed to be as lightweight and portable as possible, with hardware based on the successful design of the Desert Fireball Network. Over a 4-h collection window, we gathered 15 439 images of the night sky in the predicted direction of the OSIRIS-REx spacecraft. Using a specially developed streak-detection and orbit-determination data pipeline, we detected 2 090 line-of-sight observations. Our fitted orbit was within about 10 km of orbital telemetry along the observed 109 262 km of the OSIRIS-REx trajectory, demonstrating the capability of a networked approach to Space Surveillance and Tracking.
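Combining line-of-sight observations from several stations into a single position is, at its core, a least-squares triangulation. A minimal sketch (hypothetical, not the network's actual pipeline): the point x minimising the summed squared distance to sight lines x = p + t·d satisfies Σ(I − d dᵀ)x = Σ(I − d dᵀ)p.

```python
def triangulate(stations, directions):
    """Least-squares intersection of sight lines x = p + t*d,
    with p a station position and d a unit direction vector."""
    A = [[0.0] * 3 for _ in range(3)]
    b = [0.0] * 3
    for p, d in zip(stations, directions):
        for i in range(3):
            for j in range(3):
                m = (1.0 if i == j else 0.0) - d[i] * d[j]  # I - d d^T
                A[i][j] += m
                b[i] += m * p[j]
    # solve the 3x3 system A x = b by Gauss-Jordan with partial pivoting
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for c in range(3):
        piv = max(range(c, 3), key=lambda r: abs(M[r][c]))
        M[c], M[piv] = M[piv], M[c]
        for r in range(3):
            if r != c:
                f = M[r][c] / M[c][c]
                M[r] = [vr - f * vc for vr, vc in zip(M[r], M[c])]
    return [M[i][3] / M[i][i] for i in range(3)]
```

Two non-parallel rays already make the system invertible; additional stations simply over-determine the fit, which is what makes a 13-sensor network robust to per-station astrometric noise.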
Following Canadian estimates of frailty, academic researchers and the Chiefs of Ontario came together to create the first Ontario-wide profile of aging in First Nations people. Using self-reported data from First Nations adults who participated in the Ontario First Nations Regional Health Survey Phase 2, we found that First Nations people in Ontario experience higher rates of frailty than the general Canadian population, and that early-onset frailty appears to affect First Nations communities. This is important to consider as communities plan for the health care needs of an aging population, and is particularly relevant in the face of COVID-19, whose severity is known to be exacerbated by underlying health conditions.
Young-onset dementia (YOD) is defined as the onset of dementia symptoms before the age of 65 years and accounts for 2–8% of dementia. YOD patients and their caregivers face unique challenges in diagnosis and management. We aimed to compare the characteristics of rural YOD and late-onset dementia (LOD) patients at a rural and remote memory clinic in Western Canada.
Methods:
A total of 333 consecutive patients (YOD = 61, LOD = 272) seen at a rural and remote memory clinic between March 2004 and July 2016 were included in this study. Each patient underwent neuropsychological assessment. Health, mood, function, behaviour and social factors were also measured. The two groups were compared using χ2 tests and independent-samples t-tests.
Results:
YOD patients were more likely to be married, employed, current smokers and highly educated. They reported fewer cognitive symptoms, but had more depressive symptoms. YOD patients were less likely to live alone and use homecare services. YOD caregivers were also more likely to be a spouse and had higher levels of distress than LOD caregivers. Both YOD and LOD patient groups were equally likely to have a driver’s licence.
Conclusions:
Our findings indicate YOD and LOD patients have distinct characteristics and services must be modified to better meet YOD patient needs. In particular, the use of homecare services and caregiver support may alleviate the higher levels of distress found in YOD patients and their caregivers. Additional research should be directed to addressing YOD patient depression, caregiver distress and barriers to services.
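The group contrasts reported in this abstract rest on χ2 tests for categorical variables and independent-samples comparisons for continuous ones. A pure-Python sketch of both statistics (illustrative only; standard practice would be `scipy.stats.chi2_contingency` and `scipy.stats.ttest_ind`):

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

def welch_t(x, y):
    """Welch's t statistic for two independent samples
    (does not assume equal variances)."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)
    return (mx - my) / math.sqrt(vx / nx + vy / ny)
```

With unequal group sizes such as 61 versus 272, the Welch form is the safer default, since pooled-variance t-tests are sensitive to variance differences when groups are unbalanced.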