Flowering rush (Butomus umbellatus L.) is an emergent perennial monocot that has invaded aquatic systems along the U.S.–Canadian border. Currently, there are two known cytotypes of flowering rush, diploid and triploid, within the invaded range. Although most studies have focused on the triploid cytotype, little is known about diploid plants. Therefore, phenology and resource allocation were studied on the diploid cytotype of flowering rush in three study sites (Mentor Marsh, OH; Tonawanda Wildlife Management Area, NY; and Unity Island, NY) to understand seasonal resource allocation and environmental influences on growth, and to optimize management strategies. Samples were harvested once a month from May to November at each site from 2021 to 2023. Plant metrics were regressed against air temperature, water temperature, and water depth. Aboveground biomass peaked from July to September and comprised 50% to 70% of total biomass. Rhizome biomass peaked from September to November and comprised 40% to 50% of total biomass. Rhizome bulbil densities peaked from September to November at 3,000 to 16,000 rhizome bulbils m−2. Regression analysis revealed strong negative relationships between rhizome starch content and air temperature (r2 = 0.52) and water temperature (r2 = 0.46). Other significant, though weak, relationships were found, including a positive relationship between aboveground biomass and air temperature (r2 = 0.17), a negative relationship between rhizome bulbil biomass and air temperature (r2 = 0.18), and a positive relationship between leaf density and air temperature (r2 = 0.17). Rhizomes and rhizome bulbils combined stored up to 60% of total starch and therefore present a unique challenge to management, as these structures cannot be reached directly with herbicides. Therefore, management should target the aboveground tissue before peak production (July) to reduce internal starch storage and aim to limit regrowth over several years.
Major depressive disorder (MDD) is the leading cause of disability globally, with moderate heritability and well-established socio-environmental risk factors. Genetic studies have been mostly restricted to European settings, with polygenic scores (PGS) demonstrating low portability across diverse global populations.
Methods
This study examines genetic architecture, polygenic prediction, and socio-environmental correlates of MDD in a family-based sample of 10 032 individuals from Nepal with array genotyping data. We used genome-based restricted maximum likelihood to estimate heritability, applied S-LDXR to estimate the cross-ancestry genetic correlation between Nepalese and European samples, and modeled PGS trained on a GWAS meta-analysis of European and East Asian ancestry samples.
Results
We estimated the narrow-sense heritability of lifetime MDD in Nepal to be 0.26 (95% CI 0.18–0.34, p = 8.5 × 10−6). Our analysis was underpowered to estimate the cross-ancestry genetic correlation (rg = 0.26, 95% CI −0.29 to 0.81). MDD risk was associated with higher age (beta = 0.071, 95% CI 0.06–0.08), female sex (beta = 0.160, 95% CI 0.15–0.17), and childhood exposure to potentially traumatic events (beta = 0.050, 95% CI 0.03–0.07), while neither the depression PGS (beta = 0.004, 95% CI −0.004 to 0.01) nor its interaction with childhood trauma (beta = 0.007, 95% CI −0.01 to 0.03) was strongly associated with MDD.
Conclusions
Estimates of lifetime MDD heritability in this Nepalese sample were similar to previous European ancestry samples, but PGS trained on European data did not predict MDD in this sample. This may be due to differences in ancestry-linked causal variants, differences in depression phenotyping between the training and target data, or setting-specific environmental factors that modulate genetic effects. Additional research among under-represented global populations will ensure equitable translation of genomic findings.
Traumatic brain injury is one of several recognized risk factors for cognitive decline and neurodegenerative disease. Currently, risk scores involving modifiable risk/protective factors for dementia have not incorporated head injury history as part of their overall weighted risk calculation. We investigated the association between the LIfestyle for BRAin Health (LIBRA) risk score with odds of mild cognitive impairment (MCI) diagnosis and cognitive function in older former National Football League (NFL) players, both with and without the influence of concussion history.
Participants and Methods:
Former NFL players, ages ≥ 50 (N=1050; mean age=61.1±5.4 years), completed a general health survey including self-reported medical history and ratings of function across several domains. LIBRA factors (weighted value) included cardiovascular disease (+1.0), hypertension (+1.6), hyperlipidemia (+1.4), diabetes (+1.3), kidney disease (+1.1), cigarette use history (+1.5), obesity (+1.6), depression (+2.1), social/cognitive activity (-3.2), physical inactivity (+1.1), low/moderate alcohol use (-1.0), and healthy diet (-1.7). Within Group 1 (n=761), logistic regression models assessed the association of LIBRA scores and the independent contribution of concussion history with the odds of MCI diagnosis. A modified-LIBRA score incorporated concussion history at the level at which planned contrasts showed significant associations across concussion history groups (0, 1-2, 3-5, 6-9, 10+). The weighted value for concussion history (+1.9) within the modified-LIBRA score was based on its proportional contribution to dementia relative to other LIBRA risk factors, as proposed by the 2020 Lancet Commission Report on Dementia Prevention. Associations of the modified-LIBRA score with odds of MCI and cognitive function were assessed via logistic and linear regression, respectively, in a subset of the sample (Group 2; n=289) who also completed the Brief Test of Adult Cognition by Telephone (BTACT). Race was included as a covariate in all models.
Results:
The median LIBRA score in Group 1 was 1.6 (IQR = -1, 3.6). Standard and modified-LIBRA median scores were 1.1 (IQR = -1.3, 3.3) and 2.0 (IQR = -0.4, 4.6), respectively, within Group 2. In Group 1, LIBRA score was significantly associated with odds of MCI diagnosis (odds ratio [95% confidence interval]; OR = 1.27 [1.19, 1.28], p < .001). Concussion history provided additional information beyond LIBRA scores and was independently associated with odds of MCI; specifically, odds of MCI were higher among those with 6-9 (OR = 2.54 [1.21, 5.32], p < .001) and 10+ (OR = 4.55 [2.21, 9.36], p < .001) concussions, compared with those with no prior concussions. Within Group 2, the modified-LIBRA score was associated with higher odds of MCI (OR = 1.61 [1.15, 2.25]) and incrementally improved model information (0.04 increase in Nagelkerke R2) above standard LIBRA scores in the same model. Modified-LIBRA scores were inversely associated with BTACT Executive Function (B = -0.53 [0.08], p = .002) and Episodic Memory scores (B = -0.53 [0.08], p = .002).
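The "0.04 increase in Nagelkerke R2" reported above is a standard way to quantify how much a new predictor improves a logistic model. As a minimal sketch of how that statistic is computed from model log-likelihoods (the log-likelihood values below are hypothetical illustrations, not taken from the study):

```python
import math

def nagelkerke_r2(ll_null, ll_model, n):
    """Nagelkerke pseudo-R^2: Cox-Snell R^2 rescaled to a 0-1 range."""
    cox_snell = 1.0 - math.exp((2.0 / n) * (ll_null - ll_model))
    max_possible = 1.0 - math.exp((2.0 / n) * ll_null)
    return cox_snell / max_possible

# Hypothetical log-likelihoods for a sample of n = 289 (illustrative only):
# a base model, and the same model with one extra predictor added.
base = nagelkerke_r2(ll_null=-180.0, ll_model=-165.0, n=289)
extended = nagelkerke_r2(ll_null=-180.0, ll_model=-160.0, n=289)
delta = extended - base  # the kind of increment the abstract reports
```

Comparing `extended - base` between nested models is how an "incremental improvement in model information" like the one above would typically be obtained.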
Conclusions:
Numerous modifiable risk/protective factors for dementia are reported in former professional football players, but incorporating concussion history may aid the multifactorial appraisal of cognitive decline risk and identification of areas for prevention and intervention. Integration of multi-modal biomarkers will advance this person-centered, holistic approach toward dementia reduction, detection, and intervention.
Traumatic brain injury and cardiovascular disease (CVD) are modifiable risk factors for cognitive decline and dementia. Greater concussion history can potentially increase risk for cerebrovascular changes associated with cognitive decline and may compound effects of CVD. We investigated the independent and dynamic effects of CVD/risk factor burden and concussion history on cognitive function and odds of mild cognitive impairment (MCI) diagnoses in older former National Football League (NFL) players.
Participants and Methods:
Former NFL players, ages 50-70 (N=289; mean age=61.02±5.33 years), reported medical history and completed the Brief Test of Adult Cognition by Telephone (BTACT). CVD/risk factor burden was characterized as ordinal (0-3+) based on the sum of the following conditions: coronary artery disease/myocardial infarction, chronic obstructive pulmonary disease, hypertension, hyperlipidemia, sleep apnea, type-I and II diabetes. Cognitive outcomes included BTACT Executive Function and Episodic Memory Composite Z-scores (standardized on age- and education-based normative data), and the presence of physician diagnosed (self-reported) MCI. Concussion history was discretized into five groups: 0, 1-2, 3-5, 6-9, 10+. Linear and logistic regression models were fit to test independent and joint effects of concussion history and CVD burden on cognitive outcomes and odds of MCI. Race (dichotomized as White and Non-white due to sample distribution) was included in models as a covariate.
Results:
Greater CVD burden (unstandardized beta [standard error]; B=-0.10[0.42], p=.013) and race (B=0.622[0.09], p<.001) were associated with lower executive functioning. Compared to those with 0 prior concussions, no significant differences were observed for those with 1-2, 3-5, 6-9, or 10+ prior concussions (ps >.05). Race (B=0.61[.13], p<.001), but not concussion history or CVD burden, was associated with episodic memory. There was a trend for lower episodic memory scores among those with 10+ prior concussions compared to those with no prior concussions (B=-0.49[.25], p=.052). There were no significant differences in episodic memory among those with 1-2, 3-5, or 6-9 prior concussions compared to those with 0 prior concussions (ps>.05). CVD burden (B=0.35[.13], p=.008), race (greater odds in Non-white group; B=0.82[.29], p=.005), and greater concussion history (higher odds of diagnosis in the 10+ group compared to those with 0 prior concussions; B=2.19[0.78], p<.005) were associated with higher odds of MCI diagnosis. Significant interaction effects between concussion history and CVD burden were not observed for any outcome (ps >.05).
Conclusions:
Lower executive functioning and higher odds of MCI diagnosis were associated with higher CVD burden and race. Very high concussion history (10+) was selectively associated with higher odds of MCI diagnosis. Reduction of these modifiable factors may mitigate adverse outcomes in older contact sport athletes. In former athletes, consideration of CVD burden is particularly pertinent when assessing executive dysfunction, considered to be a common cognitive feature of traumatic encephalopathy syndrome, as designated by the recent diagnostic criteria. Further research should investigate the social and structural determinants contributing to racial disparities in long-term health outcomes within former NFL players.
The U.S. Department of Agriculture–Agricultural Research Service (USDA-ARS) has been a leader in weed science research covering topics ranging from the development and use of integrated weed management (IWM) tactics to basic mechanistic studies, including biotic resistance of desirable plant communities and herbicide resistance. ARS weed scientists have worked in agricultural and natural ecosystems, including agronomic and horticultural crops, pastures, forests, wild lands, aquatic habitats, wetlands, and riparian areas. Through strong partnerships with academia, state agencies, private industry, and numerous federal programs, ARS weed scientists have made contributions to discoveries in the newest fields of robotics and genetics, as well as the traditional and fundamental subjects of weed–crop competition and physiology and integration of weed control tactics and practices. Weed science at ARS is often overshadowed by other research topics; thus, few are aware of the long history of ARS weed science and its important contributions. This review is the result of a symposium held at the Weed Science Society of America’s 62nd Annual Meeting in 2022 that included 10 separate presentations in a virtual Weed Science Webinar Series. The overarching themes of management tactics (IWM, biological control, and automation), basic mechanisms (competition, invasive plant genetics, and herbicide resistance), and ecosystem impacts (invasive plant spread, climate change, conservation, and restoration) represent core ARS weed science research that is dynamic and efficacious and has been a significant component of the agency’s national and international efforts. This review highlights current studies and future directions that exemplify the science and collaborative relationships both within and outside ARS. 
Given the constraints of weeds and invasive plants on all aspects of food, feed, and fiber systems, there is an acknowledged need to face new challenges, including agriculture and natural resources sustainability, economic resilience and reliability, and societal health and well-being.
Prior trials suggest that intravenous racemic ketamine is highly effective for treatment-resistant depression (TRD), but phase 3 trials of racemic ketamine are needed.
Aims
To assess the acute efficacy and safety of a 4-week course of subcutaneous racemic ketamine in participants with TRD. Trial registration: ACTRN12616001096448 at www.anzctr.org.au.
Method
This phase 3, double-blind, randomised, active-controlled multicentre trial was conducted at seven mood disorders centres in Australia and New Zealand. Participants received twice-weekly subcutaneous racemic ketamine or midazolam for 4 weeks. Initially, the trial tested fixed-dose ketamine 0.5 mg/kg versus midazolam 0.025 mg/kg (cohort 1). Dosing was revised, after a Data Safety Monitoring Board recommendation, to flexible-dose ketamine 0.5–0.9 mg/kg or midazolam 0.025–0.045 mg/kg, with response-guided dosing increments (cohort 2). The primary outcome was remission (Montgomery-Åsberg Depression Rating Scale score ≤10) at the end of week 4.
Results
The final analysis (participants who received at least one treatment) comprised 68 participants in cohort 1 (fixed dose) and 106 in cohort 2 (flexible dose). Ketamine was more efficacious than midazolam in cohort 2 (remission rate 19.6% v. 2.0%; OR = 12.1, 95% CI 2.1–69.2, P = 0.005), but not different in cohort 1 (remission rate 6.3% v. 8.8%; OR = 1.3, 95% CI 0.2–8.2, P = 0.76). Ketamine was well tolerated. Acute adverse effects (psychotomimetic effects, blood pressure increases) resolved within 2 h.
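An odds ratio like the cohort 2 result above can be sanity-checked directly from the two remission proportions. Note that the trial's reported ORs come from its statistical model (and may be adjusted), so a raw recomputation is only an approximation:

```python
def odds_ratio(p_treat, p_control):
    """Unadjusted odds ratio from two event proportions."""
    odds_treat = p_treat / (1.0 - p_treat)
    odds_control = p_control / (1.0 - p_control)
    return odds_treat / odds_control

# Cohort 2 remission rates from the abstract: 19.6% ketamine vs 2.0% midazolam
or_cohort2 = odds_ratio(0.196, 0.020)  # ~11.9, close to the reported OR of 12.1
```

The small gap between the raw value and the reported 12.1 is expected whenever the published OR comes from a regression model rather than a 2x2 table.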
Conclusions
Adequately dosed subcutaneous racemic ketamine was efficacious and safe in treating TRD over a 4-week treatment period. The subcutaneous route is practical and feasible.
Background: Sex differences in treatment response to intravenous thrombolysis (IVT) are poorly characterized. We compared sex-disaggregated outcomes in patients receiving IVT for acute ischemic stroke in the Alteplase Compared to Tenecteplase (AcT) trial, a Canadian multicentre, randomised trial. Methods: In this post-hoc analysis, the primary outcome was excellent functional outcome (modified Rankin Score [mRS] 0-1) at 90 days. Secondary and safety outcomes included return to baseline function, successful reperfusion (eTICI≥2b), death, and symptomatic intracerebral hemorrhage. Results: Of 1577 patients, there were 755 women and 822 men (median age 77 [68-86] and 70 [59-79] years, respectively). There were no differences in rates of mRS 0-1 (aRR 0.95 [0.86-1.06]), return to baseline function (aRR 0.94 [0.84-1.06]), reperfusion (aRR 0.98 [0.80-1.19]), or death (aRR 0.91 [0.79-1.18]). There was no effect modification by treatment type on the association between sex and outcomes. The probability of excellent functional outcome decreased with increasing onset-to-needle time. This relation did not vary by sex (p for interaction = 0.42). Conclusions: The AcT trial demonstrated comparable functional, safety, and angiographic outcomes by sex. This effect did not differ between alteplase and tenecteplase. The pragmatic enrolment and broad national participation in AcT provide reassurance that there do not appear to be sex differences in outcomes amongst Canadians receiving IVT.
Placement of fertilizer in the seed furrow to increase nutrient availability is a common practice in row-crop production. While in-furrow application of fertilizer is widely utilized in the production of winter wheat (Triticum aestivum L.), there is a lack of work evaluating the new formulations and nutrient combinations that are available. The objective of this study was to quantify the effects of in-furrow fertilizer products, and combinations of products, on winter wheat grain yield and nitrogen and mineral concentrations. Trials were conducted across five site-years in central Oklahoma using 11 fertilizer formulations placed in-furrow at the time of planting. At locations where soil test phosphorus (STP) or potassium levels were above sufficiency, the use of in-furrow fertilizers did not improve yield over the control. Inconsistency of response was noted at locations where STP levels were below the critical threshold. While one location showed no response to the addition of P regardless of source, two other locations had significant yield responses from three or more P-containing fertilizers. The addition of both sulphur and zinc resulted in increased yield over the base product at one low-STP location. Nutrient concentrations were also influenced in nutrient-limited soils; however, no trends in response were present. Based upon the results of this study, the application of in-furrow fertilizer has the potential to increase winter wheat grain yield and nutrient concentration when soil nutrients are limiting. As expected, the addition of fertilizer when soil test levels are at or above sufficiency did not increase grain yield.
Neuropsychopharmacologic effects of long-term opioid therapy (LTOT) in the context of chronic pain may result in subjective anhedonia coupled with decreased attention to natural rewards. Yet, there are no known efficacious treatments for anhedonia and reward deficits associated with chronic opioid use. Mindfulness-Oriented Recovery Enhancement (MORE), a novel behavioral intervention combining training in mindfulness with savoring of natural rewards, may hold promise for treating anhedonia in LTOT.
Methods
Veterans receiving LTOT (N = 63) for chronic pain were randomized to 8 weeks of MORE or a supportive group (SG) psychotherapy control. Before and after the 8-week treatment groups, we assessed the effects of MORE on the late positive potential (LPP) of the electroencephalogram and skin conductance level (SCL) during viewing and up-regulating responses (i.e. savoring) to natural reward cues. We then examined whether these neurophysiological effects were associated with reductions in subjective anhedonia by 4-month follow-up.
Results
Patients treated with MORE demonstrated significantly increased LPP and SCL to natural reward cues and greater decreases in subjective anhedonia relative to those in the SG. The effect of MORE on reducing anhedonia was statistically mediated by increases in LPP response during savoring.
Conclusions
MORE enhances motivated attention to natural reward cues among chronic pain patients on LTOT, as evidenced by increased electrocortical and sympathetic nervous system responses. Given neurophysiological evidence of clinical target engagement, MORE may be an efficacious treatment for anhedonia among chronic opioid users, people with chronic pain, and those at risk for opioid use disorder.
Studying phenotypic and genetic characteristics of age at onset (AAO) and polarity at onset (PAO) in bipolar disorder can provide new insights into disease pathology and facilitate the development of screening tools.
Aims
To examine the genetic architecture of AAO and PAO and their association with bipolar disorder disease characteristics.
Method
Genome-wide association studies (GWASs) and polygenic score (PGS) analyses of AAO (n = 12 977) and PAO (n = 6773) were conducted in patients with bipolar disorder from 34 cohorts and a replication sample (n = 2237). The association of onset with disease characteristics was investigated in two of these cohorts.
Results
Earlier AAO was associated with a higher probability of psychotic symptoms, suicidality, lower educational attainment, not living together and fewer episodes. Depressive onset correlated with suicidality and manic onset correlated with delusions and manic episodes. Systematic differences in AAO between cohorts and continents of origin were observed. This was also reflected in single-nucleotide variant-based heritability estimates, with higher heritabilities for stricter onset definitions. Increased PGS for autism spectrum disorder (β = −0.34 years, s.e. = 0.08), major depression (β = −0.34 years, s.e. = 0.08), schizophrenia (β = −0.39 years, s.e. = 0.08), and educational attainment (β = −0.31 years, s.e. = 0.08) were associated with an earlier AAO. The AAO GWAS identified one significant locus, but this finding did not replicate. Neither GWAS nor PGS analyses yielded significant associations with PAO.
Conclusions
AAO and PAO are associated with indicators of bipolar disorder severity. Individuals with an earlier onset show an increased polygenic liability for a broad spectrum of psychiatric traits. Systematic differences in AAO across cohorts, continents and phenotype definitions introduce significant heterogeneity, affecting analyses.
Substantial progress has been made in the standardization of nomenclature for paediatric and congenital cardiac care. In 1936, Maude Abbott published her Atlas of Congenital Cardiac Disease, which was the first formal attempt to classify congenital heart disease. The International Paediatric and Congenital Cardiac Code (IPCCC) is now utilized worldwide and has most recently become the paediatric and congenital cardiac component of the Eleventh Revision of the International Classification of Diseases (ICD-11). The most recent publication of the IPCCC was in 2017. This manuscript provides an updated 2021 version of the IPCCC.
The International Society for Nomenclature of Paediatric and Congenital Heart Disease (ISNPCHD), in collaboration with the World Health Organization (WHO), developed the paediatric and congenital cardiac nomenclature that is now within the eleventh version of the International Classification of Diseases (ICD-11). This unification of IPCCC and ICD-11 is the IPCCC ICD-11 Nomenclature and is the first time that the clinical nomenclature for paediatric and congenital cardiac care and the administrative nomenclature for paediatric and congenital cardiac care are harmonized. The resultant congenital cardiac component of ICD-11 was increased from 29 congenital cardiac codes in ICD-9 and 73 congenital cardiac codes in ICD-10 to 318 codes submitted by ISNPCHD through 2018 for incorporation into ICD-11. After these 318 terms were incorporated into ICD-11 in 2018, the WHO ICD-11 team added an additional 49 terms, some of which are acceptable legacy terms from ICD-10, while others provide greater granularity than the ISNPCHD thought was originally acceptable. Thus, the total number of paediatric and congenital cardiac terms in ICD-11 is 367. In this manuscript, we describe and review the terminology, hierarchy, and definitions of the IPCCC ICD-11 Nomenclature. This article, therefore, presents a global system of nomenclature for paediatric and congenital cardiac care that unifies clinical and administrative nomenclature.
The members of ISNPCHD realize that the nomenclature published in this manuscript will continue to evolve. The version of the IPCCC that was published in 2017 has evolved and changed, and it is now replaced by this 2021 version. In the future, ISNPCHD will again publish updated versions of IPCCC, as IPCCC continues to evolve.
Valuing a breeding female is a complicated process that depends on a combination of expected production costs, reproductive success, and calf values. A conceptual asset value model based on female characteristics as signals and net implicit marginal value expectations is developed. A hedonic model based on sequentially sold individuals at multiple Mississippi auction locations is estimated by panel regression. Among other findings, pregnant females are discounted in proportion to abortion risk, which decreases toward birth. A follow-up cost/benefit analysis indicates producers are better off from at-home pregnancy checking and selling only nonpregnant females or cow/calf pairs.
The Murchison Widefield Array (MWA) is an open access telescope dedicated to studying the low-frequency (80–300 MHz) southern sky. Since beginning operations in mid-2013, the MWA has opened a new observational window in the southern hemisphere enabling many science areas. The driving science objectives of the original design were to observe 21 cm radiation from the Epoch of Reionisation (EoR), explore the radio time domain, perform Galactic and extragalactic surveys, and monitor solar, heliospheric, and ionospheric phenomena. Altogether, 60+ programs recorded 20 000 h, producing 146 papers to date. In 2016, the telescope underwent a major upgrade resulting in alternating compact and extended configurations. Other upgrades, including digital back-ends and a rapid-response triggering system, have been developed since the original array was commissioned. In this paper, we review the major results from the prior operation of the MWA and then discuss the new science paths enabled by the improved capabilities. We group these science opportunities by the four original science themes but also include ideas for directions outside these categories.
The Murchison Widefield Array (MWA) is an electronically steered low-frequency (<300 MHz) radio interferometer, with a ‘slew’ time of less than 8 s. Low-frequency (∼100 MHz) radio telescopes are ideally suited for rapid response follow-up of transients due to their large field of view, the inverted spectrum of coherent emission, and the fact that the dispersion delay between a 1 GHz and 100 MHz pulse is on the order of 1–10 min for dispersion measures of 100–2000 pc/cm3. The MWA has previously been used to provide fast follow-up for transient events including gamma-ray bursts (GRBs), fast radio bursts (FRBs), and gravitational waves, using systems that respond to Gamma-ray Coordinates Network (GCN) packet-based notifications. We describe a system for automatically triggering MWA observations of such events, based on Virtual Observatory Event standard triggers, which is more flexible, capable, and accurate than previous systems. The system can respond to external multi-messenger triggers, which makes it well-suited to searching for prompt coherent radio emission from GRBs, the study of FRBs and gravitational waves, single pulse studies of pulsars, and rapid follow-up of high-energy superflares from flare stars. The new triggering system has the capability to trigger observations in both the regular correlator mode (limited to ≥0.5 s integrations) and using the Voltage Capture System (VCS, 0.1 ms integration) of the MWA, and represents a new mode of operation for the MWA. The upgraded standard correlator triggering capability has been in use since MWA observing semester 2018B (July–Dec 2018), and the VCS and buffered mode triggers will become available for observing in a future semester.
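The 1–10 min dispersion delay quoted above follows from the standard cold-plasma dispersion relation. A quick sketch using the conventional dispersion constant (≈4.149 ms GHz² pc⁻¹ cm³; the constant and function below are illustrative, not from the MWA triggering system itself):

```python
# Standard pulsar/FRB dispersion constant, in ms (assumption: the usual
# cold-plasma value; the abstract does not state it explicitly).
K_DM_MS = 4.149

def dispersion_delay_s(dm_pc_cm3, nu_lo_ghz, nu_hi_ghz):
    """Arrival-time delay (seconds) of the low-frequency pulse
    relative to the high-frequency pulse, for frequencies in GHz."""
    return K_DM_MS * dm_pc_cm3 * (nu_lo_ghz**-2 - nu_hi_ghz**-2) / 1000.0

# The regime described in the abstract: 100 MHz vs 1 GHz,
# dispersion measures of 100-2000 pc/cm^3.
lo = dispersion_delay_s(100, 0.1, 1.0)    # ~41 s  (~0.7 min)
hi = dispersion_delay_s(2000, 0.1, 1.0)   # ~820 s (~13.7 min)
```

Those few minutes of delay at 100 MHz are exactly what gives a rapidly slewing low-frequency array time to be on target before the dispersed pulse arrives.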
A new fossil site in a previously unexplored part of western Madagascar (the Beanka Protected Area) has yielded remains of many recently extinct vertebrates, including giant lemurs (Babakotia radofilai, Palaeopropithecus kelyus, Pachylemur sp., and Archaeolemur edwardsi), carnivores (Cryptoprocta spelea), the aardvark-like Plesiorycteropus sp., and giant ground cuckoos (Coua). Many of these represent considerable range extensions. Extant species that were extirpated from the region (e.g., Prolemur simus) are also present. Calibrated radiocarbon ages for 10 bones from extinct primates span the last three millennia. The largely undisturbed taphonomy of bone deposits supports the interpretation that many specimens fell in from a rock ledge above the entrance. Some primates and other mammals may have been prey items of avian predators, but human predation is also evident. Strontium isotope ratios (87Sr/86Sr) suggest that fossils were local to the area. Pottery sherds and bones of extinct and extant vertebrates with cut and chop marks indicate human activity in previous centuries. Scarcity of charcoal and human artifacts suggests only occasional visitation to the site by humans. The fossil assemblage from this site is unusual in that, while it contains many sloth lemurs, it lacks ratites, hippopotami, and crocodiles typical of nearly all other Holocene subfossil sites on Madagascar.
A 2018 workshop on the White Mountain Apache Tribe lands in Arizona examined ways to enhance investigations into cultural property crime (CPC) through applications of rapidly evolving methods from archaeological science. CPC (also looting, graverobbing) refers to unauthorized damage, removal, or trafficking in materials possessing blends of communal, aesthetic, and scientific values. The Fort Apache workshop integrated four generally partitioned domains of CPC expertise: (1) theories of perpetrators’ motivations and methods; (2) recommended practice in sustaining public and community opposition to CPC; (3) tactics and strategies for documenting, investigating, and prosecuting CPC; and (4) forensic sedimentology—uses of biophysical sciences to link sediments from implicated persons and objects to crime scenes. Forensic sedimentology served as the touchstone for dialogues among experts in criminology, archaeological sciences, law enforcement, and heritage stewardship. Field visits to CPC crime scenes and workshop deliberations identified pathways toward integrating CPC theory and practice with forensic sedimentology’s potent battery of analytic methods.
We apply two methods to estimate the 21-cm bispectrum from data taken within the Epoch of Reionisation (EoR) project of the Murchison Widefield Array (MWA). Using data acquired with the Phase II compact array allows a direct bispectrum estimate to be undertaken on the multiple redundantly spaced triangles of antenna tiles, as well as an estimate based on data gridded to the uv-plane. The direct and gridded bispectrum estimators are applied to 21 h of high-band (167–197 MHz; z = 6.2–7.5) data from the 2016 and 2017 observing seasons. Analytic predictions for the bispectrum bias and variance for point-source foregrounds are derived. We compare the output of these approaches, the foreground contribution to the signal, and future prospects for measuring the bispectra with redundant and non-redundant arrays. We find that some triangle configurations yield bispectrum estimates that are consistent with the expected noise level after 10 h, while equilateral configurations are strongly foreground-dominated. Careful choice of triangle configurations may be made to reduce foreground bias that hinders power spectrum estimators, and the 21-cm bispectrum may be accessible in less time than the 21-cm power spectrum for some wave modes, with detections in hundreds of hours.
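The "direct" estimator described above averages the triple product of visibilities over closed triangles of antennas. A toy stdlib-only sketch of that idea (the visibility model, noise levels, and sample count are invented for illustration; this is not the MWA pipeline):

```python
import random

random.seed(0)

def direct_bispectrum(triples):
    """Average the triple product of visibilities measured on a
    redundant class of closed baseline triangles."""
    total = sum(v1 * v2 * v3 for v1, v2, v3 in triples)
    return total / len(triples)

# Hypothetical complex visibilities: a constant sky signal plus
# independent complex Gaussian noise on each of the three baselines.
signal = 2.0 + 0.0j

def noisy_visibility():
    return signal + complex(random.gauss(0, 1), random.gauss(0, 1))

triples = [(noisy_visibility(), noisy_visibility(), noisy_visibility())
           for _ in range(20000)]
b_est = direct_bispectrum(triples)  # converges toward signal**3 = 8 + 0j
```

Because the noise terms are independent and zero-mean, they average away in the triple product, which is why repeated redundant triangles can reach the expected noise level after enough integration, as the abstract reports for some configurations.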