
Kaleidoscope

Published online by Cambridge University Press: 27 April 2020


Copyright © Royal College of Psychiatrists 2020

The practical implications of neuroscience research are often not immediately obvious; we examine post-traumatic stress disorder (PTSD) as a potential exemplar. Despite a raft of clinical data on PTSD, the underlying changes in brain networks are not well described. Contemporary neurophysiological models posit a deficit in learning and memory, particularly in the brain's fear circuitry. However, it is not clear why traumatic memories persist in some people but not in others – even when exposed to the same event. Writing in the journal Science, Mary et al worked with over a hundred survivors of the 2015 Paris terrorist attacks on the Bataclan theatre.[1] Some had developed PTSD (‘PTSD+’), some had not (‘PTSD–’), and both groups were compared with non-exposed healthy individuals as controls. All participants underwent functional neuroimaging to explore how their dorsolateral prefrontal cortices (DLPFC) regulated intrusive memories. In those without PTSD (PTSD– and controls), the right anterior DLPFC was shown to reduce responses in two key memory areas: the hippocampi and the precuneus. Specifically, functional coupling between these areas was reduced when individuals attempted to prevent intrusive memories from entering consciousness. In other words, these ‘healthy’ individuals were able to adaptively suppress memory activity; something the PTSD+ group was unable to do. The findings demonstrate the role of memory suppression systems in PTSD, with disruption both in the regulation of unwanted thoughts and in the mechanism used to banish them. The authors use the metaphor that, for those with PTSD, trying to suppress memories is akin to ‘efforts to slam on a faulty brake’. To the clinical application: traditional PTSD interventions focus on the trauma, using re-exposure to circumvent memory avoidance and suppression, allowing traumatic memories to be safely contextualised and extinction promoted. The authors propose that these novel findings offer a paradigm shift: a new therapeutic option targeting the disrupted memory and inhibitory control systems, one that could use neutral material unconnected to the trauma.
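To make ‘functional coupling’ concrete: at its simplest, it is the correlation between the activity time series of two brain regions. Below is a minimal sketch of that idea; it is an illustrative stand-in, not the authors' pipeline (the paper used model-based connectivity analyses), and the simulated signals and the 0.4 coupling weight are entirely hypothetical.

```python
import numpy as np

def functional_coupling(roi_a: np.ndarray, roi_b: np.ndarray) -> float:
    """Pearson correlation between two region-of-interest (ROI) time series."""
    return float(np.corrcoef(roi_a, roi_b)[0, 1])

# Hypothetical simulated BOLD signals (one sample per scan volume); a real
# analysis would extract these from preprocessed fMRI data recorded while
# participants attempt to suppress intrusive memories.
rng = np.random.default_rng(0)
dlpfc = rng.standard_normal(200)
hippocampus = 0.4 * dlpfc + rng.standard_normal(200)  # partially coupled signal

print(f"DLPFC-hippocampus coupling: {functional_coupling(dlpfc, hippocampus):.2f}")
```

In the study's logic, successful suppression shows up as a reduction in this top-down coupling during suppression attempts; in the PTSD+ group that reduction was absent.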

Ticking away the moments that make up a dull day, fritter and waste the hours in an offhand way. Science has been used to assess our vulnerability to disease in many ways, but monitoring our epigenetic clocks is a relatively new approach. These genetic markers are adversely influenced by notable risk factors for ill health, including poor diet, lack of exercise, smoking and socioeconomic disadvantage. Given that these are often considerably worse in those with chronic mental health disorders such as schizophrenia (causality notwithstanding), and that such individuals typically also have a significant decrease in life expectancy, it is surprising that Horvath's DNA methylation (DNAm)-based clock, a standard biomarker of ageing trained on chronological age, has failed to show accelerated ageing in this clinical group. Perhaps this standardised technique just failed to capture low-hanging fruit; recently, a slew of additional epigenetic clocks that capture unique aspects of ageing have been established. Higgins-Chen and colleagues assessed the DNAm of 567 individuals with schizophrenia and 594 controls (aged 30–90 years) on 14 of these epigenetic clocks.[2] They replicated Horvath's (lack of) finding, but also showed alterations in other markers responsive to different aspects of ageing. Mortality clocks, which outperform chronological clocks in some predictions, showed a significant effect: a 5.8-year acceleration in those with schizophrenia, largely driven by smoking. Interestingly, a deceleration of the mitotic clock was found, which aligns with the provocative finding of a reduced cancer risk for some with schizophrenia, despite the presence of several adverse lifestyle factors. However, this was less pronounced in women, consistent with the reduced risk not applying to breast and endometrial cancers. Finally, although medication data were only available for 74 individuals in the schizophrenia dataset, clozapine use was associated with a huge 7-year deceleration of epigenetic age on one clock (intriguingly, again, only in men). There are many possible interpretations here, but it is certainly an area to watch. Epigenetic clocks offer a new way of understanding the biological basis of, and relationships between, the various vulnerability factors evident in mental health conditions.
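The ‘acceleration’ these clocks report has a simple operational definition: the residual left over after regressing clock-predicted age on chronological age. A minimal sketch, with entirely made-up numbers (the 14 clocks differ in which CpG sites and training phenotypes they use, not in this step):

```python
import numpy as np

def age_acceleration(dnam_age: np.ndarray, chrono_age: np.ndarray) -> np.ndarray:
    """Epigenetic age acceleration: residuals of DNAm-predicted age regressed
    on chronological age. Positive values mean an 'older' methylome than
    expected for that chronological age."""
    slope, intercept = np.polyfit(chrono_age, dnam_age, 1)
    return dnam_age - (slope * chrono_age + intercept)

# Hypothetical clock outputs for five participants (years)
chrono = np.array([35.0, 48.0, 52.0, 60.0, 71.0])
dnam = np.array([38.2, 47.1, 58.9, 61.5, 70.2])
print(age_acceleration(dnam, chrono).round(1))
```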

Albert Camus said: ‘Some people talk in their sleep. Lecturers talk while other people sleep’. People have an endogenous diurnal sleep–wake clock, and two distinct chronotypes are often broadly characterised: ‘larks’, who wake and sleep early, and ‘owls’, who wake and sleep later. Children and adolescents almost universally go to school early in the morning, which is why commuting is hell, yet have a predisposition (particularly older adolescents) to waking later in the day. Waking time is largely entrained to such ‘societal demands’, so owls have to wake early despite their biology; this misalignment between waking time and a person's chronotype is called social jetlag. How might this affect young people's academic performance? Goldin et al report on a randomised controlled trial in a school in Buenos Aires where 753 adolescents were randomly allocated to one of three schooling schedules: early (starting at 07.45 h), afternoon (12.40 h) or late (17.20 h) starts.[3] Two age groups were studied: 13–14 years and 17–18 years. First, the authors tested whether the three schooling schedules forced an adjustment of participants' chronotypes, with consequent loss of sleep duration. Unsurprisingly, they found that the early morning and afternoon schedules resulted in sleep durations below the recommended 8 h in later chronotypes, a loss that could not be compensated by daytime napping. The evening schedule, in contrast, was sufficiently late to allow all chronotypes to achieve the recommended daily sleep duration. Second, they investigated the effects of age, chronotype, sleep duration and school timing schedule on academic performance. Three hypotheses arise: (a) synchrony of chronotype with school time explains performance (larks do better with early starts, owls do better with later starts); (b) chronotype alone explains school performance (larks simply do better than owls – or vice versa – no matter what time school starts); and (c) both chronotype and synchrony affect performance.
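Social jetlag itself has a standard quantification (following Roenneberg's definition): the difference between mid-sleep on free days and mid-sleep on school or work days. A minimal sketch with hypothetical sleep times, not data from the study:

```python
def mid_sleep(onset_h: float, duration_h: float) -> float:
    """Mid-sleep point as a clock time in hours (0-24)."""
    return (onset_h + duration_h / 2) % 24

def social_jetlag(onset_free: float, dur_free: float,
                  onset_work: float, dur_work: float) -> float:
    """Absolute difference between mid-sleep on free days and on school/work
    days, in hours, wrapping around midnight."""
    diff = abs(mid_sleep(onset_free, dur_free) - mid_sleep(onset_work, dur_work))
    return min(diff, 24 - diff)

# A hypothetical owl on an early schedule: asleep 00:30-07:00 on school
# nights but 02:00-10:30 on free days
print(social_jetlag(onset_free=2.0, dur_free=8.5,
                    onset_work=0.5, dur_work=6.5))  # -> 2.5 h of social jetlag
```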

Overall, the synchrony hypothesis partially explained a small differential in performance, with owls seeing the most benefit (improved performance) in languages (but not maths) under the evening schedule. The authors could not fully disambiguate a chronotype-only effect, noting that this is partly because of the effect of age on chronotype (younger students have earlier chronotypes than older students), and conclude, given their study design, that the third hypothesis is best supported. The practical conclusion from their study is that, in older adolescents, early and afternoon start times are equally detrimental in terms of social jetlag and loss of sleep duration. Further, although early starts partially entrained students' chronotypes, this was not enough to counter the consequent sleep loss, even in the naturally early-chronotype students. An interesting cultural factor was identified: this Argentinian sample contained more late chronotypes than are reported in other countries, which the authors attribute to life in Buenos Aires, where dinner is typically at 21.00 h and social life continues later into the night, even for adolescents. The findings, including the description of life in Buenos Aires, appeal to this author's clearly established owl chronotype and resentment that the world is run by larks. Capturing hearts and minds: will schools around the country start to heed adolescents' renewed calls that science tells us to let them lie in in the mornings? We wish them well with that endeavour…
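For the statistically minded, the three hypotheses map naturally onto nested linear models: a chronotype main effect captures hypothesis (b), and a chronotype × schedule interaction captures the synchrony component of (a) and (c). The sketch below, on simulated data, shows the shape of such a comparison; it is not the authors' analysis, and every variable name and effect size is invented.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 300
df = pd.DataFrame({
    "chronotype": rng.normal(4.5, 1.5, n),  # mid-sleep on free days, hours
    "schedule": rng.choice(["early", "afternoon", "late"], n),
})
# Simulated scores: a small chronotype main effect plus a synchrony penalty
# for later chronotypes assigned to early classes (illustrative numbers only)
penalty = np.where(df["schedule"].eq("early"),
                   -0.4 * (df["chronotype"] - 4.5), 0.0)
df["score"] = 7 + 0.1 * df["chronotype"] + penalty + rng.normal(0, 1, n)

m_chronotype = smf.ols("score ~ chronotype", df).fit()               # hypothesis (b)
m_synchrony = smf.ols("score ~ chronotype * C(schedule)", df).fit()  # (a)/(c)
# A significant F-test for the added interaction terms supports synchrony
print(m_synchrony.compare_f_test(m_chronotype))
```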

Ketamine as a novel antidepressant takes us beyond monoamines, but how does it work, and how can we sustain its typically short-lived effects? Ketamine is an N-methyl-D-aspartate receptor antagonist, but it also influences a range of downstream changes, including increased activation of another glutamate receptor, AMPA (α-amino-3-hydroxy-5-methyl-4-isoxazole propionic acid), changes in intracellular proteins and synaptogenesis. An improved understanding of the therapeutic mechanisms might allow us to refine treatments and develop new pharmaceuticals. One emerging suggestion is an action via activation of the mammalian target of rapamycin complex 1 (mTORC1), which has multiple intracellular roles, including as a nutrient sensor and controller of protein synthesis; preclinical work has shown that its signalling is enhanced by ketamine administration. Some blue-sky thinking from Abdallah et al, who randomised just over 20 individuals with major depressive disorder to receive either rapamycin, an mTORC1 inhibitor, or placebo a couple of hours before a standard ketamine injection.[4] If ketamine works via activation of mTORC1, one would hypothesise that the rapamycin group would show far less benefit. Fascinatingly, this did not occur; in fact, the opposite was seen: across 2 weeks the rapamycin group did significantly better in terms of response and remission rates, with a quadrupling of the latter. It is not entirely clear what drove this change, but it is possible that the orally administered rapamycin dose was insufficient to block mTORC1. Rapamycin also has anti-inflammatory properties, which might be directly antidepressant and thus augment ketamine-induced synaptogenesis. A final possibility suggested by the authors is that mTORC1 regulates the autophagic mopping up of dysfunctional cellular components, a process implicated in depression, and that rapamycin may have an impact on this. The clinical roll-out of therapeutic ketamine is moving forward; if these findings are supported in the future, they might offer a way to enhance ketamine's effectiveness while simultaneously reducing the number of drug administrations.

Finally, we have inserted a managerial bullshit bingo phrase into each of the above pieces; you need to learn to spot these. We have previously written about detecting individuals who bullshit,[5] but is it particularly damaging in the workplace? First, it is important to delineate bullshitting from lying: lying is intentionally distorting the truth for one's own benefit; bullshit is where the truth just does not matter. For example, a liar might promise vast sums of money for a healthcare system, or thousands of new nurses, knowing full well that this was not available; a bullshitter might make the same claim without checking or caring whether it was possible. This indifference to truth is important: data suggest that the more bullshit that goes on, the worse an organisation performs. Further, although bullshitting is not a new phenomenon, the contemporary means and platforms for disseminating bullshit have increased exponentially, with proportionately increased potential to inflict damage. McCarthy and colleagues assist, offering the following CRAP framework:[6] Comprehend that bullshit is not lying (and may be appealing); Recognise its abstract nature, lacking logic and often riddled with acronyms and jargon; Act to disengage yourself from it and be prepared to confront it; and, finally, Prevent it by valuing critical thinking in your organisation and emphasising the need for evidence over opinion. The last point places a particular burden on senior staff – that's YOU – to lead by example and ensure their team, service or organisation roots out bullshit. The authors argue that we are more likely to bullshit when we feel obligated to provide an opinion, when an audience is less knowledgeable about a topic, and when there is no accountability for producing bullshit. Workplace democracy is fine, they note, but we all have an obligation to call out bullshit when we hear it. We all fall into the trap occasionally, but it doesn't have to be this way. In a post-truth world, confronting indifference to truth matters to us all; you have to, ahem, take the bull(shit) by the horns.

References

1. Mary A, Dayan J, Leone G, Postel C, Fraisse F, Malle C, et al. Resilience after trauma: the role of memory suppression. Science 2020; 367: eaay8477.
2. Higgins-Chen AT, Boks MP, Vinkers CH, Kahn RS, Levine ME. Schizophrenia and epigenetic aging biomarkers: increased mortality, reduced cancer risk and unique clozapine effects. Biol Psychiatry [Epub ahead of print] 8 February 2020. Available from: https://doi.org/10.1016/j.biopsych.2020.01.025.
3. Goldin AP, Sigman M, Braier G, Golombek DA, Leone MJ. Interplay of chronotype and school timing predicts school performance. Nat Hum Behav [Epub ahead of print] 10 February 2020. Available from: https://doi.org/10.1038/s41562-020-0820-2.
4. Abdallah CG, Averill LA, Gueorguieva R, Goktas S, Purohit P, Ranganathan M, et al. Modulation of the antidepressant effects of ketamine by the mTORC1 inhibitor rapamycin. Neuropsychopharmacology [Epub ahead of print] 24 February 2020. Available from: https://doi.org/10.1038/s41386-020-0644-9.
5. Tracy DK, Joyce DW, Shergill SS. Kaleidoscope. Br J Psychiatry 2016; 208: 201–2.
6. McCarthy IP, Hannah D, Pitt LF, McCarthy JM. Confronting indifference toward truth: dealing with workplace bullshit. Bus Horiz [Epub ahead of print] 7 February 2020. Available from: https://doi.org/10.1016/j.bushor.2020.01.001.