We began this book by asking a series of sensitive ethical questions about hidden histories of the dead of the sort that tended to be sidestepped or concealed from public view inside the British scientific community of the twentieth century. This new approach has not sought to detract from the many collective achievements of the medical sciences, which have been profound for us all as patients of more advanced healthcare systems in a global community. Rather, it has asked us to reflect on a historical context with many missing pieces of a complex medical humanities jigsaw. For the recycling of human research material was a subject that few people knew much about, and whose management systems tended to be taken for granted. Often, they were opaque, hidden from public view. Even those working inside the system believed that older statutes covered their monitoring of medical ethics, but they did not. Few audit processes kept pace with the development of biotechnologies after WWII. As a result, body disputes started to bring to public scrutiny discrepancies that had occurred inside NHS hospitals or involved even reputable UK research establishments. At first, these were judged exceptional, and then gradually there was a recognition in medical circles that some abuses and discrepancies were normal. This came about because one aspect of medical confidentiality involved the objectification of human research material. This created a bio-commons which had, and has, been necessary to push the boundaries of medical knowledge.
In the historical archives, a related missing human perspective is how exactly, and for what research purposes, bio-commons was disaggregated from the 1950s. The extent of the removal of personal identity from body and body part ‘donations’ likewise raised, and raises, questions of dignity in death. De-identifying human material may have fulfilled the medical obligation of discretion, but it equally left undocumented the nature of potential body disputes involving the public. Modern research cultures, similarly, seemed to do little to maintain humanisation. The degree to which we could remap actor networks and their research threshold points was thus an important historical endeavour, since that missing information could reveal the logistical costs, timings and staffing resources that shaped the material realities of medical ethics in post-war Britain. Exploring these neglected histories of the dead has shown that the ‘work of the dead’ always matters to the living in some respect, especially for those who have benefitted from an extension of the deadlines of life after 1945.1
In the course of this book, ethicists and moral philosophers, sociologists, economists, transplant surgeons, hospice staff, experts in resuscitation medicine, neuroscientists and the public have each played a part in ongoing ethical debates about the need to adopt a ‘custodial’ rather than ‘proprietorial’ view of the body today.2 This was played out against a transition from an older ethics of conviction (patriarchal medical experts, authoritarian and inward looking, prioritising their exclusive research agendas) to a new ethics of responsibility (reflecting much more fully medicine’s impact on society as a whole, economically, culturally and politically). Whilst HTA2004 tried to bridge this ideological gap, it never resolved some fundamental differences of opinion. Historically, there is often a time lag between the passing of legislation and genuine cultural change. There thus remain considerable levels of scepticism amongst some professional experts about whether or not to open up and share medical science’s inner working practices with everybody. That debate is healthy and reflects science’s curiosity-driven and enquiring nature, but it also highlights how for too long there has been a cultural gap between the working practices of science (open, enquiring, debating and disputing to disprove hypotheses) and the ways in which it has interacted with ordinary patients and their relatives (denying body disputes; controlling information flows; being evasive, furtive and paternalistic, as well as often caring but overprotective). At a time when precision medicine is just around the next historical corner, the medical sciences are facing some fundamental ethical choices because of the legacy of hidden histories of the dead. They need to embrace a world in which DNA coding will democratise how we see and interact with a newly visible self. At this research frontier, the old death sentences of the past are being delayed and we stand on the threshold of new scientific eternities that challenge our historical imaginations and patient–practitioner working relationships.
Henceforth with each new biomedical step we take, close monitoring of our medical ethics is going to be very necessary; otherwise, we could find that we arrive at new healthcare solutions but are ill informed about their human costs because we neglected the perspectives of hidden histories of the dead. Such ethical questions matter because the bedrock of our medical philosophy is public support. Often science has neglected how much this is in a constant cultural process of negotiation and re-negotiation, as we are seeing in the media with the current pandemic, Covid-19. Social media too is a force for change, but it would be a mistake to think that its public engagement reach was triggered by recent technological development alone. In many respects, scientific reticence about clandestine medical research cultures after WWII created the preconditions for more vocal and visible patient-led perspectives to start to reshape public opinion from the 1960s in Britain, and beyond its shores. The substantial data employed in this volume has allowed us to explore existing historiographical agendas, and to set new ones. Thus, a recent case in the Family Division of the High Court in London exemplifies the medical possibilities that have been created, as well as the potential body disputes that do still arise as we continue to remap the human atlas.
Remapping the ‘Human Atlas’
In 1972, the Alcor Life Extension Foundation (hereafter ALCOR) was founded in the USA as a new not-for-profit nanotechnology venture. It promised to explore the medico-scientific potential of cryonics research, hoping that a future technology known as transhumanism (patients integrated with machines) would have the facility to revive human material frozen in liquid nitrogen. In many respects, ALCOR was a logical development of the transplantation era, copying the technique of freezing ‘solid’ organs and human eggs for future use. ALCOR today stresses that ‘it is not an interment or mortuary practice’. It maintains that medical death is a more liminal state than conventional medicine currently understands.3 Thus, it seeks to preserve the brain ‘as soon as possible after legal death’ so as to ‘prevent the loss of information within the brain that encodes memory and personal identity, which is the true boundary between life and death’. The ALCOR staff stress that: ‘cryonics is an extension of critical medical care … if cryonics patients are preserved well enough … they might someday [sic] be resuscitated … then they aren’t dead: they are cryopreserved’. In 2015, the promise of this biotechnology to push the boundaries of life and death attracted the attention of a British teenage girl dying of cancer. She thought it offered her the promise of a scientific eternity – part of the legacy of hidden histories of the body – stimulating new conversations in the scientific community today.
‘JS’ (her name was anonymised in the press to protect her identity) thus applied to the Family Division of the High Court in London to be cryopreserved. In a letter to the adjudicating Judge, she wrote:
The judge was concerned that the case ‘did give rise to serious legal and ethical questions for hospitals’. However, he exceptionally agreed to her dying request. On 17 October 2016, JS went into hospital for the final time in London. Her post-mortem wishes were respected, but not without controversy. JS became simultaneously an implicit, explicit and missed body dispute.
JS believed in a medical technology that was implied and unproven. She placed her secular faith in the promise of a medical scientific eternity – that there would be a future cure for her cancer. Her resuscitated brain, she thought, would survive medical death by cryopreservation, until humans and machines could function together to maintain life. Yet media commentators and family members questioned whether this reasoning was ethical and rational or simply science fiction. What was implied may not be deliverable: a potential implicit dispute sometime in the future. Her estranged father did not agree with his teenage daughter’s decision, generating an explicit body dispute. Although JS’s parents had separated acrimoniously in 2002 when she was aged 6, and there had been no contact with her biological father since 2007, he still felt responsible for the unverified medical procedure she wanted.5 In the absence of a robust scientific study, he thought that his daughter ‘had been “brain-washed” into thinking she could cheat death’.6 Meanwhile, JS’s mother and grandparents became involved in a missed dispute. In court, they supported the teenager’s request to be frozen. The grandparents paid ALCOR a fee of ‘£37,000’. Yet, as events soon proved, the mishandling of her body attracted widespread negative publicity and revealed the close family’s misunderstanding of what was about to happen next.
Cryonics UK clashed with the medical team on site, due to concerns about a lack of dignity in death. The cryopreservation personnel appeared to be ‘under-equipped and disorganised’ after an ambulance, due to collect JS’s body, broke down and was replaced by a volunteer’s van.7 Procedures were hasty and haphazard, and had to be moved to a hospital morgue where there was a ‘rush to replace JS’s blood with anti-freeze to cool her body to –70˚C’.8 Disquiet amongst doctors and the mortuary staff resulted in a case referral to the Human Tissue Authority (who were in fact powerless to act retrospectively). JS’s mother’s focus had been to carry out her daughter’s dying wishes, but the procedures she was promised, and those the court had consented to, were questionable. According to a detailed report in the Daily Telegraph, ‘a cousin’ of the child’s mother admitted ‘there had also been misgivings on that side [maternal] of the family’ about what had taken place and family members would have objected had they known what was about to happen.9 This missed dispute resulted in JS’s body being ‘stored upside down in a vat of liquid nitrogen at –196˚C … a week after her death, her body, packed in dry ice, [was] flown to Michigan in the US’ for safe storage with ‘100 other “patients” awaiting revival’ in the Cryonic Institute in Detroit.10 Everyone involved in the case was now left to query the inside story of JS’s deadline of life and the biomedical promise of scientific eternity.
Matthew Parris, writing in The Times on 19 November 2016, queried ‘JS’s sad case’. As he put it: ‘Snap-freezing yourself into immortality is surely a medical dead-end?’11 He did not doubt that in the future a cure for many types of cancers would be found by biomedicine, but that did not excuse sidestepping the really big ethical questions facing us now – ‘When does “life” in any meaningful sense, end? When should it? How much room is there on our planet for contemporaneous human lives – and could we – should society – reach a shared understanding about the limits?’ What the JS case had shown was that life expectancy has ‘a sliding scale’ – many people can be in a situation of a living death – what if this young girl were to wake up in 200 years or so and find she is only ‘partly alive’? If her brain was not damaged, then she would in theory lead a ‘useful [sic] life’. But if semi-damaged, she could be condemned to a life-support machine. What would her quality of life be when adjusting to a concept of normal life so different that it might be beyond her powers of comprehension as a human being? As Yuval Noah Harari puts it: ‘Trans-humanism seeks to upgrade the human mind and give us access to unknown experiences and unfamiliar states of consciousness.’ As a global community we need, however, to think a lot more carefully about how ‘revamping the human mind is a complex and dangerous undertaking’.12 It is possible, he points out, that: ‘We may successfully upgrade our bodies and our brains, while losing our minds in the process.’13 For this legacy of hidden histories of the body in the modern era is not far-fetched. It has very recently led us in new mind-altering research directions that moral philosophers in the 1950s warned could happen.
In September 2019 the Royal Society issued a press release, reported by the BBC, warning of the future dangers of technologies with the facility to hack the brain. Its spokesperson explained: ‘Devices that merge machines with the human brain need to be investigated … gadgets, either implanted in the body or worn externally, that stimulate activity in either the brain or nervous system’ are groundbreaking, but they raise serious ethical issues too.14 Whilst spinal cord stimulators, cochlear implants, electrodes implanted in patients with paralysis, deep brain stimulation for those with Parkinson’s disease, artificial pancreases, wireless heart monitors and so on are promising innovations, equally there are three ‘future possibilities of neural technology’ that require more ethical monitoring:
the ability to beam a ‘neural postcard’ to someone so they could see what you see even if they are not there
people being able to converse without speaking through access to each other’s thoughts
people being able to simply download new skills15
Keeping the peace in this brave new world of biotechnology and AI robotics might necessarily involve ‘the narrow interests of governments, armies and corporations’ creating the need to downgrade humans, the better to control their transhuman revolution.16 As Dr Tim Constandinou, Director of the Next Generation Neural Interfaces (NGNI) laboratory at Imperial College London and co-chair of the recent Royal Society–sponsored report, warned:
By 2040 neural interfaces are likely to be an established option to enable people to walk after paralysis and tackle treatment-resistant depression, they may even have made treating Alzheimer’s disease a reality. While advances like seamless brain-to-computer communication seem a much more distant possibility, we should act now to ensure our ethical and regulatory safeguards are flexible enough for any future development. In this way we can guarantee these emerging technologies are implemented safely and for the benefit of humanity.17
The fine line between far thinking and being far-fetched really depends on where you sit in the cryopreservation pool of public opinion. Yet, anatomists have always known that: ‘The brain – the mind – is the manifestation of the liminal spaces into which doctors plunge’. It is ‘where personhood resides, of ourselves and our loved ones’, and we should, therefore, go gently in a Genome era.18
Dissection teaches us that in all centuries, ‘anatomy takes a nasty turn once we go above the neck – not only does the information increase in detail like crazy (the skull is amazing in its intricacy – seemingly endless numbers of holes, indentations, seams, processes)’ but ‘the force necessary’ to know more can ‘feel barbarous’ too – something that we also saw in Chapters 3 and 6. As one medical student conceded recently – ‘I am still fascinated by what is revealed’ in brain dissections ‘but hate the push and tug necessary for revelation’.19 It is a sentiment at the heart of the JS case and one awaiting us around the next historical corner. Professor George Santayana thus observed that we would remain ‘infantile’ in our medical ethics if we did not resolve the paternalism of the past together, forever ‘condemned to repeat mistakes’.20 If, then, JS embodies the key research themes of this book (implicit, explicit and missed disputes), and highlights how these research thresholds are not necessarily sequential but can in fact happen in combination too, what challenges await us and how might we confront them?
Remapping and Remodelling the Dead-End of Life
There are, first, numerous hidden histories of the dead-end of life. Broadly speaking, the New Poor Law helped to establish medical education’s research base in the late-Victorian period. This publicity-shy anatomical teaching and research culture carried forward into the 1930s economic depression. After wartime, it emerged under the new NHS as public healthcare was reorganised in 1948. Thereafter, it flourished in the fast-moving climate of medical enterprise during the 1950s. There was consequently a privileging of certain research cultures to cement professional status and secure grant funding from successive central governments of all political persuasions. This created the context for a burgeoning bureaucracy that was confusing and convoluted, and authorised the ambitious in their chosen career paths to ‘go around the law while going through the legal processes’. This, as the ethnographer Marie Andree Jacob puts it, ‘is how legality is experienced’ in modern medical research cultures.21 This material fact in hidden histories of the body also came about because of medical science’s insistence on the ‘global’ over the ‘local’. As Jacob explains, bio-commons acted as a buffer to proper public accountability. The task of the historian is to ‘privilege the microscope over the telescope’: to trace actor networks and their research threshold points in body and body part disputes, which were the central focus of Part II.22 Chapters 4–6 have thus demonstrated the historical research reach of quantitative and qualitative research methods. Figure 7.1 is therefore a template applicable to all such studies in the future, whether on a national or international basis. It illustrates the multilayered material pathways that facilitated research networks and those threshold points that the dead passed through as bodies were broken up in a complex but secretive chain of supply mechanisms. This raises important ethical questions because it provides an opportunity to engage with a more personalised history of the body at the end of life and to consider how historical longevity might still be shaping our world today. We have glimpsed part of this medical mosaic of progress, but we need to know much more since it is the foundational story of biomedical research now and in the near future.
Until we can remap the human atlas and its research components in their entirety, we will never know just how much modern medical research became a breakers-yard business of the body. Nor will it be feasible in biomedicine to identify why some medical breakthroughs within actor networks were deemed profitable for public healthcare, and others not. What we still do not know is what healthcare interventions were missed or mislaid because the system of accountability was so secretive (due either to carelessness or because those involved were caring but over-cautious). We also cannot tell what funding decisions were taken for political reasons. Nobody can therefore say for certain whether crucial medical information may still be awaiting our rediscovery. How sad it would be if human beings had suffered more in the meantime from painful conditions. As Donna Dickenson highlights, it remains all too common that a patient consents to bequest their human tissue, but then discovers it is recycled for someone’s career or commercial gain. Too often, it is still very difficult to define the sharing of knowledge or profits generated.23 In other words, as Dickenson emphasises: ‘Researchers, biotechnology companies and funding bodies certainly don’t think the gift relationship is irrelevant: they do their very best to promote donors’ belief in it, although it is a one-way gift-relationship.’24 That is the real ethical danger of consignments at the dead-end of life.
To name the dead still matters and remains an important endeavour of medical ethics. It is necessary to be dispassionate about medical research, but equally the medical humanities need the balancing mechanism of human stories that test whether progress has ethical probity. Although these are concepts under constant negotiation in popular culture, they require everyone in society to stay engaged. In the immediate post-war world, the opposite happened. Science’s self-defence position was that to name bio-commons was an impossible task. Certainly, it was logistically difficult and complex, but not unachievable. It would be more historically accurate to say that the balance of the evidence in this book points to the medical sciences seldom trying to humanise their research methods. As a research community, those that staffed systems had little idea of whether it was an insurmountable task or not, since so few checked its feasibility in the first place. The evidence in Parts I and II suggests, strongly, that it is a basic human impulse that material afterlives merit an acknowledgement of some description, one embodying the ethics of the ‘gift relationship’.25 Tracking human material has considerable merit today. For the future, giving it a named post-mortem passport and making it part of a transcript of forthcoming transplant treatments would more transparently connect the ‘gift’ to new healing cultures. Too few medical research studies did this in the post-war biomedical community. Strictly speaking, it is legally correct that they were not expected to do so at that time.26 Yet, that status quo does not excuse those who made conscious choices to be evasive and not engage with the changing world around them at the time. For the NHS is state financed, but much medical research remains embedded in private-sector funding contexts with investment targets to meet. Given that context, it was, and is, reasonable of the general public to suppose that just as medicine has embraced new scientific breakthroughs on the basis that these would benefit humankind, equally it should have devoted as much energy, money and time to being forward thinking in its research ethics too.
Medical Elisions
Contemporary critics of the medical sciences argue persuasively that what we are currently living through is a data explosion of personal information. It is becoming available in multiple online formats and requires responsive medical ethics as well as constant vigilance.27 This book’s second major finding, however, presents a much more multilayered historical picture than this. Medical science has been all about positioning itself centre stage as a profession in Western society. It tells us that we must ‘follow the science’ and trust in its data collection methods that help us all to make better healthcare decisions. But there is often little public discussion about the sheer amount of data collection this requires, how confusing and complex its results can be, and the ways in which scientists often disagree with each other concerning their findings and modelling of disease patterns: cultural trends we are witnessing during the Covid-19 pandemic. An added complication has been that the data the general public thought was the basis of our collective decision-making in medical ethics has been insubstantial – akin to standing on scientific quicksand. Chapters 4–6 have shown that bureaucracy was used to hide what was really going on with personal data and patient case records. Typically, this happened by filling in a general form to pass a body from hospital ward to mortuary attendant – then to coroner and their pathologist – before putting a brake on that bureaucracy to elongate the time spent with the corpse for teaching and research purposes. In this way, official death certification often did not happen for up to two months after medical death. In the meantime, in law the dead did not exist. They were technically ‘abandoned’ and therefore their bereaved relatives could not have traced them, even if they had known what was really happening inside the system of so-called bequests. The symbolic cases of the dead war-hero in Chapter 1, a deceased young child in Chapter 4 and the sad demise of Mr Isaacs in Chapter 6 show this very well. Medical science has therefore been all about creating what this book has identified as the extra time of the dead, and this has been the basis of what some critics are now calling the ‘Data Religion’ of our biomedical world.28
The new evidence presented establishes that medical science is a major time player in Western society. Often its significant medical breakthroughs have been presented in the media as edited highlights – as the ‘chosen moment’ of a success story. This use of elision may have had a narrative efficiency in science and far-reaching medical benefits, but it also relied on there being many hidden histories of the dead in medical research and considerable public ignorance. A lot of medical information that was being collected was partially disclosed, often destroyed and certainly de-commissioned (involving many sorts of valuable research archives), without thinking through future timescales or potential ethical lessons. It is one of the reasons that patient groups and their online medical communities exist in such proliferation today; their storytellers have been sceptical about the use of medical elision in the recent past. Balancing such views, scientists are complex actors in their own right; they are shaped by cultural, political, economic and administrative circumstances. Yet, as this book has shown, they do not stand outside narratives of popular culture.
On a case-by-case basis – and there were some 10,000 cases reconstructed for this book – what one often engages with is the material fact that: ‘Dataism adopts a strictly functional approach to humanity, appraising the value of human experience according to their functioning in data-processing mechanisms.’29 As commentators like Harari point out: ‘If we develop an algorithm that fulfils the same function better, human experiences will lose their value.’ This is not as far-fetched as it might seem. In Chapter 2, we encountered the current concerns of the Royal Society of Medicine (hereafter RSM) membership, which reflected in 2016 on ‘the good, the bad and the ugly’ of HTA2004.30 They have foreseen a major ethical issue around dead human bodies disaggregated into bio-property. International patent law currently protects the medical sciences against litigation in the civil court for claims of a share of profits generated from re-engineering in biotechnology (altruistic, financial, patent or otherwise). Even so, as keynote speaker Hugh Whittall, Director of the Nuffield Council on Bioethics, told the RSM: ‘The long-term challenge is the issue of tissue banking.’ What happens to ‘the huge amount of data … once you put it through any kind of biochemical or genetic analysis’?31 HTA2004 is not set up to monitor this, and once an algorithm has turned it into a data pattern there is no statute that can protect what happens next on the super-connective internet highway. Thus, critics like Harari highlight how: ‘Dataism undermines our main source of authority and meaning, and heralds a tremendous religious revolution, the like of which has not been seen since the eighteenth century.’32
The findings of this book suggest that ‘dataism’ certainly has the potential to ‘sideline humans’. Taking a longer trajectory, we can see that from 1752 to 1832 (phase 1) the body was studied as a reflection of the divine. From the 1790s that ‘old anatomy’ (the study of creation) gave way gradually to ‘new anatomy’, the study of the science of the body. From 1832 to 1929 (phase 2) Christian beliefs continued to dominate in dissection spaces – what historians call deo-centric, namely, God-centred belief systems. Nonetheless, from the 1930s to 1945 (phase 3), with the shift to more secular values in society, a homo-centric emphasis gained cultural ascendancy. Between 1945 and 2000 (phase 4) popular culture embraced the moral value of medical ethics and distanced itself from the religious tenets of the past. Finally, there was another noteworthy shift around the time of HTA2004, when the deo-centric and homo-centric tipped in favour of a data-centric world. We did not, however, necessarily take forward the historical lessons from body phases 1–4, because until now they have been undocumented. This complex phased process is illustrated in Figure 7.2.33
Looking to the near future, DNA coders and systems biologists disagree fundamentally over what all the current genetic data generated really tells us about the basis of human existence. We are mapping the proteins of life at a ‘selfish-gene’ level, but individuals do not function in bits and pieces. As Denis Noble, the eminent systems biologist, explains, ‘the logic of life’ is to integrate, collaborate and work together to co-create what we often call the quality of life. That individual genes can undermine this living process is not disputed, but there is a holistic aspect to ‘the systems-level [organs, for instance] interactions of proteins’. In other words:
We have become transfixed by the great success in explaining sequences in terms of encoded DNA sequences. This is a great achievement, one of the most important successes of twenty-first-century biology. But we sometimes seem to have forgotten that the original question in genetics was not what makes a protein, but rather what ‘makes a dog a dog, a man [woman], a man [woman]’. It is the phenotype that stands in need of explanation. It is not just a soup of proteins [sic].34
Many hidden histories of the dead came about because anatomists, coroners and pathologists lost sight of their data. It was dispersed along all sorts of complex research pathways and the more it was distributed, the more difficult it became to keep track of the bigger picture of science. Historically, the evidence base confirms it became a breakers-yard business. The relatively recent recognition that brain death is very complex is a timely warning that ‘the self’ has been broken down too much in the past: after all, neuroscience has learned that only a whole person can function cognitively as a human being with a reasonable quality of life. Ironically, in embracing genomics, there is the very real possibility that we will repeat the same medical error of breaking the holistic circle that has peopled so many research threshold points in the post-war era. It is difficult not to arrive at the conclusion that we might need to unlearn what we think we knew because we only ever collected such a partial view of this past medical history. In the future, this acknowledgement could involve standing back and asking what sort of medical mosaic we let the medical sciences create for us in the first place.35
When Is Medical Death?
Humanity is in a mess – it has always been in a material mess and, thankfully, it will always be so until our last breath. Because we are such a muddle inside, we stay alive – being cleaned up, constantly excreting things surplus to requirements, forever shedding and spilling, dripping and squeezing, shaving and purging ourselves. Learning about this in a history of anatomy has revealed that defining medical death has been a significant medical conundrum. Today it is still common to read in the historical literature that brain death was redefined by medical science in 1968 at Harvard University.36 Yet, in Britain, anatomists redefined medical death under the Murder Act of 1752. They discovered the possibility of reviving bodies in the winter cold after deliveries of the ‘dangerous dead’ to the dissection theatre from the public gallows. In a previous book by this author, it was established that brain death is a scientific realisation that can be dated to 1812, and that it applied to the one quarter of those executed for homicide who survived the hangman’s rope in England from 1752 to 1834.37 This book has built upon the foundation stone of that finding by identifying just how much this central dilemma in modern medicine was often hidden from public view. Because so much was secretive in the past, it remains difficult to take the long view of brain death. Essentially, what happened is that by 1945 a ‘mind the gap’ problem had developed in medical ethics. It can be located in the working definitions of peri-mortem (at or close to the point of death) and post-mortem (being in death). The assumption was that a liminal space resulted from the advancement of new technologies, with the ability to monitor even the faintest traces of life, and indeed better machinery did play a significant part in this historical process. Yet, it would be a mistake to presume that it was a specifically twentieth-century phenomenon.
It is the remarkable recent research of Professor Sam Parnia into near-death experiences that has questioned the tradition of calling medical death at a twenty-minute mark in emergency rooms in the USA.38 And it is a perspective today that surgeons who once received bodies from the gallows in Georgian England would have recognised. They had to make pro-life choices or break the Hippocratic Oath and commit human vivisection. In turn, because we did not share death’s dilemmas transparently as a Western society, we have neglected to engage with, and improve, how we die. Medicine often employs euphemism to avoid speaking about the subject of death because doctors are powerless to stop it at some time in all our lives – ‘They have passed on’ – ‘S/he is no longer with us’ – ‘Your relative has gone before their time’ – gone where – passed on, to whom – whose timing do you mean? Because medicine handled the dying so clumsily, what happened to the dead followed suit. Intensive Care Units (ICUs) became in many respects the locations where the medical ethics of the living and the dead were developed, but not necessarily with a transparent discussion about quality of life debates or the material afterlives being created, as we have seen throughout Parts I and II. Moreover, researching this sensitive area reveals that ICU cultures do differ between countries: a factor that the Covid-19 pandemic throws daily into sharp relief. This matters because we need to better appreciate whether what happened to the dead post-mortem was shaped by what was happening to the dying at the end of life when peri-mortem.
Public Policy and Engagement
In rediscovering archival material, we have encountered why neglected material realities in the recent past matter today and can stimulate new debates to better inform future public policy directions and their potential public engagement. We have seen that the data generated by anatomy departments in England from 1954 to 2000 shows that women have been the main source of body bequests since WWII. Neglecting this material fact has public policy ramifications when it comes to improving organ donation rates in Britain.39 Public health campaigns to increase organ donation have tended to target young people since the 1980s. The assumption has been that teenagers and those in their 20s are more forward looking than their parents and grandparents. Indeed, the NHS2020 strategy to increase organ donation rates is designed to get the whole family involved so that medical science can ‘increase family consent rates to 80% by 2020’.40 In the midst of the pandemic, however, the NHS2020 impact on these figures has yet to be calculated. Nonetheless, to get to an 80 per cent level, the NHS2020 team concedes that there will have to be a major cultural change in British society. Young people, it is predicted, will need to talk in advance to their families and set out their dying wishes verbally and in print. The problem with this public engagement approach is twofold: first, few young people think to talk about death; and second, they seldom discuss something so unpalatable, let alone write down their dying wishes. Most people need to be prompted to do so by someone close to them whom they respect and love enough to converse with. Historically, mothers and grandmothers tend to be the chief source of communication in families. That being the case, why is the NHS2020 strategy not working more closely with women – the very people who are good at getting loved ones talking and who have been so prepared to make body bequests in the recent past? When we neglect hidden histories, it can have very real consequences for a patient on a long waiting list desperate for an organ transplant. It is time to work with the demographic realities of the female principle of gifting presented in this book.
The generic issue of compulsory organ donation likewise raises another important public engagement point that we have explored throughout Parts I and II. This book has argued that medical paternalism has been defined for too long by ‘proprietorial’ rather than ‘custodial’ property rights over the dead body.41 And in the most recent debates in 2017 around the need to introduce an opt-out organ donation system in England, we see this out-of-date language emerge again. The law change means all adults are considered to have agreed to be an organ donor when they die unless they have recorded a decision not to donate or are in an excluded group. Yet, however needed the scheme, what this new policy reflects is how medicine tends to revert to its traditional default position when confronted with an undersupply of human material; it simply presses government to reintroduce past practices. This brings us to an important question – how does medicine advance, resolve long waiting lists for transplantation and balance the rights of all patients to respect their cultural and religious viewpoints? One of the implications of the work done for this book is the urgent need for a National Ethics Trust in Britain.42 Patients self-evidently want more say in medical treatment. They also need ethical safety-nets, especially as they approach the most difficult end-of-life decisions. The solution is a National Ethics Trust – a medical safety-NET for the near future. It has to be an organisation that patients can trust with their research profiles – to help others to resolve pandemics on our behalf, but also to help them make the most difficult decisions.43 If a NET were established, patients could decide to donate their health profiles to it in life. To secure public trust, it needs to be set up independent of government, politicians and the medical lobby. Just as a patient can donate their body in death, then, why do we not as a society provide a mechanism for everyone to bequest their health profile whilst living?
Medical researchers could apply to a NET for access to NHS profiles, provided they in return use post-mortem passports and advertise future treatments detailing how many NET donors helped to make a medical breakthrough. There would be a list of those living bequests made public on an annual basis. Imagine being treated on the NHS and reading down the details of those who helped you to heal. That would be a very powerful ‘custodial’ expression of medical ethics for everybody. If a NET had existed in the recent high-profile cases of the young children Charlie Gard and Alfie Evans, in both of which there were legal challenges to the withdrawal of life support, then there would have been no need for the parents involved to crowdfund on the Internet due to a lack of legal aid, seek a public debate on social media or clash with doctors in court over medical evidence.44 An independent NET with the powers to call on relevant expertise could have been their impartial advocate. The NET could have consulted with medical ethicists, doctors and lawyers, as appropriate, to help each family make difficult end-of-life choices for their dying children. Although Charlie Gard’s parents have succeeded in getting a private member’s bill called Charlie’s Law passed in Parliament to set up a better medical advocacy scheme outside the law courts, there is no reason why this type of advocacy role could not be extended to everybody via a NET initiative.45 It would demonstrate that medical science has shifted culturally from an ethics of conviction to an ethics of responsibility – of international importance for everyone.
Pollution
Stop what you are reading – take a deep breath – pause for a moment – and think about just how lucky you are that your lungs filled up with air that was of good quality. Often life expectancy is about biological luck, but it is also about pollution levels where you live and work. The data generated over this author’s career on anatomy supply-lines now forms a research base that stretches from 1752 to 2000. One remarkable new finding is that no matter where those dissected and sent for medical research lived and died in Britain in the past 265 years, the majority died from lung complaints (broadly defined). Such complaints (historically and in the present) have multiple causations. Thus, from 1752 to 1930, they were associated in the records used for this book (and two others that have preceded it) with substandard housing conditions, coal smog in cities and polluted river systems, the latter cleaned up by public health schemes laid down by the Victorian Information State. After WWII, central government nonetheless recognised that pollution in various forms was a growing cause of lung diseases, and thus an urgent healthcare priority was to pass clean air legislation, notably the Clean Air Act of 1956 that followed the Great Smog of London in 1952. Yet, instead of pollution diminishing as a major cause of death, one urban healthcare problem replaced another. Coal fires gave way to car smog. Consequently, asthma levels remained high and blighted major cities in the UK. They still do. At the same time, the cover-up story of pathology meant that coroners’ death certificates, which should have been a treasure trove of epidemiological information, were marred by illegible handwriting. Most were filed and forgotten. There were also ongoing discrepancies in the design of the official death certification scheme in England and Wales. It has continually prioritised the proximate cause of death and understated underlying co-morbidity complications, as Chapters 4 and 6 set out in their proper historical context. This has been an enormous wasted opportunity for public health at the dead-end of life.
In January 2017 the leading journal Science reported that a prominent feature of modern biomedicine is ‘The Polluted Brain’.46 Globally there is a strong case to be made that car pollution may be one of the biggest factors in the growth of Alzheimer’s disease. As its lead article writer explained: ‘Some of the health risks of inhaling fine and ultrafine particles are well-established, such as asthma, lung cancer, and, most recently, heart disease. But a growing body of evidence suggests that exposure can also harm the brain, accelerating cognitive aging, and may even increase risk of Alzheimer’s disease and other forms of dementia.’ Although this is a young field of biomedical research, there appear to be worrying epidemiological trends associated with greater car pollution levels in community medicine globally.47
One persistent problem often highlighted is a lack of historical, comparable, reliable data generated in the UK and Europe. The Guardian newspaper thus led with a startling headline in January 2017 that ‘1 in 10’ of the ‘6.6 million’ participants tracked who lived near heavy traffic congestion appeared to be at a higher risk of dementia.48 More robust research was called for on ‘the impact of air pollution on public health’. In many respects, however, as this book has shown, the solution to this knowledge gap has been an obvious one. Multi-user data-sets from contemporary anatomy records assembled specifically for the chapters in Part II fill in the demographic picture. Those who appear in dissection records prior to 1954 died below the poverty threshold, somewhere on the scale from relative to absolute poverty. This means that we can say with confidence that they contain important medical information of lives lived amidst the worst extremes of pollution. Combining those with modern record sets after 1955, around the time the Clean Air Act came into force, then balances that historical picture by showing that even when fresh water supplies, better nutritional standards and free medical care under the NHS started to redress perennial Victorian social problems, pollution levels never abated. The automobile may yet prove conclusively to be the biggest cause of degenerative brain diseases, provided we stop losing sight of the importance of hidden histories of the dead and their long-term health profiles. Sometimes, in filling our lungs with air, it is the thing we cannot see that can make the biggest contribution to humanity.
A Historical Lesson for the Near Future
All books have their critics and this one will be no exception, but in concluding there is one final point to be made that unifies the human condition. If there is a central, undeviating narrative thread that runs throughout this research, it is that in not a single case study examined, covering over half a million archive entries, has this author ever discovered an anatomist who did not respect the human capacity for dignity and love at the dead-end of life. Even what little was left after dissection was buried or cremated, eventually, with moral respect and full religious rites in Britain. There were no shortcuts, though there was ample opportunity to take them in a history of the marginalised and forgotten. It is a remarkable historical finding – perhaps the most notable of all in hidden histories of the dead, which we neglect at our peril. It is also a keen reminder of the old Tuscan saying that once inspired Leonardo da Vinci to quest for a knowledge that was more complete. Like all anatomists, he searched for the secret of the creation of life in the womb, dissecting at night to discover the beauty and wonder of our capacity for anatomical awe and embodied revelation. Even so, da Vinci never lost sight of the homespun wisdom of his Italian birthplace, where the old women who were midwives whispered to each new mother, ‘There is not love, only proof of love’.49 It is a moral philosophy which in so many respects has gone on shaping human experience everywhere for everybody. It also happens to be the basis of all religious beliefs, every secular creed, as well as the entire history of medical ethics that stretches from ancient Greece to the present day.
For everyone wants to be loved completely in their lifetime. It might be the hand of a doctor that stretches out when we are ill, the smile of a nurse that comforts us in pain or the person close to us who hugs us to the end. The essence of oral histories and their histories of emotion – whether penned before or during a pandemic that has alerted us all to the power of medical events to take over our lives – continues to express words worth repeating at this finish-line, first penned by Hippocrates: ‘Wherever the Art of Medicine is loved, there is also a love for Humanity’.50 In the medical humanities, empathy is seldom ‘proof of love’ until compassion is exchanged between two people. For this reason this book did not dissect the stories of those dissected, offering the reader just a short summary; instead, it reassembled them with their emotional subtexts and material contributions, because both perspectives are together intrinsic to the sort of narrative medicine that features in improved medical education today. If we meanwhile commercialise the Human Genome, then patients’ voices, motivated by this most basic and most important of human impulses, will step in and take back control from medicine. We underestimate, to our collective cost, the capacity for compassion and healing that one human being can feel for another. Precision medicine promises much, but it cannot co-create in cultural isolation – this is our historical lesson for the near future too. The beauty of medicine at the bedside is a two-way conversation, and one valued by all of humanity. And because on this there is universal agreement in a global community, we therefore approach the inside stories of our scientific eternity on this historical horizon with perhaps the greatest challenge of all in biomedicine. Namely, never to put aside or dismiss offhand the undeviating central narrative of medicine that ‘what will survive of us is love’, because in a history of anatomy it has always done so and there is thankfully every expectation that it will go on doing so.51