
Chapter 1 - Working with IT Systems

The Benefits and the Challenges

Published online by Cambridge University Press:  23 November 2023

Rob Waller (NHS Lothian)
Omer S. Moghraby (South London & Maudsley NHS Foundation Trust)
Mark Lovell (Esk and Wear Valleys NHS Foundation Trust)

Summary

Self-awareness is a central characteristic of all complex systems, biological and organisational alike. The successful introduction of novel and complex IT systems to an organisation such as the NHS both enables, and requires, this self-awareness. IT systems promise great improvements in terms of process control, audit and governance, as well as in the direct delivery of clinical care. They require, however, that the organisation appreciates that this kind of change is ‘adaptive’ as well as technical. Successful adoption requires strong, informed leadership, an honest appraisal of the degree of digital maturity, awareness of weaknesses and realism about timescales and scope. In this chapter, we will briefly survey the development of health informatics in the UK, examine some conspicuous examples of failed IT introductions and attempt to extract some lessons of use to clinicians asked to take part in such introductions. We will also survey some of the main current issues in mental health informatics from a UK perspective and make some tentative predictions for future developments.

Type: Chapter
In: Digital Mental Health: From Theory to Practice, pp. 6–21
Publisher: Cambridge University Press
Print publication year: 2023

Knowing yourself is the beginning of all wisdom.

Attributed to Aristotle

Introduction

Information flows have been fundamental to life since bacterial chemotaxis allowed our distant ancestors to approach nutrients and flee toxins. More complex organisms have developed ways of sensing the internal as well as external environment. Linked to responsive systems, self-knowledge of this sort is the foundation of the complex homeostasis seen in mammalian bodies.

Organisations and human enterprises are also complex systems at a different level of scale and share fundamental similarities such as a need to sense their environment (both internal and external), process the resulting information and translate it into advantageous actions.

Information technology (IT) should in theory allow this cycle to proceed at a more rapid pace and with less waste. Organisations that adopt IT earlier and more completely should therefore reap a corresponding improvement in efficiency. That this has not always been the case is a commonplace observation that has become known as the Solow paradox, after Robert Solow’s quip that ‘you can see the computer age everywhere but in the productivity statistics’.[1]

In this chapter, we will briefly survey the development of health informatics in the UK, examining some of the more notable examples of the Solow paradox, and try to generalise lessons for the jobbing clinician. We will review current areas of digital deployment in the NHS and look at future prospects, informed by the transformation seen in the recent Covid-19 pandemic.

History and Current State of Health Informatics

Pre-computer

It is said that the founder of medicine, Hippocrates of Kos, insisted on the structured recording of case histories for the benefit of future physicians.[2, 3] The sixteenth-century London Bills of Mortality laid down a list of 81 possible causes of death,[4] anticipating the use of controlled vocabularies, so-called ontologies, or systems of classification, which remain central to health informatics to this day. John Snow (1813–58), anaesthetist and public health physician, famously gathered data and produced what we would now call geospatial data visualisation to identify the source of a cholera outbreak (the notorious Broad Street pump).[5] Florence Nightingale (1820–1910) too was an accomplished statistician[6] and made pioneering use of data visualisation.[7]

The Electronic Health Record

With the introduction of computers in the 1960s, there was interest in creating systems to help with both clinical and administrative/clerical tasks. It soon became apparent that the ‘back office’ systems were an easier target for initial efforts. By the 1970s, IT systems were well established in this role in the United States, as well as in the UK in the form of patient administration systems (PAS) that counted and tracked patients but did little more. Lab and radiology systems were also early and relatively unproblematic adoptions in both the United States and the UK – the ‘C’ of CT scans stands for ‘computerised’ and it quickly became clear that it was easier to make the whole digital file available than to decide which slice of a 3D image to print out on film. The need for accurate billing and coding in the United States drove demand for IT systems to support US-specific initiatives such as the move to health maintenance organisations (HMOs) and managed care in the 1980s. This was mirrored in the UK by the introduction of Resource Management Initiative (RMI)/Casemix systems to support the then-current political objectives of tracking and costing healthcare inputs and outputs.

Initial interest in creating artificial intelligence (AI) diagnostic systems (expert systems such as MYCIN and INTERNIST-I) did not seem to result in their widespread adoption outside academia.[8] More modest initiatives, such as Medline in 1965 and, more recently, so-called decision support tools, aimed to support, not supplant, expert decision making with access to current evidence.

From small beginnings as early as 1975, UK primary care transformed from a paper-based to a computerised system, meaning that there has been a comprehensive Electronic Health Record (EHR) at the point of care for some years now.* There are reliable mechanisms for transfer of records within primary care between practices when a patient moves and also in summary form to secondary care on referral or admission (the Summary Care Record). This is a little-known success story of UK health informatics, and the contrast with the more varied history of secondary care IT deserves some examination.[9]

General practitioners (GPs) are, in effect, small business owners and they have a personal investment in their workflows and processes that may be lacking in secondary care. The GP workforce contained many enthusiastic early adopters of EHRs who, in some cases, wrote the software they used. This intimate involvement of domain specialists was lacking in secondary care and has highlighted a need for a hybrid clinician–informatician role (described in more detail in Chapter 6). There were also several supportive policy and incentive frameworks that helped drive primary care to adopt EHRs,[10] including paying GPs a premium if they were able to deliver screening interventions to certain proportions of their caseload.

In the United States, concern over iatrogenic harm was an important driver of IT adoption under the reasonable assumption that IT systems could be used to reduce human error. However, the potential for IT systems to compromise patient privacy and confidentiality was identified early on and helped drive the confidentiality and legibility requirements of the 1996 US Health Insurance Portability and Accountability Act (HIPAA).[3] IT systems were (and are) often seen as a way of squaring the circle of maintaining or improving quality in an era of increasing demand and reduced funding. It is therefore perhaps no surprise that the Health Information Technology for Economic and Clinical Health (HITECH) Act of 2009 was part of the US response to the global financial crisis of 2008: it mandated,[3] and more importantly funded (to the value of $30 billion),[11] the widespread adoption of EHRs in the United States. The return on this investment was an increase in EHR use from 10% to 75% across a wide range of healthcare providers, from small clinics to large hospitals.

The lag between primary and secondary care in the adoption of IT (as early as 1991, more than 60% of UK GP practices had adopted EHRs[12]) continued to concern governments in the UK, and a range of initiatives and strategies were tried. The New Labour administration in the late 1990s saw transformative potential in the creation of a single unified national EHR, leading to the National Programme for IT (NPfIT) in 2002,[13] a major initiative examined in the National Programme for IT section below.

The devolved administrations in Scotland, Wales and Northern Ireland chose to remain separate from this programme. In Scotland, a more gradualist approach was taken, tolerating a mix of systems whilst aiming to integrate them over time using portal systems and gateways.

In the European Union (EU), the Commission’s Communication on the ‘digital transformation of health and care’ of April 2018 laid out the EU strategy on health informatics.[14] It consists of three pillars: secure access to and sharing of health data across the EU; the sharing of information to improve individual and collective healthcare through research; and the strengthening of citizen empowerment through digital services.

Telepsychiatry (Also Known as Video Consultations)

Telepsychiatry is the use of electronic communication and information technologies to provide or support mental healthcare at a distance.[15] As a relatively ‘hands-off’ speciality, psychiatry is a natural candidate for remote delivery, and the first experiments along these lines started in the 1950s, when the Nebraska Psychiatric Institute delivered interventions over an early (analogue) videoconferencing link.[16] An important point is that the interventions included not just doctor–patient consultations but also professional–professional meetings and teaching sessions, which shows the value of thinking laterally about what such technologies can deliver.

In the 1970s, a regular psychiatry clinic was delivered to the Logan Airport health clinic by remote means.[16] Australia, too, was an early adopter of telepsychiatry, perhaps because of that continent’s vast distances.[17] Peter Yellowlees and others did important work demonstrating the value of telepsychiatry, including with underserved groups such as Indigenous Australians, who might not have seemed natural candidates for it.[18] He is the author of Chapter 7 in this book, where he explores telepsychiatry in much more detail.

The common perception remained, however, that telepsychiatry was ‘second best’ to face-to-face consultation and best suited to particular environments such as Australia or the polar regions. The Covid-19 pandemic has provided an enormous impetus to telepsychiatry with a very rapid pivot to remote consultations. The results of this natural experiment are awaited with interest. In the interim, it behoves practitioners to familiarise themselves with formal good practice guidelines for their speciality and their jurisdiction. The Irish coalition group Mental Health Reform have helpfully produced a document summarising European, Irish, UK and US guidance.[19]

Box 1.1 details the chapter author’s personal experience with setting up a telepsychiatry service in a regional mental health unit.

Box 1.1 Personal experience with telepsychiatry

Telepsychiatry can be provided with good-quality video and audio and can cover almost all aspects of a mental health consultation, especially if there is support for the service user on hand.

In 2008–9, we were able to obtain an obsolete videoconferencing unit that was being discarded by the management suite. We later took advantage of a Scottish government initiative to have a second unit bought for us.

We initially envisaged ‘telepsychiatry’ as involving only doctor–patient contact. In 2017, however, the long-term sick leave of a colleague required us to find ways of getting more out of our reduced medical workforce. We therefore introduced a system of ‘virtual clinics’ in which consultants provided support and advice to prison-based senior nurses (most with prescriber training). Despite its modest scope, the initiative was very popular and greatly reduced both referrals to psychiatry from the prisons and the impact of our colleague’s absence. We ended the period of sick leave with a shorter waiting list than when we started!

The recent pandemic situation led to an abrupt adoption of doctor–patient telepsychiatry by many psychiatrists. The chapter author’s main experience of this is in his current role as a military psychiatrist in the Republic of Ireland, and it has been generally positive. Military facilities are often very widely scattered and travel between them was more problematic under pandemic restrictions. The experience of telepsychiatry consultations in the pandemic needed to be compared not with ‘normal’ face-to-face interactions but with masked and socially distanced ones, and in this comparison they came out as superior. The main concern was when the person being examined was significantly distressed, but this was addressed by making sure they had support with them at their end.

In terms of practice points, the stability of the connection is very important. Sound quality is critical, perhaps more so than video quality (beyond a certain minimum). It is important to consider the IT skills of the population you are serving, as well as their access to IT devices, high-speed broadband or even a private area in their home from which to connect. This issue came up in our work in the prisons, with remote units being vandalised or tampered with. Yellowlees has even suggested the use of the patient’s car as a private, secure space,[20] particularly when parked close enough to the home to get a Wi-Fi signal, and this has indeed proved useful on occasion.

One model that has proved successful is a hybrid one in which a remote consultation supports a local practitioner, such as a specialist nurse or a primary care practitioner, who is physically present with the patient. This may represent a very workable middle ground for many mental health consultations following the lifting of pandemic restrictions.

Confidentiality and consent are particularly important in remote consultations and adherence to national/local policies and best practice guidelines as well as scrupulous documentation are highly recommended.

E-Learning

Just as e-commerce was prefigured by catalogue shopping, it could be argued that e-learning was foreshadowed by correspondence courses[21] or the Open University,[22] which was founded in 1969. With the widespread availability of networked home computers, e-learning moved onto the Internet or internal corporate intranets.

As with telepsychiatry, it is important to look at e-learning in a lateral way, as encompassing a range of approaches from standalone ‘packages’ that need to be worked through to complex online environments that shape and facilitate educational interactions.

There is currently no learning that is not ‘e’ to some degree. Most face-to-face teaching involves scheduling and the distribution of materials by email, and is often supported by an online resource. The open Internet also contains a wide range of educational material, often curated and of high quality, such as Coursera or the Open University.[23] However, there are excellent reasons to create a ‘walled garden’ online for students, particularly for highly regulated careers such as medicine, so virtual learning environments (VLEs) are required. Blackboard[24] and Moodle[25] are examples of this kind of software. Some of these offerings are open source, some commercial. Open source offers reassurance around interoperability and vendor lock-in (being tied to an expensive system purchased some time ago), an issue that has parallels in healthcare IT.

The functionality of VLEs varies but typically includes support for administrative functions such as access control, staff contact lists, syllabi, timetables, scheduling and reading lists. There are typically libraries for content such as lectures, presentations and study notes. Increasingly, learning is seen as an active process of engagement rather than passive rote learning, so features such as bulletin boards, chat and collaborative editing environments are included. Tests and quizzes can be built into a VLE, including for student feedback. In work-based VLEs, such as those used by many healthcare organisations for mandatory training modules, a passing grade on these tests can provide confirmation of adherence to mandatory training. Finally, the creation of persistent records of learning, such as portfolios, may support assessment and regulatory needs.[26]

Separate but connected to VLEs is the topic of simulation. Simulations in medical training can be quite low-tech, such as banana-skin suturing, or as high-tech as robotic patients for intubation and resuscitation. These manual skills are relatively rarely needed in mental health settings but are important to acquire, so experience of a medical emergency in a simulated setting can be valuable learning. Simpler, narration-based tools can be used to create simulations of clinical situations in psychiatry without the hardware or programming overheads of complex graphics. One interesting option is the work by O’Shea, Lenihan and Semple, who used an interactive fiction design tool called Twine to create branching clinical scenarios with if/then logic and scores.[27]
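To make the branching idea concrete, the sketch below expresses a short, hypothetical clinical scenario as plain Python rather than in Twine’s own passage syntax; the scenario content, node names and scoring are illustrative assumptions, not material from the cited poster.

```python
# A minimal sketch of a branching clinical scenario with if/then logic and a score.
# Hypothetical teaching content; Twine builds equivalent structures from linked
# 'passages', but the underlying branching logic is the same.

scenario = {
    "start": {
        "text": "A newly admitted patient appears agitated. What do you do first?",
        "choices": {
            "Attempt verbal de-escalation": ("de_escalate", +2),
            "Prescribe rapid tranquillisation immediately": ("rapid_tranq", -1),
        },
    },
    "de_escalate": {
        "text": "The patient calms and discloses recent substance use. Next step?",
        "choices": {
            "Take a focused history and check physical observations": ("end", +2),
            "Leave the ward and document later": ("end", -2),
        },
    },
    "rapid_tranq": {
        "text": "Sedation is given before any assessment of physical state.",
        "choices": {"Review and complete physical observations": ("end", +1)},
    },
    "end": {"text": "Scenario complete.", "choices": {}},
}

def play(node: str = "start", score: int = 0) -> int:
    """Walk the scenario interactively, accumulating a score for the choices made."""
    while scenario[node]["choices"]:
        print(scenario[node]["text"])
        options = list(scenario[node]["choices"].items())
        for i, (label, _) in enumerate(options, 1):
            print(f"  {i}. {label}")
        choice = int(input("Choose an option: ")) - 1
        node, delta = options[choice][1]
        score += delta
    print(scenario[node]["text"], f"Final score: {score}")
    return score

if __name__ == "__main__":
    play()
```

Trainees’ choices move them between nodes and adjust the score, which is essentially what the if/then logic of an interactive fiction tool provides without any graphics or simulation hardware.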

A recurring theme in this chapter is the importance of focusing on the end goal, not the technology used to achieve it, and how the introduction of IT should not be primarily for reasons of convenience or finance. These lessons apply to e-learning as much as to EHRs or telepsychiatry.

There are technical and legal issues around creating content for online learning, which can be minimised by using the tools, personnel and resources provided by your institution. There are also issues relating to file size and format that may inconvenience your users. Be particularly careful not to include the copyrighted material of others without appropriate permissions. Even when material may be freely reused, there remains the scholarly duty of appropriate recognition and attribution. Licensing affects your own creative work too. Depending on local institutional arrangements you may, or may not, be free to distribute your work online. Even if you favour the most ‘generous’ sharing policy, it may still be wise to exert some form of control using Creative Commons tools.[28]

There are ethical and (depending on your jurisdiction) legal requirements to ensure that your teaching materials and tools are accessible to a neurodiverse student population and to students with other disabilities such as visual impairments. Most of these principles represent common-sense good design in any case, such as unique slide titles, good contrast and minimal visual clutter. A useful summary is provided by the ‘Full Fabric’ blog.[29]

Past Failures

This is a book about the future of digital mental health, but we do need to look at and learn from the past or we will be doomed to repeat its mistakes.†

Problems with Public Sector IT Procurement

Whilst public sector IT procurement failures attract much attention, it is probably worth noting at the outset that there may be a publication bias at work as private corporations do not have the same requirements for transparency and accountability,[30] and may thus be better placed to quietly bury their failures. A Computerworld article from early 2020 lists some recent examples from the private sector.[31]

Other governments and organisations are, however, not immune to these difficulties. According to one report, in the period between 2003 and 2012 only a distinct minority of large US IT projects were successful.[32]

Problems with Electronic Health Records

In the United States, EHR systems typically emerged as add-ons to existing customer billing/finance systems and were not optimised for clinical workflows; it is therefore no surprise that EHR software has been cited as a cause in more than 50% of US cases of physician burnout. EHRs have been accused of making it too easy for third parties to request ‘just one more’ item of information, turning physicians and other healthcare professionals into data-entry clerks. They can also create perverse incentives: because physicians are paid more for complex cases, high percentages of patients end up recorded with enough symptoms to count as complex. The story of EHR development in the United States is told by Bob Wachter in his light-hearted book The Digital Doctor.[33]

One UK example is the Casemix Information Systems and Resource Management Initiative (CMIS/RMI). These were programmes introduced in the mid-1980s which were intended to have a strategic and integrative function, sitting at the centre of existing clinical and managerial IT systems. The goal was to establish the notional cost of each episode of patient care, which in aggregate created the ‘casemix’ of the hospital. Clinical activity (consultations, investigations, etc.) was recorded along with financial data and used to compare clinical and financial performance with a standard or idealised case. The lack of perceived clinical benefit and buy-in contributed to the eventual failure of the programme, though elements have been subsumed into later systems.

The National Programme for IT

Background

The National Programme for IT (NPfIT) ran between 2002 and 2011. During its period of operation, it is reported to have cost £12.4 billion. Connecting for Health (CfH) was the arm’s-length body charged with delivering NPfIT, and for most of its active existence (until 2008) it was led by Richard Granger, a former civil servant and management consultant with experience in large-scale IT procurement.

It is important to understand that, whilst the end goal of the initiative was to deliver a comprehensive national EHR, this involved creating several foundation components, including the network itself, electronic mail, the ‘spine’, a range of national applications such as ‘Choose and Book’, and the picture archiving and communication system (PACS). It was a huge task.

The approach taken was to divide England into five geographical ‘clusters’, in each of which a local service provider was contracted to deliver the agreed product as a local monopoly. Steep financial penalties were agreed for non-performance of the contracts. Under these conditions, procurement proceeded quickly, and this speed attracted favourable comment at the time. Over time, however, this adversarial approach drained goodwill from the enterprise, and once technical problems and delays emerged, inevitable in a project of this size and complexity, relationships soured. There was a high turnover of experienced staff with consequent loss of corporate knowledge. Key deadlines were missed and supplied systems were sometimes not sufficiently reliable. There were complaints that NHS staff had not been engaged as stakeholders, together with conflict with the public and professional bodies about the consent model for sharing information between different systems.

By 2009, NPfIT had not come close to delivering the integrated national EHR that had been envisaged, and following criticism from the Public Accounts Committee in January 2009, the incoming coalition government terminated the project in 2011.

Successes

It is important to mention the significant successes of this programme. These include important infrastructure such as the improved NHS network with a central ‘spine’ for storing and exchanging information, an electronic prescribing system, the PACS image system, the Summary Care Record, the NHSmail staff email system and the Choose and Book appointment system. These are still in use today.

Failures

The Wachter report of 2016 identified a number of factors behind the failure of NPfIT.[34] The most basic, according to the report and other authors such as Dolfing,[35] Coiera[36] and Robertson and colleagues,[37] was simply scale. There was no ‘plan B’, no ability to gracefully degrade to a down-scoped version of the programme. Another key issue, according to the report and other sources, was the ambitious, perhaps unrealistic, timetable. Other problems included the excessively centralised approach to contracting and procurement and the confrontational nature of these processes, which served to alienate the suppliers and led them to charge heavily for modifications to poorly written specifications.

This led to problems downstream where professional groups and other stakeholders delayed progress and where vague specifications led to legal disputes and unclear project scope. Pressure of time also contributed to a lack of testing with resulting quality control issues, delays and the reliability issues mentioned above.

Wachter and others noted the lack of clinical engagement with the project and highlighted that this needs to happen at the beginning of a project, not the end. A lack of expertise, both in ‘pure’ IT skills and in the critical IT/clinician crossover area, was also identified.

Lessons Learned

The large-scale strategic lessons from NPfIT are representative of other large-scale procurements. These include the need to recognise the complexity of the problem and the need for ‘adaptive change’ (where the whole system needs to change as well as the IT system). There is also a lag between implementation and the realisation of benefits; organisations need to respect this and understand that the return on investment may not be financial. Crucially, technical support and a willingness to adjust the solution should extend past the implementation, through the inevitable teething problems and on through the adaptive change process.

Motives are important. Digitisation/computerisation should happen for clinical reasons, not IT, financial or managerial reasons. The clinical workforce needs to believe this (that they are not having a management system foisted on them) and so be on board for the inevitable pain and disruption this kind of adaptive change involves. Efforts need to be made to upskill the workforce and to foster the hybrid clinician/informatician role, as the process re-engineering involved in adaptive change requires deep domain knowledge combined with an understanding of the technology. Wachter recommended the creation of Chief Clinical Information Officers (CCIOs) at both local and national levels to support this.

Wachter warned against dramatic crash programmes. He recommended sensitivity to each organisation’s level of readiness, with different combinations of incentives and regulation applied over time to bring them all to a sufficient level of ‘digital maturity’.

Wachter also cautioned against over-generalising the lessons of NPfIT and completely rejecting centralisation; he recommended a balance between centralisation and devolved implementation. Large-scale EHR systems have enormous potential for healthcare research and public health, so privacy needs to be balanced against these benefits, supported by a clear external regulatory framework and mechanisms for information interchange.

More specific lessons include the importance of interoperability and the need to establish it at the outset. Without this, the risk of ‘vendor lock-in’, dependence on a single supplier, is too high. Interoperability and open standards also foster an ecosystem of innovation. Interoperability can mean centrally mandating how systems share information whilst allowing different parts of the system to use different products that suit their particular needs.

Complexity can be tamed by modularisation and the staging of projects, and Wachter recommended aiming for initial ‘quick wins’ to build momentum and enhance morale. He also stressed user engagement and the usability of the user interface/user experience (UI/UX) at the design stage, informed by an understanding of the cognitive demands on clinicians. Gaining this understanding will require ethnographic approaches to UI/UX design. This is covered further in Chapter 4 on EHRs and digital note keeping.

The good news is that implementing an EHR can be done better, and Box 1.2 contains the author’s personal experience of a full EHR transition which, whilst not perfect, illustrates some of the points above.

Box 1.2 Personal experience with EHR transition

The author’s experience stems from working in a regional medium secure unit in Scotland. This unit, which opened in 2000, presciently ordered an early version of an EHR from a UK supplier, so that it operated in a paperless/paper-light mode from the outset. Over time, the application proved popular and useful. The EHR system was not without its flaws, but professionals from all disciplines became deeply familiar with it and learned to work around many of them. There had, however, been little planning around long-term support, succession or data portability.

By 2016, the withdrawal of Microsoft support for older operating systems, increasing worries about future compatibility and the obsolescence of the EHR software itself were causing concern. Around this time, the wider organisation’s mental health services were in the process of transitioning to a very large and widely deployed EHR system used by the physical health part of the organisation. A decision was made to volunteer the medium secure unit to be an early adopter of the new EHR for mental health.

This was not a universally popular decision. The rationale and need for the transition were not apparent to everyone and perhaps had not been communicated effectively. The new EHR, by virtue of being used across multiple secondary care settings, was not tuned to any one specialty and did not support some beloved features despite attempts to make compensatory modifications. The interface could best be described as ‘functional’. However, staff were trained and on-site support with clinical leadership was available.

There were, however, some major advantages. Because the new EHR was a critical part of NHS Lothian’s infrastructure, we could be confident that the whole organisation would support it going forward, rather than the responsibility resting with a small forensic clinic. Similarly, the financial costs lay with the organisation as a whole rather than a small part of it. Doctors in training and other short-term staff could be quickly effective as it was the same EHR they were used to using elsewhere. The new system gave us access to medical and surgical notes, outpatient appointments and radiology and laboratory results. Our patients could receive better care when looked after in general medical or surgical settings because record keeping was joined up. Perhaps most importantly for these patients with high needs and risks, all records were in one place, making it less likely that a vital piece of information would be missed when compiling risk assessments or other documents.

In addition to all the useful points listed in Box 1.2, our experience would suggest that developments have a natural time and pace and that there is no point forcing a development before this natural time. The corollary to this is the importance of recognising and taking advantage of the right time when it comes. Crises often drive developments, and notwithstanding the difficulties of crash projects, it can be a good idea to have draft plans drawn up and ready for implementation when the right moment arrives.

Wachter was undoubtedly correct when he highlighted the importance of having health professionals involved in IT projects. However, IT professionals often fail to understand just how variegated and diverse modern healthcare professions are and may regard a generic clinician, such as a surgeon, as a satisfactory representative of all clinical professionals and specialties. Hence clinicians, plural, need to be involved, not just one lone worker.

Current and Near-Future Issues

UK Structures

In the UK, the Wachter report called for a ‘five year forward view’ with ubiquitous EHRs and e-prescribing.[34] There was a recognition of the varying degrees of digital maturity across England and Wales and a focus on interoperability and personalised healthcare.

In 2019, NHSX was created to coordinate IT delivery across health and social care in England and Wales. It had a strategic role, setting policy and standards, with actual implementation being commissioned by NHS Digital. In 2022, it was integrated into the NHS Transformation Directorate, but by then progress had been made: a more helpful level of centralisation, the ongoing creation of regional integrated care systems with a requirement to share data digitally across health and social care, and better clarity about ‘what good looks like’ and ‘who pays for what’.[38]

National structures will undoubtedly continue to evolve. Digital departments have definitely arrived – the challenge now is to embed digital working as ‘business as usual’ so that a separate focus on digital is no longer needed at all.

Pandemic Response

At the time of writing (spring 2022), we are two years into the global Covid-19 pandemic but there are signs of more normality returning. However, ongoing periodic waves of infection, along with the annual winter bed crisis, mean that we need to continue to implement the lessons learned during this stressful time.

Telepsychiatry and tele-mental health will therefore be obvious IT applications for the immediate future. Ease of use will improve, as will integration with EHR systems. We will have to give more thought to the ‘distal’, or patient, end of the telepsychiatry link, with ruggedised or tamperproof systems for certain environments. The status of telepsychiatric assessments with respect to statutory processes such as detention will need to be clarified, having already been challenged in court. It is likely that telepsychiatry will be part of a spectrum, or menu, of options in the post-pandemic situation, particularly for military services or for remote areas or areas where access is difficult (e.g. prisons, ships).

The pandemic is also likely to accelerate the adoption of EHRs because of their ability to support remote working, easier integration with telepsychiatry and a reduction in travel, and thus in the movement of potentially infected persons and fomites.

Resilience and Disaster Management

One of the lessons of the pandemic is, or ought to be, an increased sense of the vulnerability of our globally interconnected world. Going forward, more will be asked of healthcare IT in terms of resilience to, for example, power outages, network problems, cyberattacks and extreme weather events. In addition, IT systems will be expected to display ‘graceful degradation’ under these conditions with backups available rather than abrupt collapse. Ensuring this happens will become an important part of the specification process.

There are increasing concerns regarding foreign suppliers of key items of infrastructure, including important public sector IT hardware and software. These concerns relate partly to supply chain reliability and partly to security, as we saw with recent disputes over the Chinese company Huawei and 5G networks. For example, both the United States and the EU are ‘onshoring’ microprocessor production. Going forward, bidding for strategic IT investments such as large public healthcare systems may be confined to trusted partner companies based in friendly or allied nations. Shrinking the pool of available suppliers will inevitably lead to reductions in choice and increases in cost. On a more positive note, closer working relationships with a small number of trusted partners may help reduce the adversarial procurement practices identified as a factor in the NPfIT debacle.

Mobility and Remote Access

It has been noted that the interposition of a computer screen between clinician and patient can significantly degrade the quality of clinical interactions and contribute to clinician burnout. Breaking this umbilical cord by allowing the EHR to be read from, and written to, a more flexible, paper-like device has long been of interest to IT companies. Long before the iPad launched in 2010, Microsoft pushed the concept of the ‘tablet PC’ for many lonely years, with a clear emphasis on so-called ‘vertical’ markets including healthcare. It may have been that, in classic chicken-and-egg fashion, EHRs were not yet ubiquitous enough to justify the use of expensive (as they were then) specialist tablet hardware. The profusion of tablet-style devices since then and the widespread use of EHR systems perhaps make this option worth revisiting. A flat tablet-style device using well-designed menus and dropdowns, along with handwriting and voice recognition, could allow real-time access to EHR systems while preserving patient rapport. Such a device, along with a lightweight portable keyboard and cellular networking, could act as an EHR terminal, a telepsychiatry workstation and a portable office, and might be all the IT infrastructure many mobile mental health workers will need.

Carbon Footprint

Environmental concerns have taken a back seat in global conversations over the last year, but the underlying problems have not gone away. There are simple and direct environmental advantages to many health informatics developments, including reductions in travel time and in the transport and storage of physical records, both energy-consuming activities.

Data centres, however, are a major consumer of energy, and electronic hardware raises important environmental questions from both raw material mining and waste disposal perspectives. Recent initiatives such as mandatory energy efficiency, new designs for data centres and ‘right to repair’ laws may help tilt the balance towards the positive.

It is likely that those planning the introduction of ambitious IT projects will have to include in their business cases a detailed accounting of these costs and benefits.

Manpower Issues

Specialisation and division of labour have long been seen as an important driver of improved quality and reduced costs.[39] However, if specialist services achieve their better outcomes by seeing high volumes of particular conditions or carrying out high volumes of specific procedures, it is difficult to see how this increasing division of work can be safely coordinated without tools like shared EHRs and computer-assisted workflows.

An ageing population exacerbates this problem, with more episodes of care spread over more specialties and sites, more professionals involved and more medications being prescribed. Managing this ‘multimorbidity’ will be very difficult without the tools described in this book. As well as being more specialised, the healthcare workforce is becoming more diverse and working shorter hours, so systems will need to be designed to support handovers, briefings, huddles and the rapid acquisition of situational awareness.

There are shortages in several categories of healthcare worker and, for some underserved areas and populations, telepsychiatry, e-learning and decision support systems will be needed to augment the limited professional workforce. Integrating these developments in a safe and defensible way will be a major challenge for the years to come.

Future Prospects

The aphorism that ‘it is difficult to make predictions, particularly about the future’ has been attributed to various sages. Regardless of its provenance, it remains a useful caution when making predictions about the future. The author edited a book some years ago which was bold enough to include a list of predictions.[40] The reader may wish to consult these and compare them with today’s reality before deciding how much credence to give to the further prognostications below!

Infrastructure

A safe starting prediction is that EHR systems will become universal, with paper-based records used only for retrospective case review. These will increasingly be accessed by mobile devices in a range of form factors rather than fixed terminals.

A range of factors are pushing towards ubiquitous, wide-area, high-speed mobile networking to the point that many people may have only a ‘mobile’ connection at home and the current distinction between local Wi-Fi and wide-area 3G/4G/5G networks will blur.

With mass production and economies of scale, cameras and sensors are now cheap and almost disposable. Cost pressures, along with liability and governance issues, will drive their incorporation into care environments, for example as sensors for non-contact respiration monitoring in seclusion rooms, medication compliance sensors on pill boxes, or fall detectors for the elderly. Many of these sensors will feed their data directly into EHR systems, reducing the need for manual transcription but increasing the volume of raw data that clinicians will have to contend with. There will be complex ethical, privacy, legal and clinical trade-offs to navigate.

Not all monitoring will be non-consensual or covert. One of the genuinely novel and unexpected trends since our earlier book has been the degree to which people are willing to monitor themselves and to give up large volumes of personal data for minimal or no reward.[40] We are all familiar with the patient who brings along voluminous diaries and other writings to be reviewed in a finite appointment time. The proliferation of mood diary apps, wearable devices, trackers and home diagnostic systems will increase this ‘co-creation’ of medical information, and our systems will need to learn to store, and render meaningful, these new information sources.

Analysis

Following on from the preceding point, these large datasets (‘big data’) will need automated processing and analysis if they are to generate meaning. We can envisage research and audit moving from something that happens retrospectively to something that happens continuously in real time, with tight loops between practice and evaluation.

At the time of writing, AI is going through one of the regular peaks of hype to which it is prone.[41] It is, of course, not a single thing: the term encompasses several different approaches. Many of these approaches are not algorithmic or rule based but use neural networks, statistical models or genetic algorithms. These tools are not based on an explicit representation of the problem domain or the solution and, even if accurate, are not transparent or explainable. This can pose problems when the AI generates categories which violate social norms,[42] or when decision-making transparency is legally required. ‘Explainable’ AI is at an early stage but seems an important initiative.[43] Chapters 4 and 5 of this book focus on big data and AI along with the associated ethical concerns.

Standards and Security

We have seen above how interoperability of EHR systems is already important and will only become more so if we are to avoid vendor lock-in and facilitate modularisation. The best way to achieve durable and resilient interoperability is for vendors to ensure that their systems adhere to open, published standards. Proprietary standards, which can be manipulated by market players, are to be avoided if possible.
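As one concrete illustration of what an open, published standard offers, the sketch below reads a single patient record over the REST interface defined by HL7 FHIR, a widely used open standard for health data exchange; FHIR is offered here purely as an example (it is not discussed elsewhere in this chapter), and the server URL and patient identifier are hypothetical placeholders.

```python
# Minimal sketch: reading a patient record via the open HL7 FHIR REST interface.
# The base URL and patient ID are hypothetical; any standards-conformant server
# exposes the same resource shape, which is the point of an open standard.
import requests

FHIR_BASE = "https://example-ehr.test/fhir"   # hypothetical FHIR endpoint
PATIENT_ID = "12345"                          # hypothetical patient identifier

def fetch_patient(base_url: str, patient_id: str) -> dict:
    """Retrieve a FHIR Patient resource as JSON from a conformant server."""
    response = requests.get(
        f"{base_url}/Patient/{patient_id}",
        headers={"Accept": "application/fhir+json"},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    patient = fetch_patient(FHIR_BASE, PATIENT_ID)
    # Because the resource structure is defined by the standard rather than the
    # vendor, the same parsing code works against any conformant EHR back end.
    for name in patient.get("name", []):
        print(name.get("family"), " ".join(name.get("given", [])))
```

Because the interface belongs to the standard rather than to any single supplier, code like this does not need rewriting when a product is swapped or when different parts of an organisation run different systems, which is precisely the protection against vendor lock-in described above.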

Another major issue will be security. The concept of ‘hybrid warfare’ blurs the state/non-state actor distinction by incorporating the use of deniable assets to disrupt adversary IT systems. The migration of critical national infrastructure like EHR systems online opens up the possibility of mass casualties from such attacks. We saw a ransomware attack on NHS systems (WannaCry) in 2017, and it will not be the last. There may be a need for binding international agreements making attacks on health infrastructure a war crime, in the same way as attacks on dams. In the interim, end users and the public will need to accept significant convenience trade-offs in order to maintain security.

Conclusions

The scope of this chapter was broad and could easily have justified an entire book of its own. I hope, however, that it, and the additional resources referenced, will set the scene for some of the more specialist chapters that follow and allow the reader to ask some sensible initial questions if they are asked to become involved in a digital project. Some of the key lessons of NPfIT, from both Wachter and others, are summarised in Table 1.1.

Table 1.1 Key lessons – risks and controls in health IT projects

Risk: Failure to appreciate the complexity, scale and interrelatedness of the proposed project.
Control: Consider breaking the project down into more manageable sub-projects, modules or stages.

Risk: Failure to achieve buy-in from end users and other stakeholders.
Control: Stakeholder engagement.

Risk: Failure to change the skill mix of users.
Control: Training and recruitment.

Risk: Underbudgeting.
Control: Realistic budgeting, including implementation, training and customisation costs.

Risk: Adversarial procurement processes leading to loss of goodwill.
Control: Better management of supplier relationships.

Risk: Scope creep – the addition of new features mid-project.
Control: Discipline and management of stakeholders; the ability to say no.

Risk: Failure to re-engineer workflows to suit the new IT system.
Control: Don’t ‘put lipstick on a pig’, that is, blindly replicate an old, suboptimal or purely paper-based process in new software.

Footnotes

* Various terms exist for this – we have gone with ‘Electronic Health Record’ or EHR. Related terms are Electronic Patient Record, Digital Health Record or Electronic Notes. These are discussed in more detail in Chapter 3.

† Attributed to George Santayana (1863–1952).

References

1. Krishnan, M., Mischke, J., Remes, J. Is the Solow Paradox back? McKinsey Quarterly, 4 June 2018. Available at: www.mckinsey.com/capabilities/mckinsey-digital/our-insights/is-the-solow-paradox-back#/ (accessed 20 June 2023).
2. Nissen, T., Wynn, R. The history of the case report: a selective review. JRSM Open. 2014;5(4): 205427041452341. https://doi.org/10.1177/2054270414523410.
3. McLean, K. Intro to history of health informatics. YouTube. 26 January 2013. Available at: www.youtube.com/watch?v=AzLy58dhsXc (accessed 8 January 2022).
4. Morabia, A. Observations made upon the Bills of Mortality. BMJ. 2013;346: e8640. https://doi.org/10.1136/bmj.e8640.
5. Choi, B. C. The past, present, and future of public health surveillance. Scientifica. 2012: 1–26. https://doi.org/10.6064/2012/875253.
6. Neuhauser, D. Florence Nightingale gets no respect: as a statistician that is. Qual. Saf. Health Care. 2003;12(4): 317. https://doi.org/10.1136/qhc.12.4.317.
7. Gupta, S. Florence Nightingale understood the power of visualizing science. Science News. 13 May 2020. Available at: www.sciencenews.org/article/florence-nightingale-birthday-power-visualizing-science (accessed 8 January 2022).
8. Clinfowiki. Medical informatics history. 2011. Available at: www.clinfowiki.org/wiki/index.php/Medical_informatics_history (accessed 8 January 2022).
9. Benson, T. Why general practitioners use computers and hospital doctors do not – part 2: scalability. BMJ. 2002;325(7372): 1090–3. https://doi.org/10.1136/bmj.325.7372.1090.
10. Benson, T. Why general practitioners use computers and hospital doctors do not – part 1: incentives. BMJ. 2002;325(7372): 1086–9. https://doi.org/10.1136/bmj.325.7372.1086.
11. Wachter, R. M. Making IT Work: Harnessing the Power of Health Information Technology to Improve Care in England. Report of the National Advisory Group on Health Information Technology in England. Department of Health. 2016. p. 19.
12. Detmer, D., Wyatt, J., Buchan, I. National-scale clinical information exchange in the United Kingdom: lessons for the United States. JAMIA. 2011;18: 91–8. https://doi.org/10.1136/jamia.2010.005611.
13. Justinia, T. The UK’s National Programme for IT: why was it dismantled? Health Serv. Manage. Res. 2016;30(1): 2–9. https://doi.org/10.1177/0951484816662492.
14. European Commission. Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions on enabling the digital transformation of health and care in the digital single market; empowering citizens and building a healthier society. Brussels: European Commission. 2018. Available at: https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52018DC0233 (accessed 16 June 2023).
15. American Psychiatric Association (APA). Telepsychiatry via Videoconferencing: Resource Document. Washington, DC: APA. 1998. Available at: https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.173.2939&rep=rep1&type=pdf (accessed 9 January 2022).
16. American Psychiatric Association (APA). History of telepsychiatry. Vimeo. 17 February 2016. Available at: https://vimeo.com/155763287 (accessed 9 January 2022).
17. Blainey, G. The Tyranny of Distance: How Distance Shaped Australia’s History. Pan Macmillan. 2010.
18. Chan, S., Parish, M., Yellowlees, P. Telepsychiatry today. Curr. Psychiatry Rep. 2015;17(11): 89. https://doi.org/10.1007/s11920-015-0630-9.
19. Mental Health Reform. Rapid Briefing for the COVID-19 Crisis. 14 April 2020. Available at: www.mentalhealthreform.ie/wp-content/uploads/2020/04/eMEN-rapid-briefing-paper_-COVID-19_final-12.pdf (accessed 9 January 2022).
20. Duerr, H. A. Making telepsychiatry work for you and your patients. Psychiatric Times. 22 May 2020. Available at: www.psychiatrictimes.com/view/making-telepsychiatry-work-you-and-your-patients (accessed 9 January 2022).
21. TalentLMS. The evolution and history of e-learning. n.d. Available at: www.talentlms.com/elearning/history-of-elearning (accessed 9 January 2022).
22. The Open University. n.d. Available at: www.open.ac.uk/ (accessed 9 January 2022).
23. Coursera. n.d. Available at: www.coursera.org/ (accessed 9 January 2022).
24. Blackboard. n.d. Available at: www.blackboard.com/en-eu (accessed 9 January 2022).
25. Moodle. Moodle – open-source learning platform. n.d. Available at: https://moodle.org/ (accessed 9 January 2022).
26. Ellaway, R., Masters, K. E-Learning in Medical Education. Dundee: Association for Medical Education in Europe. 2008.
27. O’Shea, C., Lenihan, F., Semple, S. Dungeons, dragons, and forensic psychiatry: improving induction for new trainees through text-based adventure games. Conference poster, abstract 39. Available at: www.rcpsych.ac.uk/docs/default-source/events/competing-interests/conference-book–poster-abstracts.pdf?sfvrsn=d8734492_2 (accessed 9 January 2022).
28. Creative Commons. Marking your work with a CC license. n.d. Available at: https://wiki.creativecommons.org/wiki/Marking_your_work_with_a_CC_license (accessed 9 January 2022).
29. Full Fabric. How to design visual learning resources for neurodiverse students. n.d. Available at: www.fullfabric.com/articles/how-to-design-visual-learning-resources-for-neurodiverse-students (accessed 9 January 2022).
30. Parliamentary Office of Science and Technology (POST). Government IT Projects. London: POST. 2003. p. 4. Available at: www.parliament.uk/globalassets/documents/post/pr200.pdf (accessed 9 January 2022).
31. Wayner, P. The biggest software failures in recent history. Computerworld. 17 February 2020. Available at: www.computerworld.com/article/3412197/top-software-failures-in-recent-history.html (accessed 9 January 2022).
32. Parliamentary Office of Science and Technology (POST). Government IT Projects. London: POST. 2003. pp. 3–4. Available at: www.parliament.uk/globalassets/documents/post/pr200.pdf (accessed 9 January 2022).
33. Wachter, R. The Digital Doctor: Hope, Hype, and Harm at the Dawn of Medicine’s Computer Age. New York: McGraw Hill. 2015.
34. Wachter, R. M. Making IT Work: Harnessing the Power of Health Information Technology to Improve Care in England. Report of the National Advisory Group on Health Information Technology in England. Department of Health. 2016.
35. Dolfing, H. Case study 1: the £10 billion IT disaster at the NHS. Henrico Dolfing – Interim Management and Project Recovery. 20 January 2019. Available at: www.henricodolfing.com/2019/01/case-study-10-billion-it-disaster.html (accessed 9 January 2022).
36. Coiera, E. W. Lessons from the NHS National Programme for IT. Med. J. Aust. 2007;186(1): 3–4. https://doi.org/10.5694/j.1326-5377.2007.tb00774.x.
37. Robertson, A., Bates, D. W., Sheikh, A. The rise and fall of England’s National Programme for IT. J. R. Soc. Med. 2011;104(11): 434–5. https://doi.org/10.1258/jrsm.2011.11k039.
38. Gould, M. NHSX moves on. Blog on the NHS Transformation Directorate website. 2022. Available at: https://transform.england.nhs.uk/blogs/nhsx-moves-on/ (accessed 6 October 2022).
39. West, E. G. Adam Smith’s two views on the division of labour. Economica. 1964;31(121): 23–32. https://doi.org/10.2307/2550924.
40. Lenihan, F. Computers in Psychiatry. 1st ed. London: Gaskell. 2006.
41. History of Data Science. AI winter: the highs and lows of artificial intelligence. 24 November 2021. Available at: www.historyofdatascience.com/ai-winter-the-highs-and-lows-of-artificial-intelligence/ (accessed 9 January 2022).
42. Buranyi, S. Rise of the racist robots: how AI is learning all our worst impulses. The Guardian. 8 August 2017. Available at: www.theguardian.com/inequality/2017/aug/08/rise-of-the-racist-robots-how-ai-is-learning-all-our-worst-impulses (accessed 9 January 2022).
43. The Royal Society. Explainable AI: The Basics. 2019. Available at: https://royalsociety.org/-/media/policy/projects/explainable-ai/AI-and-interpretability-policy-briefing.pdf (accessed 9 January 2022).