
Re-engineering justice? Robot judges, computerised courts and (semi) automated legal decision-making

Published online by Cambridge University Press: 04 July 2019

John Morison*
Affiliation:
School of Law, Queen's University Belfast, Belfast, UK
Adam Harkens
Affiliation:
School of Law, Queen's University Belfast, Belfast, UK
*Corresponding author email: [email protected]

Abstract

This paper takes a sceptical look at the possibility of advanced computer technology replacing judges. Looking first at the example of alternative dispute resolution, where considerable progress has been made in developing tools to assist parties to come to agreement, attention then shifts to evaluating a number of other algorithmic instruments in a criminal justice context. The possibility of human judges being fully replaced within the courtroom stricto sensu is examined, and the various elements of the judicial role that need to be reproduced are considered. Drawing upon understandings of the legal process as an essentially socially determined activity, the paper sounds a note of caution about the capacity of algorithmic approaches to ever fully penetrate this socio-legal milieu and reproduce the activity of judging, properly understood. Finally, the possibilities and dangers of semi-automated justice are reviewed. The risks of seeing this approach as avoiding the recognised problems of fully automated decision-making are highlighted, and attention is directed towards the problems that remain when an algorithmic frame of reference is admitted into the human process of judging.

Type: Research Article
Copyright: © The Society of Legal Scholars 2019

Footnotes

Thanks to the two anonymous reviewers, Mary Dobbs, Anthony Behan, Amnon Reichman, Rónán Kennedy, Jennifer Cobbe and Daithí Mac Síthigh for helpful comments, and to the participants in the Bench-QUB Law School Symposium held on 15 June 2018. John Morison would also like to acknowledge support from the ESRC award ref ES/I032630/1 and Adam Harkens thanks the Leverhulme Interdisciplinary Network on Cybersecurity (LINCS) which provided support for his PhD studies.

References

1 Following the report of the Future of Work Commission (2017), available at http://www.futureofworkcommission.com (last accessed 27 May 2019), we define ICT (information and communication technology) broadly in this context to include robotics, artificial intelligence and machine learning, the internet, big data analysis, the internet of things and digital technologies; the combination and application of these technologies in diverse ways; and the collection of techniques, skills, processes and knowledge used by humans in relation to these technologies.

2 See further Brynjolfsson, E and McAfee, A The Second Machine Age: Work, Progress and Prosperity in a Time of Brilliant Technologies (New York: Norton, 2014); Greenfield, A Radical Technologies: The Design of Everyday Life (London: Verso, 2017); West, DM The Future of Work: Robots, AI, and Automation (Washington DC: Brookings Institution Press, 2018).

3 See for example The In-House Counsel's LegalTech Buyer's Guide 2018, available at https://www.lawgeex.com/buyersguide/, which lists over 100 technology solutions in what has been estimated to be a $16 billion market in the USA alone. See also Susskind, R Tomorrow's Lawyers: An Introduction to Your Future (Oxford: Oxford University Press, 2nd edn, 2017); Segal, P ‘Legal jobs in the age of artificial intelligence: moving from today's limited universe of data toward the great beyond’ (2018) 5(1) Savannah Law Review 211; and Hartung, M et al Legal Tech: A Practitioner's Guide (Oxford: Hart Publishing, 2018).

4 C Frey and M Osborne The Future of Employment: How Susceptible are Jobs to Computerisation? (2013), available at https://www.oxfordmartin.ox.ac.uk/downloads/academic/The_Future_of_Employment.pdf (last accessed 27 May 2019).

5 See for example the work by Dunn et al reported in the Proceedings of the 16th International Conference on Artificial Intelligence and Law (2017) pp 233–236, which looks at disparities among human judges in US asylum adjudication and goes on to develop a predictive model that helps applicants understand, with 80% accuracy, how external factors – nationality, language, hearing location, judge, etc – may affect an application. See also D Chen et al ‘Early predictability of asylum court decisions’ TSE Working Papers 17-781 (2017).
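
To make the kind of model referred to in this note concrete, the sketch below fits a simple classifier to hypothetical case metadata (nationality, language, hearing location, judge) and reports its held-out accuracy. It is illustrative only: the data are invented and the scikit-learn pipeline is our assumption, not the model actually built in the studies cited above.

```python
# Illustrative sketch only: invented data and an assumed scikit-learn pipeline,
# not the model used in the asylum studies cited in this note.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder

# Hypothetical case records; the feature names mirror the 'external factors'
# mentioned in the note (nationality, language, hearing location, judge).
cases = pd.DataFrame({
    "nationality":      ["A", "B", "A", "C", "B", "C", "A", "B"] * 25,
    "language":         ["x", "y", "x", "z", "y", "z", "x", "y"] * 25,
    "hearing_location": ["NY", "LA", "NY", "TX", "LA", "TX", "NY", "LA"] * 25,
    "judge":            ["J1", "J2", "J1", "J3", "J2", "J3", "J1", "J2"] * 25,
    "granted":          [1, 0, 1, 0, 0, 1, 1, 0] * 25,  # outcome label
})

features = ["nationality", "language", "hearing_location", "judge"]
X_train, X_test, y_train, y_test = train_test_split(
    cases[features], cases["granted"], test_size=0.25, random_state=0)

model = Pipeline([
    ("encode", OneHotEncoder(handle_unknown="ignore")),   # categorical -> indicator columns
    ("classify", LogisticRegression(max_iter=1000)),
])
model.fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))
```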

6 See for example Ashley, K Artificial Intelligence and Legal Analytics: New Tools for Law Practice in the Digital Age (Cambridge: Cambridge University Press, 2017); and Bex, F et al (eds) ‘Special issue: artificial intelligence for justice’ (2017) 25(1) Artificial Intelligence and Law.

7 This command theory of law is most usually associated with John Austin's writings: see Austin, J The Province of Jurisprudence Determined (1832), Rumble, W (ed) (Cambridge: Cambridge University Press, 1995).

8 Ross, H Law as a Social Institution (Oxford: Hart Publishing, 2001) p 108.

9 Ibid, pp 76–83; T Parsons The Social System (New York: The Free Press, 1951) p 25. This is, therefore, not an argument about the calculative abilities of algorithms, their efficiencies, or their ability to solve even the most complex mathematical problems, but rather that there are complex social conflicts essential to the practice of law which lie outside the scope of such tools – including, but not limited to, their ‘appropriate’ and ‘correct’ use, and contestations of their functions. For information on mathematical complexity and complexity theory see Papadimitriou, CH ‘Computational complexity’ in Ralston, A et al (eds) Encyclopedia of Computer Science (London: Wiley-Blackwell, 4th edn, 2000) pp 260–265.

10 See Foucault, M The History of Sexuality Volume 1 (London: Penguin, 1990) p 95. For Foucault, power does not just react to resistance, nor is it merely preceded by it: resistive tensions constitute power, and lie at its very centre. This is a view of the legal process that we can share.

11 Golder, B and Fitzpatrick, P Foucault's Law (Abingdon: Routledge, 2009) pp 2, 79, 83; Hunt, A and Wickham, G Foucault and Law: Towards a Sociology of Law as Governance (London: Pluto Press, 1994) p 104.

12 We are using the term ‘algorithm’ in this context to refer not only to a mathematical construct with ‘a finite, abstract, effective, compound control structure, imperatively given, accomplishing a given purpose under given provisions’ (Hill, R ‘What an algorithm is’ (2015) 29(1) Philosophy & Technology 35 at 58), but also to encompass a machine learning element, and the lay sense of the term, which includes implementation of the mathematical construct into a technology, and an application of the technology configured for a particular task in a social context. See further Beer, D ‘The social power of algorithms’ (2017) 20(1) Information, Communication & Society 1.
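
As a rough illustration of the layered sense of ‘algorithm’ used in this note, the sketch below separates the bare mathematical construct from its configured deployment. Every function name, weight and threshold is an invented assumption; no real risk tool is being described.

```python
# A minimal sketch of the distinction drawn in this note. All names, weights and
# thresholds are invented for illustration; no real system is being described.

def score(prior_arrests: int, age: int) -> float:
    """The 'algorithm' in the narrow mathematical sense: a deterministic mapping
    from inputs to a number."""
    return min(1.0, 0.1 * prior_arrests + max(0.0, (30 - age) * 0.02))

def decide(prior_arrests: int, age: int, threshold: float = 0.5) -> str:
    """The same construct implemented and configured for a task: the threshold and
    the output categories are institutional policy choices, not mathematics."""
    return "refer for intervention" if score(prior_arrests, age) >= threshold else "no action"

# With these assumed settings the decision flips on the chosen threshold,
# not on any change in the underlying calculation.
print(decide(prior_arrests=4, age=22))                  # 'refer for intervention'
print(decide(prior_arrests=4, age=22, threshold=0.7))   # 'no action'
```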

13 Parts of this section draw upon the authors’ contribution to M Moscati et al (eds) Comparative Dispute Resolution Handbook (Cheltenham: Edward Elgar, forthcoming).

14 For example, for the UK see Lord Woolf Access to Justice (1996), available at http://webarchive.nationalarchives.gov.uk/20060213223540/http://www.dca.gov.uk/civil/final/contents.htm (last accessed 17 August 2018); Jackson, Sir Rupert Review of Civil Litigation Costs (Norwich: TSO, 2010); Genn, H Judging Civil Justice (Cambridge: Cambridge University Press, 2009); Hodge, Jones and Allen Innovation in Law Report 2014, available at https://www.hja.net/wp-content/uploads/hja-innovation-in-law-report-2014.pdf (last accessed 27 May 2019); Civil Justice Council, Online Dispute Resolution Advisory Group Online Dispute Resolution for Low Value Civil Claims (2015), available at https://www.judiciary.gov.uk/wp-content/uploads/2015/02/Online-Dispute-Resolution-Final-Web-Version1.pdf (last accessed 27 May 2019). For a critical voice see Transform Justice's Briefing on the Prisons and Courts Bill (2017), available at http://www.transformjustice.org.uk/wp-content/uploads/2017/03/Transform-Justice-Briefing-on-the-Prisons-Courts-Bill.pdf. For the EU see the ADR Directive (Directive 2013/11/EU) and the ODR Regulation (Regulation 524/2013) of 24 May 2013. For interesting developments in China see ‘Chinese judicial justice on the cloud: a future call or a Pandora's box? An analysis of the “intelligent court system” of China’ (2017) 26(1) Information & Communications Technology Law 59.

15 Ministry of Justice Transforming our Justice System: Summary of Reforms and Consultation (2016) pp 3–5.

16 See Marks, A What is a Court? (London: Justice, 2016); Donoghue, J ‘The rise of digital justice: courtroom technology, public participation and access to justice’ (2017) 80(6) Modern Law Review 995.

17 House of Commons Committee of Public Accounts Transforming Courts and Tribunals (HC 2017–19, 976); National Audit Office Early Progress in Transforming Courts and Tribunals (2018); European Commission for the Efficiency of Justice European Judicial Systems: Efficiency and Quality of Justice (2016) Council of Europe; Commission The 2017 EU Justice Scoreboard COM (2017) 167 final.

18 The Law Society ‘Technology and the Law Policy Commission – algorithms in the justice system’ (2018), available at http://www.lawsociety.org.uk/policy-campaigns/articles/public-policy-technology-and-law-commission/ (last accessed 27 May 2019).

19 See Lodder, A and Zeleznikow, J Enhanced Dispute Resolution Through the Use of Information Technology (Cambridge: Cambridge University Press, 2010).

20 Katsh, E and Rabinovich-Einy, O Digital Justice: Technology and the Internet of Disputes (Oxford: Oxford University Press, 2017) pp 33–34. Also, as J Zeleznikow points out, such technology is particularly appropriate in view of the increase in self-represented litigants: see ‘Can artificial intelligence and online dispute resolution enhance efficiency and effectiveness in courts’ (2017) 8(2) International Journal for Court Administration 30.

21 See for example Wahab, M et al (eds) Online Dispute Resolution: Theory and Practice: A Treatise on Technology and Dispute Resolution (The Hague: Eleven Publishing, 2011); Civil Justice Council, Online Dispute Resolution Advisory Group Online Dispute Resolution for Low Value Civil Claims (2015), available at https://www.judiciary.gov.uk/wp-content/uploads/2015/02/Online-Dispute-Resolution-Final-Web-Version1.pdf (last accessed 27 May 2019); and Duchateau, M et al (eds) Evolution in Dispute Resolution: From Adjudication to ADR? (The Hague: Eleven Publishing, 2016).

24 There are interesting experiments with crowdsourcing variations, some of which use randomly selected volunteer jurors to adjudicate disputes: see Van Den Herik, J and Dimov, D ‘Towards crowdsourced online dispute resolution’ (2012) 7 Journal of International Comparative Law and Technology 99.

25 Uitelkaar.nl has been developed as a successor to the pioneering Rechtwijzer system. See further https://uitelkaar.nl.

26 For information on the online divorce process see https://www.gov.uk/apply-for-divorce; for information on online claims, see https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/715684/MCOL_Userguide_for_Claimants_May_2018.pdf. Information on the Traffic Penalty Tribunal is available at https://www.trafficpenaltytribunal.gov.uk/. See also Beames, E ‘Technology-based legal document generation services and the regulation of legal practice in Australia’ (2017) 42(4) Alternative Law Journal 297.

27 Ministry of Justice, see above n 15, pp 12, 19.

29 Susskind, above n 3, p 109.

30 Donoghue, J ‘The rise of digital justice: courtroom technology, public participation and access to justice’ (2017) 80(6) Modern Law Review 995.

31 Above n 20.

32 As described by, for example, Gulliver, P ‘Negotiations as a mode of dispute settlement: towards a general model’ (1973) 7(4) Law & Society Review 667; Shapiro, M Courts: A Comparative and Political Analysis (Chicago: University of Chicago Press, 1986); and Roberts, S and Palmer, M Disputes Processes: ADR and the Primary Forms of Decision-Making (Cambridge: Cambridge University Press, 2nd edn, 2005).

33 See further Roberts and Palmer, above n 32; Fuller, L ‘Mediation – its forms and functions’ (1970) 44 Southern California Law Review 305; Mustill, Lord ‘Arbitration: history and background’ (1989) 6 Journal of International Arbitration 43; Fuller, L and Winston, K ‘The forms and limits of adjudication’ (1978) 92(2) Harvard Law Review 353.

34 Ministry of Justice, above n 15.

35 J Tomlinson ‘The policy and politics of building tribunals for a digital age: how “design thinking” is shaping the future of the public law system’ UK Const Law Blog (21 July 2017), available at https://ukconstitutionallaw.org/ (last accessed 27 May 2019).

36 Indeed, a report by the Online Dispute Resolution Advisory Group states that while it envisages AI carrying out various tasks in the future, such as legal diagnosis, facilitation of negotiation without direct human involvement, and acting as ‘intelligent assistants’ for judges, at no point does it propose that those judges be replaced – meaning that final binding resolutions and decisions remain in human hands: Online Dispute Resolution for Low Value Civil Claims (Civil Justice Council, 2015), available at https://www.judiciary.uk/wp-content/uploads/2015/02/Online-Dispute-Resolution-Final-Web-Version1.pdf, pp 24–25.

37 For a useful general overview see Giuffrida, I et al ‘A legal perspective on the trials and tribulations of AI: how artificial intelligence, the internet of things, smart contracts, and other technologies will affect the law’ (2018) 68 Case Western Reserve Law Review 747.

39 Shapiro, above n 32, p 1.

40 Lord Justice Briggs Civil Courts Structure Review: Final Report, at Judiciary of England and Wales (July 2016), available at https://www.judiciary.uk/wp-content/uploads/2016/07/civil-courts-structure-review-final-report-jul-16-final-1.pdf.

41 The requirements for office are stated at https://www.trafficpenaltytribunal.gov.uk/our-adjudicators/.

42 Hildebrandt, M Smart Technologies and the End(s) of Law (Cheltenham: Edward Elgar, 2015) p 22.

43 See White House Fact Sheet: Launching the Data-Driven Justice Initiative: Disrupting the Cycle of Incarceration (2016), available at https://obamawhitehouse.archives.gov/the-press-office/2016/06/30/fact-sheet-launching-data-driven-justice-initiative-disrupting-cycle (last accessed 27 May 2019).

44 See E Holder ‘Attorney General Eric Holder speaks at the National Association of Criminal Defense Lawyers 57th Annual Meeting and 13th State Criminal Justice Network Conference: remarks prepared for delivery’, available at https://www.justice.gov/opa/speech/attorney-general-eric-holder-speaks-national-association-criminal-defense-lawyers-57th; Cullen, FT et al ‘Eight lessons from moneyball: the high cost of ignoring evidence-based corrections’ (2009) 4 Victims and Offenders 197; Kehl, D et al ‘Algorithms in the criminal justice system: assessing the use of risk assessments in sentencing’ (2017) Responsive Communities Initiative, Berkman Klein Center for Internet & Society, Harvard Law School 15.

45 M Beeman and A Wickman ‘The criminal justice coordinating council network mini-guide series: risk and needs assessment’ (Justice Management Institute, 2013) p 11; ‘Written evidence submitted by Durham Constabulary, Presented to House of Commons Science and Technology Committee Inquiry on Algorithms in decision making’ (2017), available at http://data.parliament.uk/writtenevidence/committeeevidence.svc/evidencedocument/science-and-technology-committee/algorithms-in-decisionmaking/written/69063.html.

46 Simon, J ‘The ideological effects of actuarial practices’ (1988) 22 Law & Society Review 771; O'Malley, P ‘Risk, power and crime prevention’ (2006) 21 Economy and Society 252; O'Malley, P ‘Experiments in risk and criminal justice’ (2008) 12 Theoretical Criminology 451; Harcourt, B Against Prediction: Profiling, Policing and Punishing in an Actuarial Age (Chicago: University of Chicago Press, 2006).

47 Northpointe ‘Practitioner's Guide to COMPAS Core’ (2015), available at https://epic.org/algorithmic-transparency/crim-justice/EPIC-16-06-23-WI-FOIA-201600805-COMPASPractionerGuide.pdf; S Urwin ‘Algorithmic forecasting of offender dangerousness for police custody officers: an assessment of accuracy for the Durham Constabulary model’ (2016), research presented for a Master's degree in Applied Criminology and Police Management at Cambridge University, available at https://www.crim.cam.ac.uk/global/docs/theses/sheena-urwin-thesis-12-12-2016.pdf/at_download/file.

48 Urwin, above n 47, p 102.

49 Northpointe, above n 47.

50 Coletta, C and Kitchin, R ‘Algorhythmic governance: regulating the “heartbeat” of a city using the internet of things’ (2017) 4 Big Data and Society 1.

51 Northpointe, above n 47, pp 22 and 49.

52 G Barnes and S Urwin ‘Written evidence submitted by Durham Constabulary, presented to House of Commons Science and Technology Committee Inquiry on algorithms in decision making’ (2017), available at http://data.parliament.uk/writtenevidence/committeeevidence.svc/evidencedocument/science-and-technology-committee/algorithms-in-decisionmaking/written/69063.html.

53 Ibid; Durham Constabulary, above n 45.

54 J Angwin et al ‘Machine bias: there's software used across the country to predict future criminals. And it's biased against blacks’ (2016), available at https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing; W Dieterich et al ‘Compas risk scales: demonstrating accuracy equity and predictive parity’ (2016), available at http://go.volarisgroup.com/rs/430-MBX989/images/ProPublica_Commentary_Final_070616.pdf; Northpointe, above n 47.

55 R Berk et al ‘Fairness in criminal justice risk assessments: The state of the art’ (2017), available at https://arxiv.org/abs/1703.09207, pp 18, 29; J Kleinberg et al ‘Inherent trade-offs in the fair determination of risk scores’ (2017), available at https://arxiv.org/pdf/1609.05807.pdf.
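
For readers unfamiliar with the result referred to in this note, the following informal statement (in our own notation, which is an assumption rather than the cited authors’) captures the trade-off that Kleinberg et al identify between three intuitive fairness conditions for a risk score.

```latex
% Informal statement, in our own notation, of the trade-off discussed in the
% sources cited in this note. Let s(x) \in [0,1] be a risk score, y \in \{0,1\}
% the true outcome, and g a protected group attribute.
\begin{align*}
\text{Calibration within groups:} \quad & \Pr\big(y = 1 \mid s(x) = p,\, g\big) = p \\
\text{Balance for the negative class:} \quad & \mathbb{E}\big[s(x) \mid y = 0,\, g\big] \ \text{equal across groups} \\
\text{Balance for the positive class:} \quad & \mathbb{E}\big[s(x) \mid y = 1,\, g\big] \ \text{equal across groups}
\end{align*}
% Kleinberg et al show that all three can hold simultaneously only in the degenerate
% cases of equal base rates \Pr(y = 1 \mid g) across groups or perfect prediction.
```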

56 Urwin, above n 47, pp 71–73.

57 Kleinberg, J et al ‘Human decisions and machine predictions’ (2018) 133 The Quarterly Journal of Economics 237.

58 See for example H Prakken's work on the use of Bayesian analyses of complex criminal cases as argumentation support in deciding on the probability of guilt given the available evidence: ‘A new use case for argumentation support tools: supporting discussions of Bayesian analyses of complex criminal cases’ (2018) Artificial Intelligence and Law 1.
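
The basic update that such argumentation-support tools help parties to discuss can be written as follows; the numbers in the worked example are our own invented illustration, not figures from Prakken's case studies.

```latex
% G = 'the defendant is guilty', E = 'this item of evidence is observed'.
\[
  \Pr(G \mid E) \;=\; \frac{\Pr(E \mid G)\,\Pr(G)}{\Pr(E \mid G)\,\Pr(G) + \Pr(E \mid \neg G)\,\Pr(\neg G)}
\]
% Worked example with invented numbers: if \Pr(G) = 0.1, \Pr(E \mid G) = 0.9 and
% \Pr(E \mid \neg G) = 0.01, then \Pr(G \mid E) = 0.09 / (0.09 + 0.009) \approx 0.91.
% The contested step in practice is the assignment of these probabilities, which is
% precisely what argumentation tools are meant to make explicit and discussable.
```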

59 Susskind, above n 3, p 121.

60 Alarie, B et al ‘How artificial intelligence will affect the practice of law’ (2018) 68(1) University of Toronto Law Journal 108. Cf Sunstein, C et al ‘Symposium: legal reasoning and artificial intelligence: how computers think like lawyers’ (2001) 8 University of Chicago Law School Roundtable 19 ff.

61 Ashley, above n 6. See also K Atkinson et al ‘Towards artificial argumentation’ (2017) 38(3) AI Magazine 25, available at http://discovery.ucl.ac.uk/10026121/1/aimag17.pdf.

62 See for example Ross Intelligence, a cloud-based question-and-answer service developed from IBM Watson, which accepts plain English questions and offers answers based on legislation, case law and other sources: https://rossintelligence.com (last accessed 27 May 2019). See also the discussion in Rosenfeld, A and Kraus, S Predicting Human Decision-Making: From Prediction to Action (San Rafael: Morgan & Claypool, 2018).

63 Ashley, above n 6, p 31.

64 See Susskind, R The End of Lawyers? Rethinking the Nature of Legal Services (Oxford: Oxford University Press, 2010).

65 R Cranston, ‘What do Courts Do?’ (1986) 5 Civil Justice Quarterly 123 fn 2.

66 Above n 33, at 368. See also Dan-Cohen, M ‘Bureaucratic organizations and the theory of adjudication’ (1985) 85 Columbia Law Review 1; and Harlow, C and Rawlings, R ‘Proceduralism and automation: challenges to the values of administrative law’ in Fisher, E et al (eds) The Foundations and Future of Public Law (in honour of Paul Craig) (Oxford: Oxford University Press, 2019).

67 See for example Samuel, G The Foundations of Legal Reasoning (Antwerp: Maklu Uitgevers/Blackstone, 1994); Alexander, L and Sherwin, E Demystifying Legal Reasoning (Cambridge: Cambridge University Press, 2008); or the approach offered by Alexy, R A Theory of Legal Argumentation: The Theory of Rational Discourse as Theory of Legal Justification (Oxford: Oxford University Press, 2009); or the overview of such approaches provided by Feteris, E Fundamentals of Legal Argumentation: A Survey of Theories of Justification of Judicial Decisions (Argumentation Library Vol 1) (Netherlands: Kluwer, 2nd edn, 2017).

68 See for example Kahneman, D et al Judgment under Uncertainty: Heuristics and Biases (Cambridge: Cambridge University Press, 1982) for an account within this general approach of the complexity of the judging process, which stands very much in contrast to many of the AI theorists’ more simplistic understandings.

69 See the critical account of this offered by Leith, P in Formalism in AI and Computer Science (London: Wiley, 1987).

70 See for example Bench-Capon, T and Dunne, P ‘Argumentation in artificial intelligence’ (2007) 171 Artificial Intelligence 619; Simari, G and Rahwan, I (eds) Argumentation in Artificial Intelligence (Boston, MA: Springer, 2009); the continuing work of the International Association for Artificial Intelligence and Law at http://www.iaail.org (last accessed 27 May 2019); and the journal Artificial Intelligence and Law from the University of Pittsburgh, published by Springer, which now runs to 26 volumes.

71 See for example Atkinson, K and Bench-Capon, T ‘Taking account of the actions of others in value-based reasoning’ (2018) 254 Artificial Intelligence 1; Bench-Capon, T ‘Persuasion in practical argument using value-based argumentation frameworks’ (2003) 13 Journal of Logic and Computation 429; and Bench-Capon, T and Modgil, S ‘Norms and value based reasoning: justifying compliance and violation’ (2017) 25 Artificial Intelligence and Law 29.

72 It is noteworthy that there is not a huge amount of socio-legal work on judges and their everyday activities. Much of what is known about the judiciary is focused on the USA. See Maveety, N (ed) The Pioneers of Judicial Behavior (Ann Arbor: University of Michigan Press, 2002) or, from a different, insider perspective, Posner, R How Judges Think (Cambridge, Mass: Harvard University Press, 2008). In the UK the socio-legal focus has been mainly on the most senior courts: see eg Paterson, A Final Judgment: The Last Law Lords and the Supreme Court (Oxford: Hart Publishing, 2013); or even on particular aspects of their work: eg Dickson, B Human Rights and the United Kingdom Supreme Court (Oxford: Oxford University Press, 2013); or Gee, G et al The Politics of Judicial Independence in the UK's Changing Constitution (Cambridge: Cambridge University Press, 2015). There is more limited work on the everyday role of the judge. For a relatively rare example see Darbyshire, P Sitting in Judgment: The Working Lives of Judges (Oxford: Hart Publishing, 2011); and Thomas, C and Genn, H Understanding Tribunal Decision-Making (London: Nuffield, 2013).

73 See further Cahill-O'Callaghan, R ‘Reframing the judicial diversity debate: personal values and tacit diversity’ (2013) 35 Legal Studies 1; and Gee, G and Rackley, E (eds) Debating Judicial Appointments in an Age of Diversity (Abingdon: Routledge, 2018).

74 This idea of the proper place for participation, and how it should be realised, is considered in a series of cases looking at the adequacy of government public consultation procedures, where the courts have examined how different process elements drawn from court procedures – from a right to a hearing, to rights to know the reasons for a decision, to have time for consideration and response, etc – relate to fairness, and indeed to wider issues of democracy and dignity. See further Morison, J ‘Citizen participation: a critical look at the democratic adequacy of government consultations’ (2017) 37 Oxford Journal of Legal Studies 636.

75 R Simmons ‘Big data, machine judges, and the legitimacy of the criminal justice system’ (2018) Ohio State Legal Studies Working Paper No 442, available at https://ssrn.com/abstract=3156510.

76 See further Fuchs, D ‘The dangers of human-like bias in machine-learning algorithms’ (2018) 2(1) Missouri S&T's Peer to Peer, available at http://scholarsmine.mst.edu/peer2peer/vol2/iss1/1.

77 See further Shapiro, M ‘The giving reasons requirement’ (1992) U Chi Legal F 197; Schauer, F ‘Giving reasons’ (1994) 47 Stanford Law Review 633.

78 On issues of transparency generally in AI, and the regulatory challenges that this throws up, see C Reed ‘How should we regulate AI?’ (2018) 376 Phil Trans R Soc A; and House of Lords Select Committee on Artificial Intelligence AI in the UK: Ready, Willing and Able? Report of Session 2017–19 (16 April 2018) HL Paper 100, available at https://publications.parliament.uk/pa/ld201719/ldselect/ldai/100/100.pdf.

79 See further Sourdin, T and Cornes, R ‘Do judges need to be human? The implications of technology for responsive judging’ in Sourdin, T and Zariski, A (eds) The Responsive Judge (Ius Gentium: Comparative Perspectives on Law and Justice, Vol 67) (Singapore: Springer, 2018); and T Sourdin ‘Judge v robot? Artificial intelligence and judicial decision making’ (2018, forthcoming) 41(4) U New South Wales Law J.

80 Summers, R ‘Evaluating and improving legal processes – a plea for process values’ (1974) 60 Cornell Law Review 3.

81 R (on the application of Unison) v Lord Chancellor [2017] UKSC 51.

82 Morison, J and Leith, P The Barrister's World and the Nature of Law (Buckingham: Open University Press, 1992). See also Leith, P and Morison, J ‘Can jurisprudence without empiricism ever be a science?’ in Coyle, S and Pavlakos, G (eds) Jurisprudence or Legal Science? (Oxford: Hart Publishing, 2005) pp 147–168.

83 Morison, J ‘What makes an important case? An agenda for research’ (2012) 12(4) Legal Information Management 251.

84 Ross, above n 8, p 106.

85 Golder and Fitzpatrick, above n 11, p 79.

86 McLuhan, M Understanding Media: The Extensions of Man (New York: McGraw Hill, 1964) p 223.

87 Hunt and Wickham, above n 11; Golder and Fitzpatrick, above n 11, p 2.

88 Durham Constabulary, above n 45.

89 Aletras, N et al ‘Predicting judicial decisions of the European Court of Human Rights: a natural language processing perspective’ (2016) PeerJ Computer Science, DOI 10.7717/peerj-cs.93; Katz, DM et al ‘A general approach for predicting the behaviour of the Supreme Court of the United States’ (2017) 12(4) PLoS ONE 1.

90 While a speculative prospect, triage in this sense would refer to a situation where applications to the court are assessed algorithmically on the basis of previous jurisprudence, in order to determine the likely outcome of the case, and are then sorted appropriately (reject or accept) prior to human examination: see https://www.legaltechdesign.com/LegalDesignToolbox/product-typology/triage/ (last accessed 27 May 2019). Alternatively, it could allow applicants to submit application details for analysis and be provided automatically with advice on next steps and likely outcomes. For further information on similar existing technologies, see information on Joshua Browder's DoNotPay app at J Porter ‘Robot lawyer DoNotPay now lets you “sue anyone” via an app’ (2018) The Verge, available at https://www.theverge.com/2018/10/10/17959874/donotpay-do-not-pay-robot-lawyer-ios-app-joshua-browder (last accessed 27 May 2019).
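
A hedged sketch of how such a triage step might work is given below: a text classifier trained on summaries of previously decided applications routes confident predictions provisionally and flags uncertain ones for human examination. The corpus, labels, threshold and pipeline are all illustrative assumptions of ours, not a description of any court's or vendor's actual system.

```python
# Illustrative sketch only: invented data and an assumed scikit-learn pipeline.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training corpus: summaries of past applications and their outcomes.
past_summaries = [
    "procedural deadline missed, no supporting documents",
    "clear statutory ground, full documentation supplied",
    "repeat application, no new evidence",
    "novel point of law, detailed evidence attached",
] * 10
past_outcomes = ["reject", "accept", "reject", "accept"] * 10  # one label per summary

model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(past_summaries, past_outcomes)

def triage(summary: str, review_band: float = 0.2) -> str:
    """Sort an incoming application: confident predictions are routed provisionally,
    uncertain ones are flagged for human examination."""
    accept_index = list(model.classes_).index("accept")
    p_accept = model.predict_proba([summary])[0][accept_index]
    if abs(p_accept - 0.5) < review_band:
        return "human review"
    return "provisional accept" if p_accept > 0.5 else "provisional reject"

print(triage("late filing and missing supporting documents"))
```

Whether a ‘provisional reject’ should ever be more than a prompt for human scrutiny is, of course, exactly the question the main text raises about semi-automated justice.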

91 For a fuller account of this see L Edwards and M Veale ‘Slave to the algorithm? Why a “right to an explanation” is probably not the remedy you are looking for’ (2017) 16 Duke Law & Technology Review 18; Data Protection Act 2018, s 14; Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (GDPR), Art 22.

92 GDPR, Art 22(3). See also Art 9(2)(e) and recitals 20 and 52.

93 See further Y Mehozay and E Fisher ‘The epistemology of algorithmic risk assessment and the path towards a non-penology penology’ (2018) Punishment & Society (https://doi.org/10.1177/1462474518802336).

94 Garland, D ‘“Governmentality” and the problem of crime: Foucault, criminology, sociology’ (1997) 1 Theoretical Criminology 173 at 181; Matzner, T ‘Opening black boxes is not enough – data-based surveillance in discipline and punish today’ (2017) 23 Foucault Studies 27 at 31; Fergus McNeill ‘Mass supervision, misrecognition and the “malopticon”’ (2018) Punishment & Society (https://doi.org/10.1177/1462474518755137).

95 Ibid; Elden, S ‘Plague, panopticon, police’ (2003) 3 Surveillance and Society 240.

96 Feeley, MM and Simon, J ‘The new penology: notes on the emerging strategy of corrections and its implications’ (1992) 30 Criminology 449 at 451.

97 Krasmann, S ‘Imagining Foucault: on the digital subject and “visual citizenship”’ (2017) 23 Foucault Studies 10 at 18.

98 Northpointe, see above n 47, pp 22, 49; Durham Constabulary, see above, n 45.

99 R Binns et al ‘It's reducing a human being to a percentage: perceptions of justice in algorithmic decisions’ (2018), available at https://arxiv.org/abs/1801.10408.

100 Golder and Fitzpatrick, above n 11.

101 Golder and Fitzpatrick, above n 11.