
Stephanie Hare

Published online by Cambridge University Press: 11 August 2023


Abstract

Author, broadcaster and researcher Stephanie Hare's presentation on technology ethics was one of the most talked-about events at last year's BIALL conference, a thought-provoking talk that left many wanting more. With this in mind, Mike and Jas Breslin sat down with Stephanie to talk about the fascinating and very readable book it was based on, her thoughts on why technology ethics is especially important to information professionals, and whether she believes AI will ever do law librarians out of a job.

Type
The Interview
Copyright
Copyright © The Author(s), 2023. Published by British and Irish Association of Law Librarians


It's quite a funny thought, but it probably never occurred to the Wright brothers that in inventing powered flight they also inadvertently invented plane crashes. That's a joke, of course, yet within it there is the germ of a very serious and quite profound idea, especially in this rapidly evolving digital age, which centres around this one question: “How can we create and use technology to maximise benefits and minimise harms?”

The quote above is from Technology Is Not Neutral: A Short Guide to Technology Ethics by Stephanie Hare, whose well-received presentation at last year's BIALL Conference was based on this critically acclaimed and very readable book. Stephanie is a researcher, broadcaster and author from the US who now lives in the UK. She has a background as an historian, but for many years has been researching the relatively new discipline of technology ethics.

But just what is technology ethics? “Technology ethics is an interdisciplinary pursuit,” Stephanie says. “You can do technology without doing ethics, and you can do ethics without ever being interested in technology. And those are both fascinating fields. Magic happens, in my humble opinion, when you do them together, because you're drawing on two very different disciplines, and then you're creating something new.

“Technology ethics concerns all of us, but we are often not sure how to do it or how to know if what we are doing is working,” Stephanie adds. “It's so relevant to everyone on the planet, no matter what they're doing, where they're doing it, if they're creating it, using it, investing in it, banning it, or regulating it. If you're a cop, if you're an activist, if you're a journalist, teacher, student …”

Technology ethics is also incredibly dynamic, says Stephanie: “It's a series of questions and answers that never really goes away because technology is always changing. Also, humans’ ideas about values, such as ‘What do we think is right or wrong, good or bad? What is neutral?’ are always changing, too, both over time, but also depending on where you are. And then individuals are going to evolve over time, too. So, what you think about a tool or a technology might look really weird 10 years from now.”

But Stephanie warns that this is not the discipline for those who are after easy answers. “If you're the kind of person who thinks, ‘I just want the silver bullet and I don't want to think about it’, then technology ethics is not for you,” she says. “If you want something that is adaptable and flexible through which you can draw on and be enriched by thousands of years of scholarship, then it's great, because if you're ever stuck, you have two of the world's deepest disciplines to draw on – philosophy and technology. I get inspired by it every day. I love it.”

Stephanie Hare's critically acclaimed book is an ideal introduction to the field of technology ethics – a discipline that should be of interest to all information professionals

I-SPY

The main thrust of Stephanie's book, as the title suggests, is the neutrality of technology. Is technology just a tool, or do its creators and users bear responsibility for how it is used and for the consequences of that use? This was, she says, something she first began to think deeply about in 2013, when classified information from the US Government's National Security Agency (NSA) was leaked. “My personal consciousness-raising moment was Edward Snowden's revelations about the role of US companies in helping the US government spy, allegedly in order to keep the country safe, not just on non-Americans but on Americans. Because I'm from the United States, that felt to me like a line had been crossed … Maybe I was naive, but I didn't realise how much companies were complicit in the US security apparatus until these revelations.

“In retrospect, I guess I was aware of tapping for law enforcement, but it never occurred to me that Yahoo! or Google or Facebook were going to be tapped, and also that their executives were going to be threatened by the US government,” Stephanie adds. “That was the other thing: I didn't realise that they had been basically told, ‘You have to go along with this, and if you even speak about it publicly, much less refuse, we can throw the book at you’. So, if you're just running a company, the US government could come up and do this to you and strong-arm you, and you have few rights, as an executive and as a citizen, to say, ‘I don't want my company to go along with this’.

“Most people, they start a company and they're doing it because they want to sell a product or service, and the next thing you know, you actually need an entire policy and a legal team to help you understand your obligations and what you would have to communicate to your customers, to say things like ‘You're posting pictures about your family on social media and you're building a social graph, but what you need to know is we're going to use this data and we're going to traffic it with third party advertisers and researchers, but we're also going to be sharing it with the government’. And again, a lot of people reply, ‘Nothing to hide, nothing to fear’, but we have learned how absolutely bogus that statement is, right? So that was my awakening as an American, just seeing that.”

Because so many of the technology companies whose products and services are used around the world are based in the US, this is certainly not just an American issue. “All these non-Americans are being sucked into this as well,” Stephanie says. She explored this in an article that became a case study for the Harvard Business Review.Footnote 1 But rather than putting the subject to rest, it only made her more curious. “I got really into thinking about civil liberties and human rights, as well as what businesses like to think, which is, ‘We don't do politics, we're just here to make money’. I cannot tell you the number of people I've had to hear that from, the conversations that we've had to have. I developed the arguments that are now in my book by having those arguments with my colleagues, where I was saying, ‘No, you're doing politics every day. Even saying “I'm neutral, I'm not getting involved in that” is a political statement’.”

There is not the space here to go into all the examples of how technology has been shown not to be so neutral, but one of the two big case studies Stephanie conducts in the book provides a good illustration: the use of facial recognition in security and policing, and especially its poor track record of misidentifying people with darker skin. In addition to causing unnecessary distress, this can lead to wrongful arrest, and indeed has already done so several times in the United States: clearly a situation where the use of the technology is harmful.
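Headline accuracy figures tend to hide exactly this kind of disparity; it only becomes visible when error rates are broken out by demographic group, which is how independent audits of facial recognition systems have surfaced the problem. As a minimal sketch of that idea (the function, group labels and data below are invented for illustration, not drawn from Stephanie's book), this computes a system's false match rate separately for each group:

```python
from collections import defaultdict

def false_match_rate_by_group(results):
    """Per-group false match rate over non-mated comparisons.

    `results` holds (group, system_declared_match, true_match) tuples;
    a false match is a declared match where no true match exists.
    """
    comparisons = defaultdict(int)
    false_matches = defaultdict(int)
    for group, declared, actual in results:
        if not actual:  # only non-mated pairs can yield a false match
            comparisons[group] += 1
            if declared:
                false_matches[group] += 1
    return {g: false_matches[g] / n for g, n in comparisons.items()}

# Invented audit data: same volume per group, very different error rates.
audit = [
    ("group_a", True, False), ("group_a", False, False), ("group_a", False, False),
    ("group_b", True, False), ("group_b", True, False), ("group_b", False, False),
]
print(false_match_rate_by_group(audit))  # group_a: 0.33..., group_b: 0.66...
```

A system could report a respectable average across the whole dataset while still misidentifying one group twice as often as another, which is precisely the pattern behind the wrongful arrests described above.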

Even the person making the chip for a circuit board that goes into a device should think about the ethical implications of the technology, says Stephanie Hare

INDIVIDUAL RESPONSIBILITY

It is the part that each and every one of us can play in this, though, that those who work with technology really need to read about and understand. As Stephanie explains, even the person making the chip that goes inside a device needs to think about their role in what that device is ultimately used for. Law librarians will need to be aware of this, with the added requirement of staying on top of all the new regulation that is coming down the line, and its many implications.

“Lawyers will be critical in terms of our understanding of human rights and civil liberties, but also the responsibilities of any data controller or processor,” Stephanie says. “We've got all these laws coming in. The European Union has the Digital Services Act and the Digital Markets Act. We've got the EU AI Act coming soon, and then there are all the EU-US data exchanges. There's the fact that now that we've Brexited, we've got our own ‘third nation’ situation that makes it complicated. We're going to need lawyers more than ever to help everybody understand this new world that we are building, and the responsibilities of it, and how data moves around the world, and how to protect it. That is key, as is understanding our rights. There's been lots of discussion about, ‘Do we need a new Digital Bill of Rights?’ But what does that even mean? So, if you're somebody who thinks that your data has been violated, who would you pick up the phone to call? It's probably going to be a lawyer.”

Which is where law librarians come in. “You all – librarians – are at the absolute cutting edge,” Stephanie says. “You're the gatekeepers of knowledge. So you'll need to be: ‘Here are the resources that everybody needs; this is the most recent scholarship. If you have problems, look here, look here, look here’. And I wanted to do some of that heavy lifting for you. So, if you go to the bibliography in the book, it's one short page and it says, ‘If you want something more complete, go to my website’. On my website is the link that takes you to the full 378-page online bibliography.Footnote 2 I've got a decent grasp, not perfect, but a decent grasp on some trends. We can't know them all, so it's about knowing where to go if we don't know. If I can help librarians have that access, that's fantastic. So that resource is there, if you want it.”

Ethics, of course, is not just about knowledge, it's also about action. It's not just a matter of understanding that something is wrong, it's knowing what to do about it, which is where this can get tricky. So how might all this work on a practical level? “So, if you want to launch a website, for example, and you're thinking: ‘What are my diversity, equity, inclusion and accessibility responsibilities?’ Well, luckily, the British government is amazing on that. And so in the very detailed bibliography on my website [are links to] all of the design thinking principles that you would follow to make sure that anything you're going to put out in the public is accessible to all people. It sounds like such a no-brainer, but it's not. It's really not. And educating everybody within your organisation about accessibility in your technology weirdly has a ripple effect into your analogue, physical, real-world thinking, too. There's a really beautiful symbiosis that starts to happen when you think about ethics, where you're thinking, ‘Okay, I'm fixing this one thing and it's making me notice that over there is a bit of a mess. How do we think about the ethics of that?’”
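To make that concrete, here is a minimal sketch of the kind of automated check that can sit alongside such design principles. It is a generic illustration rather than anything from the government guidance Stephanie mentions: using the beautifulsoup4 library, it flags images published without alternative text, one of the most common accessibility failures.

```python
from bs4 import BeautifulSoup  # pip install beautifulsoup4

def images_missing_alt_text(html: str) -> list[str]:
    """Return the src of every <img> whose alt attribute is absent or empty.

    Without alternative text, screen readers have nothing to announce,
    so the image is effectively invisible to many users.
    """
    soup = BeautifulSoup(html, "html.parser")
    return [
        img.get("src", "<no src>")
        for img in soup.find_all("img")
        if not (img.get("alt") or "").strip()
    ]

page = '<img src="logo.png"><img src="chart.png" alt="Citations per year, 2013-2023">'
print(images_missing_alt_text(page))  # ['logo.png']
```

Checks like this catch only the mechanical failures, of course; whether the alternative text is actually meaningful still needs the human judgement that the design guidance is there to develop.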

PHILOSOPHER'S TONE

Throughout the book Stephanie uses the toolkit of philosophy – six components of that discipline: metaphysics, epistemology, logic, political philosophy, aesthetics and ethics – to help get to grips with the issues at hand. A good example of this is asking what the purpose of someone's work is in the first place, and how that helps us to frame our responsibilities. “If you wanted to go to the metaphysics part: what is reality? What is our purpose here? What are we actually trying to do? And what is in scope and what is out of scope?” she says. “Getting really clear on your data responsibilities and purpose; that's the first step.”

To this end, Stephanie includes in the book a useful list of questions to ask yourself, compiled over her 25 years of working in technology and problem solving (Figure 1). Yet while there are ways of checking you're on the right path, with life in general and technology in particular things change, which means it's important to keep re-evaluating the different aspects of the checklist.

Figure 1
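One lightweight way to honour that need for constant re-evaluation is to treat the checklist as a living document with review dates, rather than a one-off exercise. The sketch below is a hypothetical illustration of that idea, not a method from the book: the first two questions are quoted from Stephanie's comments above, while the data structure and the 180-day review cadence are invented.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class ChecklistItem:
    question: str
    answer: str = ""
    last_reviewed: date | None = None

    def needs_review(self, max_age_days: int = 180) -> bool:
        """Flag items never reviewed, or reviewed too long ago."""
        if self.last_reviewed is None:
            return True
        return date.today() - self.last_reviewed > timedelta(days=max_age_days)

# Questions taken from the interview; answers and dates are placeholders.
checklist = [
    ChecklistItem("What are we actually trying to do?"),
    ChecklistItem("What is in scope and what is out of scope?",
                  answer="In scope: the new website rollout.",
                  last_reviewed=date(2023, 1, 15)),
]

for item in checklist:
    if item.needs_review():
        print(f"Re-evaluate: {item.question}")
```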

“What keeps me up at night, weirdly, is not the stuff that I already know is a mess or a problem,” Stephanie says. “I'm aware of that. That's on my radar. The thing that worries me the most in my work is the stuff that I know is going to be out there, but I don't know what it is yet, and I don't know how to find it.”

Covid is a good, if extreme, example of this. “There were all these people warning about a pandemic before it happened. And most governments treated it as a low-probability, high-impact event. Now we would say it was a high-probability, high-impact event. It was always going to happen. It was just a question of when.”

SPREADING THE WORD

Of course, while you might see the ethical implications of, for example, adopting a new technology, those within your team or, perhaps more importantly, others within the organisation or firm might not believe it's relevant. They might take a shorter-term view, or they might argue that technology is, in fact, neutral and nothing to do with the user. So how do you get the message across, and how do you get other people to change?

“It's culture specific, so you have to know your audience, what motivates people,” Stephanie says. “If you're in an organisation that's really obsessed with risk, you might want to go in with, ‘This is a risk, what happens if you don't do ethics?’ The bare minimum is, what are you doing to keep the law off your back, not get in trouble with the tax authorities, or whatever. Then we can go further. Everybody's got a mission statement. Why are we all here? In theory, everybody's applied to work there for a reason and it's part of a mission, and you have success metrics, and you can measure against them. So you can build this right into your pre-existing culture of values, your framework, your performance goals, your KPIs, etc. That's one way. Or you can do doom and fear: if we don't do this, then the competition is going to eat our lunch, or the law's going to be on our back.”

It is certainly not all about negative implications, though. “You might say, if we do this, we could create a new market, or we could capture more market share, because we have spotted this opportunity that very few other people are doing, or doing well. We think we could do it better. Let's go after it. I personally like to be in that space because I like practical optimism. Let's go after what's good. You can choose to always view the world through a doom and risk lens …[and] you have to protect your soft underbelly, of course, but you also want to be growing, and ethics is a growth space. There are more and more people who are interested in values, who want to know that what they're buying or participating in, in any way, shape or form, is aligned with how they want to live. I want to feel good about this hat that I'm buying, or this food that I'm eating, or these partners that I'm working with, these clients that I have. So I think there are a couple of levers that you can pull. But there's also: what kind of world do you want to build?”

Embracing technology ethics can also be empowering, argues Stephanie. “I write in the book that saying you're not interested in technology ethics makes you a cog in someone else's machine, because technology ethics is happening all around you. Are you going to be a pawn on somebody else's chess board? Or would you rather be moving those pieces and building the board yourself? And I know the position I personally would rather be in. Why would you give up your own agency and power when you don't have to? You can have a lot more power than you might think, so it's ultimately a very optimistic book, I hope. I think the challenge that we all face is that, even with the best of intentions, we can still build things that can have negative consequences.”

ARTIFICIAL INTELLIGENCE

AI is perhaps the one area where people regularly engage in technology ethics, if only through science fiction, and it is discussed at length in the book. One of the biggest fears with artificial intelligence is that machines will take jobs now done by people, even librarians. But Stephanie doubts that this will be the case. “I personally don't think that it's ever going to be a case of replacing humans,” she says. “No doubt our robot overlords will correct me on this in 2050! But I think it's always going to be humans working with technology in lockstep.

“I'm optimistic in terms of AI. I think for librarians and knowledge creators and curators, it will be your awesome ‘I will scream if you take it away from me’ tool that frees you up to do even higher value tasks. Your jobs might change and evolve, but I don't think they're going to go away, or that you'll be redundant. The more sophisticated our knowledge is becoming, the more we'll need human curation.”

There is also the added value that talking to a librarian brings. “Particularly where there's still a physical repository of knowledge. No matter how much it's been digitised, you have a feel for your own collection. So, people working with archives might say: ‘We've got some boxes over there that are not tagged with what you're looking for, but I think they are possibly still worth looking at’. And that often works. We don't always know what we're looking for. Sometimes it's like the lucky dip, and librarians have guided so much of how I think about knowledge … They'll remember something, or they'll call their colleague over, and there's institutional knowledge of how the collection's evolved. A machine can't do that. It just can't do that. And so that's okay too. That's what I mean: I think there's going to be room for both.”


Footnotes

1 Hare, Stephanie, ‘For Your Eyes Only: U.S. Technology Companies, Sovereign States, and the Battle Over Data Protection’, Business Horizons, Volume 59, Issue 5, September–October 2016, 549–561.

2 <www.harebrain.co/books> accessed 8 May 2023.
