Book contents
- Frontmatter
- Dedication
- Contents
- About the Author
- Acknowledgements
- One Introduction: Trust Issues
- Two Trustification: Extracting Legitimacy
- Three State: Measuring Authority
- Four Corporate: Managing Risk
- Five Research: Setting Terms
- Six Media: Telling Stories
- Seven Case Study: COVID-19 Tracing Apps
- Eight Case Study: Tech for Good
- Nine Case Study: Trusting Faces
- Ten Conclusion: False Trade-Offs
- References
- Index
One - Introduction: Trust Issues
Published online by Cambridge University Press: 24 January 2024
Summary
Technology suffers serious trust issues. Between citizens and governments, users and platforms, governments and tech companies, the media and everyone, mistrust is rife. With good reason. And yet, we still act as if we trust technologies, the people who make them, and the people who use them on us. Every day we still use untrustworthy technologies designed by untrustworthy companies or have them used on us by untrustworthy organizations. We act as if we trust, even when we do not. We are expected to perform the conditions of trust even as trust crumbles with every new leak or revelation about the problems caused by unjust uses of technology. The problem has become not only a lack of trust, but that the prevailing discourses and power structures that surround technology have defanged the potential of mistrust to create change. How is it that the conditions of trust are created even in its absence? How is it that the legitimacy that comes from trust is extracted from people and populations?
The ways we talk about technology emphasize the need for trust. This might be to encourage the adoption of new technologies, to justify expanding the use of existing technologies, to support further research, or to back up (or counter) regulation. This book details how trust is quantified and extracted from populations. This process – which I call trustification – works in a similar way to how consent is extracted from individuals when they click ‘accept’ on a cookie popup notification. It is a sleight of hand that legitimizes not only specific uses of technology but the wider set of power relations in which they are developed and deployed, often against minoritized groups. The stakes of trustification are high, extracting legitimacy across local and global scales, and exacerbating existing inequalities in the process.
I do not use trustification in the financial–legal sense, but in relation to issues of quantification, datafication and the other processes of technologizing society. By operationalizing trust, converting it into a metric to be achieved, trustification extracts legitimacy from people to further the aims of using technology to manage society. But trustification is also about discourses, the stories we tell about technology and the way trust is used by and for power. These stories themselves legitimize the framing of trust as a quantifiable and solvable problem.
- Type: Chapter
- Information: Mistrust Issues: How Technology Discourses Quantify, Extract and Legitimize Inequalities, pp. 1–13
- Publisher: Bristol University Press
- Print publication year: 2023