Book contents
- Frontmatter
- Dedication
- Contents
- About the Author
- Acknowledgements
- One Introduction: Trust Issues
- Two Trustification: Extracting Legitimacy
- Three State: Measuring Authority
- Four Corporate: Managing Risk
- Five Research: Setting Terms
- Six Media: Telling Stories
- Seven Case Study: COVID-19 Tracing Apps
- Eight Case Study: Tech for Good
- Nine Case Study: Trusting Faces
- Ten Conclusion: False Trade-Offs
- References
- Index
Two - Trustification: Extracting Legitimacy
Published online by Cambridge University Press: 24 January 2024
Summary
In this chapter, I outline the conceptual framework of trustification, a way of describing how legitimacy is extracted from populations through processes of quantification, creating the conditions of trust without the underlying social relations of trustworthiness. This is set out from the relation between technology and trust, through the processes at work, to the role of proxy variables that stand in for trust in legitimizing technology discourses. The socially constructed drive to technology, particularly the drive to AI or the drive to data, does not accept the interplay of trust and mistrust. It does not accept any challenge to objectivity narratives, any hint that problems are not ‘solvable’ (in principle or through technology). Such ambiguity does not fit within the discourses that surround the power and legitimacy of technology and those who design, build, sell or use it. Such a challenge to the absolute faith in technology as a force for good for individuals and society is not permitted by those discourses.
A nuanced, political, socially engaged, fluid approach to trust and mistrust is simply outside mainstream technology discourses because it challenges the power structures and inequalities that such narratives sustain. These narratives no longer seek to build trust, but to create the impression of it, to perform the conditions of it. Or, more bluntly, to extract it. This process of performative extraction between quantification and discursive power I call trustification.
Trusting technology
People are asked every day to trust technology. This is in part trusting it to function as intended, which already involves not only expectations of operational accuracy (trust in the actions of the technology) but also expectations of intent (trust in both the intentions and actions of the designers or users of the technology). When we situate this operational form of trust across the sociotechnical assemblage that constitutes a given technology, we acknowledge other forms of trust as well. We are not only being asked to trust specific instances of specific technologies. We are not only being asked to trust the sprawling network of constituent technologies, materials and processes that make a specific technology. We are also being asked to trust in ‘technology’ as a broader concept and discourse, an organizing principle of society.
- Type: Chapter
- Information: Mistrust Issues: How Technology Discourses Quantify, Extract and Legitimize Inequalities, pp. 14-36
- Publisher: Bristol University Press
- Print publication year: 2023