Book contents
- Frontmatter
- Dedication
- Contents
- About the Author
- Acknowledgements
- One Introduction: Trust Issues
- Two Trustification: Extracting Legitimacy
- Three State: Measuring Authority
- Four Corporate: Managing Risk
- Five Research: Setting Terms
- Six Media: Telling Stories
- Seven Case Study: COVID-19 Tracing Apps
- Eight Case Study: Tech for Good
- Nine Case Study: Trusting Faces
- Ten Conclusion: False Trade-Offs
- References
- Index
Four - Corporate: Managing Risk
Published online by Cambridge University Press: 24 January 2024
Summary
Here, I build on the previous chapter's discussion of trustification under state logics of power by focusing on corporate power. The companies that design and manage technologies wield inordinate power, with the ability to embed certain values and priorities. This ‘governance-by-design’ (Mulligan and Bamberger, 2018) risks bypassing state and public forms of governance through the ways technologies are developed to shape ever more areas of everyday life. To justify this power, and to keep it in the hands of tech companies, trustification is used in ways that differ from state logics, grounded instead in corporate narratives and logics of power.
Corporations engage in games of legitimacy against states and societies. Companies and states perform certain roles in relation to one another and constitute their power within such relations: as regulator; as provider of essential infrastructure or services; as investor; as problem to be dealt with or avoided. To accomplish this, each must learn to see (and speak) as the other. As Félix Tréguer (2019: 148–9) has identified, different constraints affect whether a company resists or cooperates with state surveillance, particularly following the Snowden mass surveillance leaks. A strong concern for user trust, and competitive pressure, weigh towards resistance, while holding (or seeking) government contracts weighs towards cooperation.
The games come down, in part, to where a corporation seeks to extract legitimacy and how it manages potential risks. For example, pursuing a law-enforcement contract for facial recognition algorithms may make Google more inclined to share people's search data with that same agency. The straight extraction of data from individuals undermines the idea of any kind of transaction between corporations and their users, consumers or workers (Lyon, 2019: 67). Instead, data is exchanged between corporations (and states): games played at levels of power far beyond the populations the data is about.
The imposition of corporate discourses over several decades has embedded economic risk-based metrics into state thinking. The state also sees like big tech; the audience to corporate performances turns into a performer of those same discourses. For example, state capitalism ‘now appears to deploy its authority less to consolidate its political rule than to reduce its market risk’ (Gonzaga, 2021: 448).
- Type: Chapter
- Book: Mistrust Issues: How Technology Discourses Quantify, Extract and Legitimize Inequalities, pp. 55–72
- Publisher: Bristol University Press
- Print publication year: 2023