10.1 Introduction
It’s accually obsene what you can find out about a person on the internet.Footnote 1
To some, this typo-ridden remark might sound banal. We know that our data drifts around online, with digital flotsam and jetsam washing up sporadically on different websites across the internet. Surveillance has been so normalized that, these days, many people aren’t distressed when their information appears in a Google search, even if they sometimes fret about their privacy in other settings.
But this remark is not a throwaway line by a disgruntled netizen. No. It’s a boast by a stalker, Liam Youens, who went online to find his victim, Amy Boyer. Youens traced Boyer after buying her work address from a data broker – a company that traffics information about people for profit. Youens documented his search for Boyer’s whereabouts on his personal website: “I found an internet site to do that, and to my surprize everything else under the Sun. Most importantly: her current employment.”Footnote 2 After he asked the broker for more information, he just had to bide his time. “I’m waiting for the results,” he wrote ominously, not long before shooting Boyer dead at work.Footnote 3
Data brokers fuel abuse by sharing people’s information and thwarting their obscurity. The value of obscurity, though sometimes overlooked in privacy discourse, rests on the idea that “information is safe – at least to some degree – when it is hard to obtain or understand.”Footnote 4 Brokers hinder obscurity by making it easier and likelier to find or fathom information about people. This act of foiling obscurity, in turn, facilitates interpersonal abuse. The physical violence suffered by Amy Boyer is but one kind of abuse; people also face stalking, harassment, doxing, defamation, fraud, sextortion, and nonconsensual sharing of their intimate images.Footnote 5
This chapter explores the phenomenon of brokered abuse: the ways that data brokerage enables and exacerbates interpersonal abuse. The harms of brokered abuse go beyond the fact that brokers make it easier to surveil people and expose them to physical, psychological, financial, and reputational harms. In addition, people must beg every single broker to conceal their information from thousands of separate databases, over and over again, with little or no legal recourse if brokers reject their efforts to regain some obscurity. Due partly to existing laws, this whack-a-mole burden of repeatedly pleading to obscure data can trigger trauma and distress. Only by grasping this fuller scope of brokered abuse can we begin to regulate it.Footnote 6
This chapter proceeds in three sections. Section 10.2 introduces the broker industry before Section 10.3 reveals how the law largely fails to address, and is even complicit in, key features of brokered abuse. Section 10.4 then explores the harms stemming from brokered abuse in order to lay some foundations for regulating them.
10.2 Data Brokers as Information Traffickers
Data brokerage is a multibillion-dollar industry.Footnote 7 Thousands of companies form a sprawling network of brokers that buy, sell, trade, and license gigabytes of human information. Though brokers’ business models vary, their power and profit fundamentally stem from trafficking information about people.Footnote 8
For the most part, brokers buy information from other companies and gather it from government records and public websites.Footnote 9 From there, brokers build profiles including data like a person’s name, aliases, photos, gender, birthdate, citizenship, religion, addresses, phone numbers, social-media accounts, email addresses, Social Security number, employers, schools, families, cohabitants, purchases, health conditions, and hobbies. These data dossiers are then sold for a fee or even shared for “free” thanks to the ads adorning broker websites.Footnote 10
There are, to be fair, some benefits tied to the broker industry.Footnote 11 Transparency and accessibility come from publicizing information online, including data drawn from public records. Journalists, activists, academics, and the general public can garner insights from this information.Footnote 12 Indeed, a person might even evade interpersonal abuse or other ills after discovering an acquaintance’s restraining order or criminal record through a broker. Though this kind of data is often accessible in other ways, a Google search is easier, faster, and cheaper than a trip to the county courthouse.Footnote 13
Some people also use brokers to locate heirs or reconnect with long-lost friends and family. Others might rely on brokered data to inform their hiring decisions. Some companies rely on brokers in order to collect debts or discover fraud, corroborating information given to them by a customer or client. And brokers can even assist the legal system, such as when class-action awards are being distributed. These perks cannot be ignored, but we should be wary of their value being exaggerated.
Another set of purported benefits relates to consumers, largely stemming from how businesses use brokered data. In particular, human information fuels the datasets and algorithms that help companies target ads and develop products. The resulting corporate revenue could, at least theoretically, yield cheaper or better services for consumers. I’m skeptical that this species of informational capitalism is in the public’s interest,Footnote 14 but debunking this defense of data brokerage is not essential. Even if the commercial benefits are substantial, we should not scoff at the serious harms tied to the broker industry.
Though there are many harmful facets of data brokerage, I’ll focus here on only one: how brokers enable and exacerbate interpersonal abuse. Most directly, brokers’ dossiers can be treasure troves for abusers, who can plunder them for information with just a few clicks and bucks. In Amy Boyer’s case, Youens paid a broker $45 for her Social Security number, $30 for her home address, and $109 for her work address.Footnote 15 These sums might already seem trifling given the vile result, but many brokers offer much more for far less. In 2013, for instance, a stalker bought Judge Timothy Corrigan’s home address for less than $2 and later shot bullets at his house, missing the judge’s head by a mere 1.6 inches.Footnote 16
These jarring anecdotes tell part of the story of how brokers enable and exacerbate abuse, but the phenomenon needs more interrogation to show its full scope. To do so, we must unpack how the law can be ineffective and even injurious when responding to brokered abuse.
10.3 The Law’s Role in Brokered Abuse
There are at least four common regulatory responses to brokered abuse: prohibiting abusive acts, mandating broker transparency, limiting data collection, and restricting data disclosure. Though each measure has some merit, none will suffice. Worse still, recent privacy laws can even inadvertently inflict psychological harms on people seeking to recover from abuse. Let us explore how.
10.3.1 Prohibiting Abusive Acts
Regulating abusive acts offers a path to reducing brokered abuse. If we target the underlying abuse, the thought goes, we needn’t regulate data brokerage. While this approach is attractive in theory and even viable in certain cases, it’s deficient for several reasons.
A host of laws directly regulate abuse, including criminal and tort liability for stalking, harassment, physical violence, doxing, privacy invasions, and voyeurism.Footnote 17 But even if these anti-abuse laws retroactively punish harmful acts or vindicate victims’ interests in some cases, the continued prevalence of abuse suggests that any prospective deterrence caused by the threat of liability is inadequate. Even when abuse is deterred, these laws do little to lessen people’s anxiety when their information is circulating online because they might lack confidence that any deterrence will hold.
To make matters worse, some anti-abuse laws can inadvertently increase people’s risks of abuse. For example, when liability depends on an entity intending or knowing that their actions will cause harm, brokers have an incentive to remain ignorant about how brokered data is being used. Consider California’s approach to protecting stalking victims who register with the state. A special anti-doxing law prohibits anyone, including brokers, from posting a registered victim’s home address, phone number, or image on the internet with the “intent” to “[t]hreaten” the victim or “[i]ncite” a third person to cause “imminent” bodily harm “where the third person is likely to commit this harm.”Footnote 18 With all these caveats, brokers can comfortably dodge liability by sharing data without asking questions.Footnote 19 Indeed, the standard data-brokerage business model – which relies on mass and indiscriminate data disclosures to anyone willing to pay – is incompatible with these kinds of scienter requirements because they implausibly suggest that brokers engage in case-specific deliberation or investigation before sharing data. And yet removing these caveats might pose a different problem because a law that broadly penalizes disclosing information might be vulnerable to constitutional challenges under the First Amendment.Footnote 20
Beyond substance, think practicalities. Anti-abuse laws often require a victim’s prolonged and active participation in pressing charges or filing lawsuits. There’s good reason to empower and involve victims in these legal processes, but the processes themselves can impose burdens that many victims are unable or unwilling to bear. Interacting with police, prosecutors, lawyers, and judges might dissuade some people, while some might also struggle practically or financially to bring civil claims – realities that disproportionately affect those who are already marginalized. Even setting aside these burdens, many people will fret about initiating matters of public record that could further jeopardize their obscurity and safety.Footnote 21
Finally, anti-abuse laws usually will not offer the obscurity remedies that some people will seek. Even if they do (or if such a remedy is eventually negotiated through settlement), legal proceedings move too slowly to address the exigent and immediate dangers that people face. To cap it all off, different brokers are constantly adding to their data stockpiles, so people would need to file new claims against new parties every time new information pops up online.
In short, laws prohibiting abusive acts fail to disturb essential features of brokered abuse. Some regulations might even aggravate matters by encouraging brokers to maintain ignorance when dishing out data, while other legal processes can be too burdensome, risky, or ineffective to be worth a victim’s while.
10.3.2 Mandating Broker Transparency
Another regulatory tool involves shedding light on data brokerage. While transparency laws can be helpful, they are ultimately insufficient. These laws come in different shapes and sizes, but we can distinguish two types based on their principal goals: administrative transparency that informs regulators and popular transparency that informs individuals. Each has value, but neither meaningfully abates brokered abuse.
Administrative transparency follows a two-step system to educate regulators about the broker industry. Brokers first register with a state agency, creating a list of brokers doing business in the jurisdiction, and then disclose details about their practices (such as where they obtain data and how they handle complaints). Vermont and California have such laws, and similar themes animate the Data Broker List Act introduced in Congress in 2021.Footnote 22
Popular transparency, by contrast, mainly informs individuals. California, for instance, has passed “right to know” laws that force brokers to reveal details they’d rather conceal: what data they have and whether they have shared it.Footnote 23 Similar laws might even oblige brokers to grant people no-cost access to data about themselves, rather than forcing them to pay a fee.
Administrative transparency can help regulators grasp the broker industry and inform future legislation, while popular transparency can help motivated people learn something new about their exposure with particular brokers. But neither approach helps a person facing urgent threats from their information appearing online. There’s also no guarantee that transparency will motivate further regulation; if anything, these milquetoast measures might sap political will from stronger proposals.Footnote 24 At best, then, transparency laws fiddle with some incentives underlying data brokerage. (Maybe brokers will disclose less data if they have to disclose how they are disclosing data?) At worst, these laws let brokers hide their harmful practices in plain sight while boasting about their regulatory compliance.
10.3.3 Limiting Data Collection
A third response to brokered abuse involves curtailing data collection. Again, there are promises and pitfalls to this approach. Laws of this ilk form a privacy mosaic for our information, but there are too many missing pieces to make a pleasing mural.
Longstanding regulations forbid obtaining data through deception, other laws bar intrusive surveillance like hacking, and various legal regimes give companies “gatekeeper rights” to deter data scraping from their websites.Footnote 25 These restrictions reach only a subset of brokers’ activities because gobs of data can be gathered without running afoul of any law.Footnote 26 Most importantly, many of these laws do not apply when brokers get data from public records or other publicly accessible sources.Footnote 27
More recently, a new vintage of data-privacy laws has unsettled the broker industry by prohibiting the nonconsensual collection of people’s information. But even these stricter rules often contain caveats that let brokers thrive. The California Consumer Privacy Act, for example, provides that the types of “personal information” protected by the law do not include “publicly available information or lawfully obtained, truthful information that is a matter of public concern” – an exception that covers vast troves of brokered data and endorses many broker practices that leave abuse victims vulnerable.Footnote 28
In light of these carveouts for publicly accessible information, one approach to limiting data collection focuses on the state’s role in furnishing brokered data.Footnote 29 Brokers sustain their services with information from public records like property deeds, voter rolls, and marriage licenses. To partially stem this flow, most states have confidentiality programs to allow abuse victims to conceal certain information from state documents.Footnote 30 On the plus side, these measures are unlikely to raise First Amendment red flags because nothing forces the government to collect (or publish) the kind of identifying information that most likely endangers people’s obscurity.Footnote 31 But while limiting government data collection (and publication) brings significant benefits, even the broadest restrictions are insufficient. Public records, after all, are but one source of human information. Most importantly, brokers can still buy data from other companies and gather it from other public websites. Tinkering with public records turns off the cold tap but leaves the hot water flowing.
10.3.4 Restricting Data Disclosure
A final way to tackle brokered abuse involves controlling data disclosure. This approach conceivably offers great potential for people seeking to stop brokers from publicizing their information online. But the devil is in the details. Many disclosure regulations either do little to thwart abuse or even harm people trying to protect themselves.
Some disclosure rules aren’t aimed specifically at either data or brokers, such as tort liability for publicly disclosing certain sensitive information,Footnote 32 while other recent proposals would make brokers pay for selling people’s data or ban them from sharing location and health information. Such constraints meddle with brokerage around the margins, but none brings fundamental reform and some plausibly exempt publicly accessible data.Footnote 33
Given these limitations, let us focus instead on modern laws providing rights to conceal or remove information from broker databases or websites. California offers a rare example in the United States of providing these legally mandated obscurity rights, so we’ll use it as a short case study to examine the virtues and vices of such a regulatory regime. Under the state’s general “right to opt-out,” all Californian consumers may direct businesses not to sell their personal information to third parties, meaning that the company must not disclose their data for profit once a person exercises their obscurity right.Footnote 34 But California law also goes one step further. Abuse victims who register with the state’s Safe at Home program have more expansive obscurity rights. Of particular note, brokers cannot knowingly display a victim’s phone number or home address on the internet. If a victim asserts their reasonable fear related to that information, a broker must conceal the data for four years and could face injunctions, court costs, and attorney’s fees for noncompliance. And if anyone, including a broker, displays or sells the information with intent to cause certain harms, victims may seek treble damages and receive a $4,000 fine per violation. To help implement the law’s protections, California provides an online opt-out form that victims can use to invoke their obscurity rights.Footnote 35
Though California’s goals are laudable, this innovative approach fails to grapple with the realities of abuse. Under these laws, Californians must engage in extensive “privacy self-management” because the state forces them to exercise obscurity rights on a company-by-company basis.Footnote 36 Even the Safe at Home opt-out process – which was presumably designed with abuse victims in mind – operates from this fragmented premise by requiring victims to approach brokers individually and submit forms to each one regularly. Brokers, after all, continuously replenish their stocks, and concealing some data does not stop other data from soon taking its place. Given these features of the broker industry, laws like California’s could actually entrench a disaggregated and detrimental obscurity process because brokers can seize on their legal compliance to justify not offering better services.
10.4 The Harms of Brokered Abuse
With this legal survey in mind, let us return to the matter of harms: How do brokers enable and exacerbate abuse? How is the law inadequate and complicit? And how might legal procedures even contribute toward a person’s suffering?
To answer these questions, I return to obscurity – a notion of privacy concerned with “the difficulty and probability of discovering or understanding information.”Footnote 37 As Woodrow Hartzog and Evan Selinger have observed, obscurity can be a “protective state” that serves valuable privacy-dependent goals like “autonomy, self-fulfillment, socialization, and relative freedom from the abuse of power.”Footnote 38 Understanding the full scope of brokered abuse requires parsing how data brokerage, including its surrounding legal constructs, undermines obscurity. As we’ll see, brokered abuse encompasses an array of intrinsic and extrinsic harms, all of which implicate a person’s obscurity.
10.4.1 Intrinsic Harms
Abuse. As an initial matter, brokers routinely create privacy losses by sharing people’s information. Though this core of brokerage is not intrinsically harmful, such privacy losses can engender privacy harms.Footnote 39 Some of these privacy harms, to use Ignacio Cofone’s terminology, are “consequential” because they are “external to privacy interests but occur as a consequence of privacy violations.”Footnote 40 Brokers facilitate the surveillance of victims and their kin by systematically sharing personal information. An abuser armed with brokered data can perpetrate a slew of “consequential” physical, emotional, economic, and reputational harms. This might not be a broker’s goal, but it’s certainly their role.
Risk. Beyond the direct harm of actually enabling abuse, brokers commit the kindred harm of increasing the risk of abuse by making it easier to surveil a person and their family, friends, or associates. This risk, in turn, can cause anxiety even if no abusive act ever occurs.Footnote 41 Without regulatory intervention, these threats will only grow as data proliferates and new technologies, like facial-recognition surveillance, further wreck obscurity.Footnote 42
Isolation. Brokers also rob people of agency to “control their visibility within public space.”Footnote 43 As Scott Skinner-Thompson has argued, digital and physical surveillance can cause forced publicity, which might then deter people from participating in public life.Footnote 44 This cycle, unsurprisingly, has unequal repercussions for those who are socially marginalized already – a special concern here because victims often hail from marginalized groups and because abuse can have ostracizing effects regardless of one’s preexisting social status and personal characteristics.Footnote 45 Data brokerage can intensify a victim’s isolation by foisting visibility on them, creating yet more reasons for them to retreat entirely from public spaces.
10.4.2 Extrinsic Harms
Some people respond to this trio of intrinsic harms – abuse, anxiety, and isolation – by trying to cull information from broker databases. Easier said than done. As we have seen, people must beg brokers to conceal their data with little guarantee of success, especially in jurisdictions where legal remedies are absent or incomplete. At best, people in places like California can contact every single broker separately to exercise their legal rights. The result? People facing physical and psychological peril must approach each broker individually over and over again. At a time of high vulnerability, this obscurity process creates a pair of extrinsic harms that are partly constructed by legal rules and procedures.
Annoyance. The first harm can be styled as annoyance, though it covers a range of unwanted emotions. Some people might reasonably feel indignant about having to demand their obscurity. (Imagine someone complaining: “It’s my data, not theirs, so they should have to ask me before using it! Why should I have to contact them?”) Others might resent spending time filling out forms or navigating brokers’ laborious and complex bureaucracies. Some people might feel exasperated at how futile it all seems, especially given that “grey holes” in privacy law might give brokers enough room to resist obscurity requests or refill their databases.Footnote 46 Absent some compelling justification, the law should not be complicit in cultivating negative reactions to exercising legal rights. Feeling indignant, resentful, or exasperated is both unpleasant and likely to dissuade people from enforcing their rights.
Trauma. Taking annoyance seriously is important to understanding the law’s failure to address brokered abuse. But to close this chapter, I want to stress something different and underappreciated. For abuse victims, an arduous and dispersed obscurity process can inflict a harm that goes beyond mere hassle or frustration. It’s more than a matter of transaction costs. It’s more even than a question of abuse and anxiety. Instead, it’s about trauma – and how the law’s failure to consider the role of trauma represents a failure of empathy toward victims of abuse.
The basic point is this: The process of preventing brokers from sharing information can trigger psychological harm by forcing victims to repeatedly revisit their abuse and recognize their vulnerability. A disaggregated and inefficient obscurity process might irritate some people, but the burden it can impose on victims is likely distinct and severe. In short, the obscurity process itself can be traumatic.
“Trauma is the experience and resulting aftermath of an extremely distressing event or series of events, such as disaster, violence, abuse, or other emotionally harmful experiences.”Footnote 47 Though further research is required to explore how trauma manifests in the context of brokered abuse, existing studies point to likely connections between abuse, trauma, and technology. For example, a recent interdisciplinary study by researchers working directly with victims in the Clinic to End Tech Abuse at Cornell University examines how people’s interactions with digital technologies can cause trauma in the context of interpersonal abuse. As the authors observe based on a series of actual case studies, people’s experiences with technology can “trigger existing trauma and even retraumatize a person,” such as “when something in one’s environment causes them to recall a traumatic experience, often with a recurrence of the emotional state during the original event.”Footnote 48 Based on my own experiences – personally as an abuse victim and professionally when speaking with other victims – this accurately describes how prevailing obscurity processes involving data brokers can trigger trauma.
Even the most expansive obscurity rights fail to grapple with this extrinsic harm. Indeed, these laws risk aggravating matters by enshrining a decentralized process into law. While current procedures might be annoying for someone who’s never faced abuse, for victims seeking obscurity they create an extra injury that might further discourage them from enforcing their legal rights. Legislators have failed to account for the dynamics of interpersonal abuse from a victim’s perspective. The law, it might be said, lacks empathy.
To compound matters, current processes to regain obscurity are often ineffective. Brokers can simply shun removal requests in the forty-odd states that lack data-privacy laws, and even a responsive broker can do no more than purge information from its own database. An abuser needs only one willing broker to facilitate surveillance, and the scattering of digital breadcrumbs among brokers can distress people even if an abuser never actually gets any data. A flawed obscurity process, then, solidifies all three intrinsic harms by enabling abuse, creating anxiety, and causing isolation, while also maintaining extrinsic harms like annoyance and trauma.Footnote 49
I leave the matter of addressing brokered abuse for another day, but one thing seems clear: There’s a dire need for an effective and empathetic obscurity process.Footnote 50 Though it’s impossible to say how many people are harmed through brokered data, we know that many forms of technology-enabled abuse are rampant, rising, and ruinous. Recent empirical research has shown how abusers are exploiting technologies to intimidate, threaten, monitor, impersonate, and harass.Footnote 51 This essential work substantiates earlier scholarship revealing how technology can facilitate interpersonal harms and deepen social inequities.Footnote 52 We know, too, that abuse victims suffer significantly higher rates of depression, anxiety, insomnia, and social dysfunction than the general population.Footnote 53 Given these realities, we should not turn a blind eye to brokered abuse.
10.5 Conclusion
Data brokers are abuse enablers. By sharing people’s information, brokers thwart obscurity, stimulate surveillance, and ultimately enable interpersonal abuse. This chapter has canvassed four regulatory responses to brokered abuse. Though these existing measures have some merit, none is adequate, and some laws can even make matters worse. Put simply, the current legal landscape is neither effective nor empathetic.
Of particular and yet underappreciated concern, the prevailing broker-by-broker approach to regaining obscurity likely traumatizes victims by forcing them to engage repeatedly with their abuse and vulnerability. The flaws of this obscurity process also leave people vulnerable to serious physical, psychological, financial, and reputational harms. Regulating brokered abuse should be a priority for both lawmakers and technologists.