I. Introduction
In June 2022, the United States Supreme Court decision in Dobbs v Jackson Women’s Health Organization (Dobbs) rolled back reproductive rights and access to abortion across the country. With the Dobbs decision, historic abortion bans previously deemed unconstitutional by the 1973 Roe v Wade decision took effect, as did restrictive abortion laws passed since 1973.Footnote 1 In some of the most restrictive states, this has put pregnant people’s lives at risk,Footnote 2 and may lead to the criminalization of individuals for facilitating access to abortions.Footnote 3
The ruling in Dobbs has broad-reaching impacts not only on reproductive healthcare and health rights, but also in the context of technology and human rights. Digital and human rights experts have long cautioned that companies’ collection of vast amounts of user data poses significant risks to people’s right to privacy as well as other fundamental rights and freedoms.Footnote 4 This situation is not unique to the United States. As long-established rights are stripped away in countries around the world, companies that collect, store, share and process user data may increasingly face situations where that data are used to infringe upon their users’ rights.
Since the Dobbs decision, civil society has called on companies to protect user data, limit data collection, and provide reassurance that users will not be put at risk of punishment for exercising their reproductive rights.Footnote 5 Companies must adapt their policies and practices to new threats to human rights, and the Dobbs decision requires that they redouble their efforts to prevent and mitigate the increased risks to reproductive rights. This piece outlines the risks to reproductive rights of collecting, storing, sharing and processing user data in general, and in light of the Dobbs decision in particular, and shares how companies can address these risks.
II. Technology, Privacy and Reproductive Rights
Technology dominates our daily lives, and companies have amassed a broad range of user data, including health data and data that can be used to infer sensitive health-related information about individuals. Experts note that such data collection has disproportionate impacts on the privacy and security of women, girls and persons of diverse sexual orientations and gender identities.Footnote 6 Technology, and the data collected by companies, have regularly been used to target, harass and surveil women and girls, including for their sexual and reproductive health choices.Footnote 7 Following the decision in Dobbs, the risks to women and girls have increased significantly, and companies are again confronted with how their collection of user data could be used to violate reproductive rights and associated fundamental rights.
Right to Privacy Online
In the digital age, ensuring privacy protections for online data is critical to protecting many human rights, including access to abortion and reproductive rights, and to preventing the persecution of those who have accessed those rights.
In 2019, the United Nations Human Rights Council (HRC) clarified that the right to privacy applies online, and acknowledged the potential negative impacts that emerging technologies can have on privacy.Footnote 8 The HRC further noted the particular impacts technology may have on women and members of vulnerable and marginalized groups,Footnote 9 and encouraged businesses that collect, store, share and process data to meet their responsibility to respect human rights, including the right to privacy, in accordance with the United Nations Guiding Principles on Business and Human Rights (UNGPs).Footnote 10
The failure to protect online privacy can have far-reaching implications for a broad spectrum of human rights, and as highlighted by the HRC, businesses must take steps to put in place adequate policies and safeguards to protect these rights.
Third Party Access to User Data
Companies that collect, sell, share or aggregate user data must consider how that data can be used by third parties to infringe upon human rights, including reproductive rights.Footnote 11 For example, investigations into reproductive health apps revealed that these apps shared detailed and sensitive data with social media platforms, raising concerns about the companies’ privacy practices and user consent.Footnote 12 Data brokers that collect or purchase user data are still largely unregulated in the United States, and companies may not even be aware of how the data they collect, sell or share is used, including whether it ends up in the hands of law enforcement and other state bodies.Footnote 13 Recent research found that data brokers sold the location data of people who visited abortion clinics in the United States, revealing the risks such data pose to the exercise of reproductive rights.Footnote 14
Data shared with third parties could also be used to target specific populations to sell products and services or promote disinformation with the aim of restricting sexual and reproductive health decisions, contributing to gender-based discrimination.Footnote 15 In one example, anti-abortion groups used data-driven surveillance to target women contemplating abortion for the purpose of sending them anti-abortion advertisements.Footnote 16 Experts have warned that these groups could go further, using that data to target people who have facilitated access to abortions post-Roe.Footnote 17
Experts have further highlighted the risks of sharing user data in response to government requests. While companies regularly share information with law enforcement agencies to prevent crimes such as money laundering or online child sexual abuse, governments also request information from companies for investigations that could lead to human rights violations.Footnote 18 With abortion criminalized in several states across the United States after Dobbs, there is an elevated risk of law enforcement agencies using personal data to infer sensitive information regarding an individual’s reproductive choices.Footnote 19 This can lead to the arrest and criminal prosecution of women and girls who undergo abortions, and of those who assist them.
Company Practices Not Fit for Purpose
A recent analysis by the Business & Human Rights Resource Centre of 63 companies operating in the United States examined their policies on third-party access to user data and revealed significant shortcomings in those policies, as well as discrepancies between policies and company statements about their actual practices. Overall, the surveyed companies were not able to demonstrate robust human rights due diligence processes to identify, prevent and mitigate the risks to reproductive rights of allowing third-party access to user data.Footnote 20
The company responses indicated a lack of comprehensive knowledge on how their data collection, storage, sharing and processing could contribute to privacy violations.Footnote 21 They also indicated significant gaps in company policies regarding third-party access to user data,Footnote 22 increasing the risk of data being used to target people seeking or facilitating access to abortions. Digital rights experts have also repeatedly pointed out vulnerabilities in companies’ policies and practices that compromise the privacy and security of user data, including with regard to health-related data.Footnote 23 In the United States, this has resulted in multiple instances where users’ privacy was compromised, and online data were used to prosecute individuals ending their pregnancies or facilitating access to abortions.Footnote 24
As companies have become increasingly aware of the human rights risks associated with sharing user data with governments, many have taken steps to mitigate these risks, including by requiring a warrant before disclosing user data and publishing transparency reports on government requests.Footnote 25 However, these steps are often not enough to protect users’ reproductive rights. For example, governments can use broad warrants known as geofence and keyword search warrants to request large groups of data and identify individuals that may have visited abortion clinics.Footnote 26 Government entities have also obtained user data from private data brokers, circumventing the judicial process entirely.Footnote 27
While several companies have policies outlining how they will respond to government requests for user data and note that they will challenge unlawful requests, many may not be equipped to deal with this new legal landscape in the United States.Footnote 28 Indeed, user data obtained via law enforcement requests have already been used to charge individuals for ending their pregnancies or facilitating access to abortion. In 2015 and 2017, women in Mississippi and Indiana were prosecuted for ending their pregnancies in cases which used their search histories and text messages as evidence.Footnote 29 After Dobbs, cases like these are likely to increase. In 2022, Meta provided Facebook Messenger records to police who brought felony charges against a mother who helped her 17-year-old daughter access abortion pills.Footnote 30
III. The Corporate Responsibility to Ensure Privacy and Security of User Data
The human rights risks outlined above demonstrate a clear need for companies to proactively adopt policies to limit and protect the user data they collect, store, share and process. These human rights risks are not new, nor did the impacts on reproductive rights begin with the Dobbs decision. In the most restrictive states across the United States, user data were already being used to prosecute people for terminating pregnancies before Dobbs struck down the constitutional right to abortion.Footnote 31 To address these concerns, companies should take steps to prevent, mitigate and remedy the human rights harms associated with the data they collect, in line with international human rights standards and norms.
The Office of the United Nations High Commissioner for Human Rights and the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression have both emphasized the importance of adopting and implementing the UNGPs to protect human rights in the digital age.Footnote 32 However, as evidenced by the Business & Human Rights Resource Centre’s research mentioned earlier, many companies have not taken the necessary steps to identify, assess and mitigate the risks to reproductive rights of collecting user data.Footnote 33
In addition to adopting human rights due diligence processes, there are several steps companies can take to protect user data from being used to restrict reproductive rights and other fundamental rights and freedoms. This includes limiting data collection, allowing anonymous or pseudonymous access to products and services, allowing users to choose the types of data collected or to completely erase all data, and strengthening data encryption.Footnote 34 Companies can further ensure transparency by publishing easily accessible information about privacy policies, third-party access to user data and government requests for user data broken down by location, type of information requested and reasons for the requests.Footnote 35 Companies should also find ways of warning users of the types of data that could be used by third parties to infer health-related and other sensitive information and put their reproductive and other human rights at risk.
Companies should further refrain from sharing users’ data with third parties unless informed consent for sharing the specific category of data is provided, and a human rights due diligence assessment has been conducted on those third parties.Footnote 36 With regard to government requests for user data, companies should adopt clear policies that set meaningful limits to the data provided, and challenge unlawful and overly broad requests.Footnote 37 If companies are legally required to share data, they should limit the types of information to only that which narrowly responds to the request, and notify users at the earliest opportunity when information is shared.Footnote 38
These actions represent best practices as identified by human rights and digital rights experts,Footnote 39 and are all the more critical after the Dobbs decision. While many companies have already adopted such policies, a significant number are still falling behind.Footnote 40 In addition, these steps alone may not be sufficient when protections for long-established rights are repealed. Therefore, companies must remain vigilant to changes in regulatory frameworks and assess how their data collection practices may increase risks to human rights in light of those changes. This includes considering the aggravated or disproportionate impacts that collecting user data can have on certain groups of rights-holders, such as women, girls and people with diverse sexual orientations and gender identities.
Competing interest
Meagan Barrera is employed by the Business & Human Rights Resource Centre and Danny Rayman Labrin is a consultant with the Business & Human Rights Resource Centre.