Black, LGBT, Disabled People Face AI and Algorithmic Bias, Privacy Advocates Say
Program Date: Nov. 30, 2022

Private companies’ data collection builds on “a history of surveilling and criminalizing Black and brown people and disabled people’s bodies and movements, and data-driven discrimination in any one of these areas will feed into systemic racism and ableism in other areas,” Ridhi Shetty, a policy counsel at the Center for Democracy and Technology, told NPF data privacy fellows. [Transcript | Video]

How IoT and IoB devices affect people with disabilities

New technologies – from Internet of Things devices, such as home security systems, to Internet of Bodies technologies, such as wearables – can help disabled individuals live more independently. However, these technologies also collect and share huge amounts of data about their users.

Data brokers can obtain location data and health data from countless sources, Shetty said, including “dating apps like Grindr where data sharing can endanger LGBTQ+ users, family safety apps like Life360, Muslim prayer apps that share location data, weather apps, mental health apps like Crisis Text Line, BetterHelp and Talkspace that potentially commercialize people’s mental health and even mobile game apps.”

In the U.S., particularly, data is valuable.

“Organizations want as much data as they can to monetize it,” said Willmary Escoto of Access Now. “If you know that this technology is damaging to a specific subset, just don’t use it. It should be that simple. But it’s not because there’s millions of dollars involved.”

Shetty noted that dense and expanding privacy notices put an impossible burden on the consumer.

“Either you can benefit from the uses of these technologies or you can protect your privacy, but you can’t do both,” Shetty said. “Marginalized communities can’t afford to prioritize privacy when it’s going to come at the expense of the benefits or opportunities that they’re seeking to access in these technologies that are misusing their biometric data and their behavioral data.”

How emotion recognition technology (ERT) may hurt Black students

Emotion recognition may be used in a number of settings, from attempting to predict when protesters are growing aggressive and may turn violent to assessing children in a classroom.

However, Escoto noted that studies have found racial bias in ERT systems, which have, for instance, associated negative emotions more with Black men’s faces than with white men’s faces. In a classroom, this technology is meant to tell whether a student is inattentive or confused.

“Why should you care? Simply put, in education, it could further exacerbate inequalities and oppressive dynamics that we’ve seen for decades,” Escoto said. “You can kind of see how that can cause serious damage or serious consequences to children at a very young age, how they’re perceived, how much attention they’re given, how teachers treat them.”

Even if the technology worked perfectly and without bias, “it erodes our right to privacy and freedom of thought,” Escoto said.

How gender recognition technology affects the LGBTQ+ community

Gender recognition technology works by analyzing a person’s physical appearance, such as bone structure, against their legal name and other information. Access Now says this is harmful for LGBTQ+ people, particularly those who identify as transgender or nonbinary.

“They can be interrogated by authorities at the airport, … you could be prohibited from going to gender-specific spaces like a bathroom or a locker room. And authorities in repressive countries can analyze security camera footage or social media profiles to track down individuals that they believe are LGBT+ and arrest them,” Escoto said.

Generally, facial recognition is less accurate for these populations, Shetty explained.

“People of color, transgender and gender-nonconforming people and girls and women are less likely to receive the accurate diagnosis of disabilities that affect facial features,” Shetty said, and “formal diagnosis is often a gatekeeper for accessing school and workplace accommodations and for accessing supports for managing one’s health.”

How AI affects hiring (and firing)

In the job interview process, artificial intelligence software could be used to “capture vocal tics, facial expressions, movements, fidgeting and all those are used to kind of gauge somebody’s trustworthiness or their confidence or their enthusiasm,” Shetty said. “You’re basically using how somebody speaks, how somebody moves … to make an assessment about whether they’re worthy of getting a job rather than focusing on characteristics that are really essential for performing the core job functions.” She called it a misuse of biometric data.

It’s not just prospective workers. Some employers monitor employees’ locations, keystrokes and mouse movements. Such tools track “when they are performing the job rather than just focusing on whether they’re getting the job done,” Shetty said. “Applications also collect health data to administer employee wellness programs that impose expectations around diet and exercise that not all workers will be able to meet, especially disabled workers. And employers use these programs to influence workers’ behaviors and their health decisions in an attempt to increase their productivity and also to reduce their own costs when administering benefits.”

Possible solutions

Both speakers called for more transparency from companies about how their “black box” algorithms work, so that civil society groups can better analyze them and the public can give better-informed consent.

Strong data protection would include data minimization (not collecting more data than necessary) and purpose limitation (not repurposing the data for something that has nothing to do with the reason why people are using that service in the first place), Escoto said.

CDT supports a moratorium on facial recognition used by law enforcement. Access Now wants a ban on facial recognition and remote biometric recognition technologies that enable mass surveillance and discriminatory targeted surveillance.

“Companies are able to track what we’re doing, where we’re at, who we’re with,” Escoto said. “We don’t give our consent and we don’t know what they’re doing with that data, who they’re selling it to.”


This program was sponsored by Arnold Ventures and Medtronic. NPF is solely responsible for the content.

Willmary Escoto
U.S. Policy Analyst, Access Now
Ridhi Shetty
Research Policy Counsel, Privacy & Data, Center for Democracy & Technology