
AI Should Be Kind and Fair, Young Black Girls Tell CMU Researchers

SCS researchers worked with Black girls to better understand their ideas about AI. As part of the research, the girls designed humanoid AI-powered robots, which displayed hairstyles similar to the learners’ — a robot representation not often shown in popular media.

When asked about future uses for artificial intelligence, a group of fifth- and sixth-grade Black students attending a summer camp at Carnegie Mellon University had some unique ideas. One suggested AI could create an expandable bubble that traps bad people inside; another said AI could help shy people be more outgoing.

The camp, made up of 10 girls taking part in a research study, gave a team in the School of Computer Science's Human-Computer Interaction Institute (HCII) a chance to work with Black girls and explore their ideas about AI.

"Black girls are a group that are not well-represented in tech and don't always get as much access to fitting education," HCII Ph.D. student Jaemarie Solyst said. "We wanted to consider: How do Black girls think about artificial intelligence? And how do they define fairness, particularly in AI?"

Part of developing AI literacy experiences that center Black girls is exploring what they already know, where their knowledge gaps lie and what pathways could lead to empowerment. Fairness, equality and kindness were important to the girls as they thought about how AI should act and what it could do. They were also able to see themselves as possible designers and leaders of future AI technology.

The team's resulting paper, "'I Would Like To Design': Black Girls Analyzing and Ideating Fair and Accountable AI," earned a best paper honorable mention at the ACM CHI Conference on Human Factors in Computing Systems (CHI 2023) earlier this year.

As AI becomes part of everyday life, education about the technology grows increasingly important. Recent advancements have shown AI's benefits, such as greater efficiency and resource savings through automation. But the harm AI can cause has also become apparent, as when the data used to train AI models reflects societal biases. Because those biases disproportionately affect marginalized communities, the research team chose to work with Black girls at an age when STEM education matters most.

"It's a really critical time to have them think about themselves as technologists or having other roles in the development and upkeep of AI," Solyst said.

During the camp workshops, the team taught basic AI concepts and discussed common AI technologies, such as Amazon Alexa and Google Translate. The girls then considered real-world scenarios in which such AI could have harmful or unfair impacts on users. When discussing fairness, the girls tended to trust AI to make fairer decisions than humans would.

"We saw this especially with an example we gave the students of a school lottery algorithm sorting certain students into certain schools," said Ellia Yang, a sophomore in the HCII who worked on the research team. One learner suggested that "humans would put white kids in one school and Black kids in another school, but computers would mix them all up."

Future AI literacy programs can address this perception by discussing how societal biases end up reflected in AI algorithms, Solyst said.

The girls saw equality, where everyone receives the same resources, as important, but had a harder time grasping equity, where people receive resources according to their needs. For example, the girls were asked whether it was fair that Alexa failed to recognize certain accents, a scenario often used to illustrate unfairness in AI-driven voice recognition. At first, they said it was fair because users could work around or adapt to the technology. By the end of the discussion, however, the girls recognized that the inequity reflected in such a scenario would be unfair.

"These are interesting steps because children's moral development takes time," said Motahhare Eslami, an assistant professor in the HCII and the Software and Societal Systems Department. "When to introduce them to these different topics is such an interesting thing to study or explore in the future."

Fairness was also linked to kindness: the girls found situations in which humans or technology were "mean" to be unfair.

The girls envisioned the expandable bubble and the tool for helping shy people become more outgoing as part of an exercise to get them thinking about AI design and how they could shape such designs in the future.

In their paper, the research team suggested that AI STEM education should not only teach girls to code but also emphasize leadership roles, helping them imagine themselves as designers, creators and project leaders. The girls were excited about being designers who could call the shots when creating AI.

"We wanted them to start thinking about how they would make a change with AI technology in the future, not only for the people around them in the workshop but also for their broader communities," Solyst said. "It helps to position learners as techno-social change agents who can be poised to make change, be a part of movements and advocate for justice in a technical environment."

Related People
Jaemarie Solyst, Motahhare Eslami, Jessica Hammer, Amy Ogan

Research Areas
Artificial Intelligence (AI), Fairness, Accountability, Transparency, and Ethics (FATE)