
HCII PhD Thesis Proposal: Alicia DeVrio



Everyday Resistance to Algorithmic Harm

Alicia DeVrio

HCII PhD Thesis Proposal

Date & Time: Thursday, April 10, 2025 @ 2:30 p.m. ET

Location: Newell-Simon Hall (NSH) 3305

Remote: Zoom Link (Meeting ID: 934 4495 6234; Passcode: 339171)

Committee:

Ken Holstein (chair), Carnegie Mellon University

Jess Hammer, Carnegie Mellon University

Sarah Fox, Carnegie Mellon University

Richmond Wong, Georgia Tech

Abstract:

Algorithmic systems often behave in ways that harm people, such as by perpetuating racial stereotypes, violating privacy, damaging health, and causing economic losses. Unable to count on the powerful to adequately address these harms, everyday people regularly take actions both small and large, from individual workarounds on algorithmic platforms to collectively organized protest and destruction of algorithmic systems, in order to resist the algorithmic harm they encounter. These responses can be highly effective, yet those with power over algorithmic systems frequently dismiss them as insignificant, inexpert, unhelpful, or unreasonable.

In the first part of this dissertation, I show how everyday people’s responses to algorithmic harms can be powerful in effecting change, as exemplified through algorithm auditing. To do this, I first describe how algorithm audits are limited, as they tend to consolidate power, and argue that algorithm audits should be repositioned and reconceptualized to legitimize more forms of knowledge and evidence. Next, I do just that by proposing and exploring the concept of everyday algorithm auditing, a process by which everyday people surface, interrogate, and work toward remediation of algorithmic harm that may elude detection via more centrally organized forms of auditing. I then study how people go about searching for and making sense of potentially harmful algorithmic behavior, highlighting ways they might be further supported in their auditing activities. Finally, I examine how industry practitioners themselves view and employ user-engaged auditing practices, drawing out a complex relationship between practitioners and everyday auditors.

Having established that everyday people’s responses to algorithmic harm can be powerful, in the second part of this dissertation I focus on better understanding the dynamics of this power. First, I taxonomize the ways, including everyday algorithm auditing, that structurally disempowered people respond to the algorithmic harm they encounter, connecting different types of responses to existing theorizations of power. Next, I study how everyday people understand their own power in algorithmic systems, highlighting ways to support them in more fully realizing and acting on that power. Finally, in my proposed work I will examine how those with power over the deployment of algorithmic systems handle cases in which everyday people attempt to destroy those systems. Through a series of speculative design workshops with both everyday people and those with power over algorithmic systems, I will explore challenges and opportunities for the powerful to more productively engage with and value the ways that everyday people resist algorithmic harm.

Proposal document link

Hope to see you there!

Alicia