Artist Stephanie Dinkins is pioneering a new way to approach artificial intelligence (AI) — not as a neutral tool, but as a system deeply shaped by the data it learns from. Her work challenges the tech industry to address inherent biases in datasets that perpetuate systemic inequalities.

The Problem with Current AI Systems

For years, AI systems have been trained on data that reflects existing social biases. As a result, they can reinforce discrimination, for example by misidentifying people of color in facial-recognition systems or producing skewed risk scores in criminal-justice tools. Dinkins’ work exposes the “violence” embedded in these datasets: the limited roles assigned to marginalized groups in media, the historical biases within legal systems, and the general lack of representation in foundational AI training materials.

The Turning Point: Meeting Bina48

Dinkins’ journey began with an encounter with Bina48, an advanced social robot modeled after a Black woman. She quickly discovered that the AI lacked the nuanced understanding of race that a real person would have, raising a critical question: if even well-intentioned developers fail to address bias, what happens when no one cares?

This realization led her to a core project called “Not the Only One,” based on oral histories from her family. She found it nearly impossible to find existing data that felt “loving enough” to support her family’s stories, forcing her to create her own dataset. The result was imperfect but ethically sound: a wonky AI that sometimes speaks in non sequiturs rather than perpetuating historical cruelty.

The Solution: Community-Driven Data

Dinkins advocates for “gifting” AI systems with data from underrepresented communities. Her app, “The Stories We Tell Our Machines,” allows people to contribute personal narratives, ensuring that AI learns from the inside out. She stresses that while data exploitation is real, the alternative — letting AI define communities based on biased sources — is worse.
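
The article does not describe how the app stores these contributions, but the underlying idea maps onto a familiar pattern: gathering consented narratives into a small, structured, shareable dataset. The sketch below is purely illustrative; the record fields and the `community_stories.jsonl` file name are assumptions, not the app’s actual schema.

```python
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

# Illustrative only: a minimal record format for community-contributed
# narratives. Field names are assumptions, not the real app's schema.
@dataclass
class Story:
    contributor: str        # name or pseudonym chosen by the storyteller
    community: str          # self-described community or family group
    text: str               # the narrative itself, in the teller's own words
    consent_to_share: bool  # explicit permission to redistribute the record
    collected_at: str       # ISO 8601 timestamp

def append_story(path: str, story: Story) -> None:
    """Append one consented story to a JSON Lines dataset file."""
    if not story.consent_to_share:
        raise ValueError("Only stories with explicit consent are stored.")
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(asdict(story), ensure_ascii=False) + "\n")

# Hypothetical example record, not actual contributed data.
append_story(
    "community_stories.jsonl",
    Story(
        contributor="Anonymous elder",
        community="Hypothetical community group",
        text="We kept the recipe alive because it was how we remembered her.",
        consent_to_share=True,
        collected_at=datetime.now(timezone.utc).isoformat(),
    ),
)
```

Keeping consent as part of the record itself, rather than a separate agreement, is one simple way such a dataset could remain “gifted” rather than extracted as it circulates.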

The ultimate goal is to create widely distributable datasets that can fine-tune AI systems without stripping them of cultural context. Dinkins envisions a future where underprivileged individuals can leverage AI tools to compete with established industries, such as creating high-quality films independently.
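
The piece does not detail a specific training pipeline, but “fine-tune AI systems” corresponds to a well-established workflow: continuing the training of an existing language model on a small, community-supplied corpus. The sketch below is a minimal illustration under assumptions of my own, reusing the hypothetical `community_stories.jsonl` file from above and the open-source Hugging Face `transformers` and `datasets` libraries with GPT-2 as a stand-in base model; none of this is Dinkins’ actual tooling.

```python
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

# Load a small base language model; GPT-2 is used here only as a stand-in.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Read the community-contributed stories (one JSON record per line).
stories = load_dataset("json", data_files="community_stories.jsonl", split="train")

def tokenize(batch):
    # Convert each story's text into model tokens, truncating long entries.
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = stories.map(tokenize, batched=True, remove_columns=stories.column_names)

# Fine-tune the model on the small corpus with standard causal-LM training.
trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="community-tuned-model",
        num_train_epochs=3,
        per_device_train_batch_size=2,
    ),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
trainer.save_model("community-tuned-model")
```

Because such corpora are tiny by machine-learning standards, the practical appeal is that a community could adapt an existing model on modest hardware and then distribute the resulting dataset or model weights on its own terms.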

“What we hear out in the world is, ‘No, they’re taking our data. We’re being exploited,’ which we are. But also, we know that if we do not nurture these systems to know us better, they are likely using definitions that did not come from the communities being defined,” Dinkins says.

Dinkins’ work is a call for AI developers and researchers to prioritize ethical data sourcing, community engagement, and cultural sensitivity. It’s a reminder that AI isn’t just about algorithms; it’s about power, responsibility, and the stories we choose to tell our machines.