The Race, Abolition, and AI Program: Empowering Young People to Navigate the Intersection of Race and Technology

by Colin Angevine

A Case Study from the Connected Wellbeing Impact Studio

“It’s just nice to feel heard and seen for once — being treated like your knowledge matters.”
– Race, Abolition, and AI Program youth participant

On the first day of the Race, Abolition, and AI Program, a summer program for rising high school juniors and seniors, Dr. Tiera Tanksley gets everyone moving with a warm-up exercise: “Move to the left side of the room if you love STEM. Move to the right side of the room if you hate STEM.” Within seconds, all students except one are firmly planted by the right-hand wall.

For most summer programs that focus on topics like algorithms, machine learning, and artificial intelligence, this skewed response might seem like a bumpy start. But the Race, Abolition, and AI Program deliberately does things differently. The warm-up activity was right on track.

Over the course of the next three weeks, Dr. Tanksley leads workshops and lessons on sophisticated topics in computer science, teaching each of them through a social, cultural, race-grounded approach. The program is officially classified as a social sciences course, and its subject of study is the technical systems that shape important parts of our society.

In this case study, we take a close look at the Race, Abolition, and AI Program to understand some of the ways it develops deep technological knowledge to empower young people to make sense of the world they live in, to develop a sense of agency and leadership in the face of systems not designed for them, and to reaffirm their identity and wellbeing.

About the Connected Wellbeing Initiative

The Connected Wellbeing Initiative brings together researchers, designers, educators, and funders to accelerate youth and community-powered innovations for fostering wellbeing in a digitally connected world. The Initiative’s Impact Studio supports early- to mid-stage innovations that model the core principles and approaches of connected wellbeing:

Principles
  • Young people are leaders and sources of strategies, as well as beneficiaries.
  • Caring relationships and communities are tapped as essential supports for wellbeing.
  • Solutions grow from youth identities, interests, lived experience, culture, and communities.
Approaches
  • Connecting to people who get you
  • Harnessing tech for equity and inclusion
  • Diversifying and amplifying youth voice

The Race, Abolition, and AI Program is one of 11 innovations in the Impact Studio that benefit from personalized advising, capacity building opportunities, and cross-sector connections to accelerate impact and build shared purpose. This case study highlights some of the innovative and meaningful ways that the Race, Abolition, and AI Program supports youth wellbeing in the digital age.

History and Context

Class Picture with Dr. Ruha Benjamin 

Since its founding in 2020, the Race, Abolition, and AI Program has been housed under the VIP Scholars program (“VIPS”) at UCLA. VIPS is a culturally responsive college access program that engages students from 10th to 12th grade in residential intensives during the summer and academic mentoring during the school year. To date, the program has been incubating as one component of the summertime experience, meeting with students for about 15 hours over the course of their summer intensives. In the future, the program plans to expand to new locations and contexts.

Dr. Tanksley, a scholar and educator who identifies as a Black woman, leads the program. She describes it as a critical race tech program: it is focused on developing the sociological and computational skills that allow teens to understand how race and technology intersect so that they can be empowered to navigate and eventually redesign technologies in ways that uplift, rather than harm, people of color. The program focuses explicitly on Black identities, cultural experiences, and funds of knowledge, and the kinds of algorithmic anti-Blackness that scholar Dr. Ruha Benjamin calls “the new Jim Code.” (The term describes the ways that contemporary technologies reproduce and automate the discriminatory impacts of the Jim Crow laws of our country’s recent past.)

Where others may only see the emotional weight that comes along with talking to teens about racism, Dr. Tanksley also sees opportunity for transformation. “When I talk about racism with my students… it’s really about knowing, being able to name the phenomenon so that we can fight it. It’s really hard to fight something and win if you have no idea what the problem is. You’re just feeling the effects of it.” 

Student reflections from the end of the program speak to the impact of this approach:

  • “I had never thought about asking ChatGPT questions about race… It really showed how the whole color blindness was like, kind of coded into this.”
  • “[The Program] changed the way that I look at everything now… now, when I see things, I feel I’m more conscious than I was before. I’m not all the way conscious yet because I’m still learning, but I feel I’ve improved as a person and I’m able to see things that I didn’t see before like the [racist infrastructure of] the temperature scanner thing. If you had never taught me about that, I would have never thought anything of it.”
  • “The more you learn, the better you’re equipped to fight the issues that you’re facing. So if you’re starting at a young age, learning about the racial divide within social media, or the racial divide within technology and access to technology, then you’re better prepared and you’re prepared to make that change. You’re educated enough, and you can educate other people to help make that change.”

One student further draws connections to STEM education writ large, and how a more critical approach could be transformative:

“I think a lot of people in society are taught like STEM is just supposed to be … the teacher tells you it, and you’re just supposed to believe that it’s all true, and then just say it back on the test…. But I think in this class there was a lot of open discussion and open interpretation. Also people got to share about their experiences with technology as well… I think [adding a critical lens] allows kids to be, culturally engaged and understand what they’re learning is actually important. Especially because a lot of people think that if they don’t do well in STEM, or they get a bad grade in STEM that they’re just not good at that subject when in reality it’s just like the way things are being taught, and they could be taught in a different way, or can be contributing to the class in a different way that’s just as effective.”

These stories reflect what Dr. Tanksley has envisioned since she started the program in the summer of 2020, during the protests in support of the global movement for Black lives following the murder of George Floyd. That summer, she began by focusing on the bare minimum: survival. Dr. Tanksley recalls the intensity of the moment: “What happens [to teens] when you’re home now all the time and you’re online all the time and you’re exposed to racism at increasingly high rates? You’re getting ‘Zoom bombed’ in your virtual classrooms, and then you’re seeing images of people being killed on your timelines, across all your platforms, and then your teachers are wanting to have Zoom conversations about it?” Her decision to start the program was an attempt to create “a wellness space [for teens] that helps them process racialized trauma as it manifests online and gives them some skills to subvert and resist these systems.” In the years since that particularly fraught summer, the program’s target outcomes for teens have expanded. Dr. Tanksley describes this change by invoking the title of a book authored by abolitionist educator Dr. Bettina Love: “We Want to Do More Than Survive.” And so today, the Race, Abolition and AI Program embraces a broad sense of youth wellbeing outcomes, emphasizing community, connection, leadership, and empowerment — using STEM learning as a means of reaching these goals.

Connected Wellbeing in Action

Students use If/Then statements (functions) to describe how racial logics encoded within school policies, pedagogies and practices create material, discursive and spiritual harm for Black students.  

In the vignettes that follow, we illustrate a few ways that the program brings connected wellbeing principles into practice.

Empowerment through STEM

The first lessons in the program build on what students already know, starting with the top three social media platforms they say they use and their experiences on them. Students share their reflections and eventually connect their personal experiences to findings from cutting-edge scholarship in critical race technology studies (including scholars Ruha Benjamin, Virginia Eubanks, and Safiya Umoja Noble). In doing this, what starts as a personal reflection zooms out to join a broader pattern of experience. TV and movies (Black Mirror, Ex Machina), hit songs (“Dirty Computer” by Janelle Monáe), and video games (Grand Theft Auto, NBA2K) add a pop culture lens to teens’ racialized experiences of technology too. And all of that is before the discussion pulls back the curtain on the business models that profit from the intersection of race, violence, and trauma.

“A thing that caught me by surprise learning in this course so far is the fact that Google actively profits off of the searches, clicks, and people’s fascinations with watching Black people die from police brutality. Algorithms sell searches per click are worth money,” one student reflects. Another shares, “The subject matter we covered, especially how crucial Black death was to the internet and its economy: it was really shocking. I look forward to learning more about our relationship with technology and how race plays a part in that relationship because I don’t think I would ever get the opportunity to learn about such things ever again.”

Research Background: Sponsoring youth interests and identities

RAAI “sponsors,” or offers adult support for, youth interests and identities, a core design principle of connected learning. Youth are offered emotional support, as well as access to knowledge, resources, and opportunities. Research has documented the positive outcomes when youth are able to share knowledge and engage in creative production within affinity networks tied together by shared interests and common purpose. RAAI is tailored to the culture and backgrounds of Black youth, honoring their lived experience and tapping relatable popular media. Research has consistently documented gaps in representation for racial and ethnic minorities, as well as how both representation and critical engagement matter for healthy and empowered identity development.

Close on the heels of these revelations is an alternative: technology doesn’t have to be this way. An important piece of the Race, Abolition, and AI Program is giving students the knowledge and ability to impact how technology is designed, making it more race-conscious or socially beneficial. For example, students researched Appolition, a tech tool that automated the collection of users’ spare change and pooled it together to fund the bail bonds of people of color. They also investigated Blackbird, a web browser created in the mid-2000s that employs race-conscious tech to produce more culturally responsive search results for Black internet users.

“Knowledge is power,” as the saying goes. And tackling complex issues head-on is empowering for teen participants in the program. As Dr. Tanksley reflects, “At the end of that first class, [participants] talked about how empowered they felt. That they could change their social media content and figure out the algorithm and the technical expertise that they gleaned from the class was really, really empowering.”

The class meets Moxie the socio-emotional support robot.

Moxie the robot

Algorithmic racism isn’t limited to web platforms. In 2023, the program expanded by introducing a new classmate: a robot called Moxie.

Moxie is an AI-powered robot that is billed as a consumer product for 5- to 10-year-old children. Marketing materials on the company’s website claim that “Moxie is the first robot capable of believable social interactions and emotional responsiveness, enabled by generative AI, natural language processing, and computer vision. With Moxie, you’re not just buying a social robot — you’re getting constantly updated play-based social emotional content.”

Students in Dr. Tanksley’s class decided to put Moxie to the test. Rather than being just a classmate during sessions, Moxie became a roommate too, living in the dorm during the summer intensive. The teens even made Moxie an Instagram account.

The teens took charge of learning about the robot’s benefits and limitations, and questioned foundational aspects of its design. Teens noticed that if a young user is interested in dinosaurs, Moxie will talk to the user about dinosaurs. But, as Dr. Tanksley recalls, the teens questioned, “What if a Black kid is interested in Black hair? Or if they’re interested in Black history? When they asked Moxie about those types of topics, Moxie didn’t have answers, and regularly tried to switch their attention elsewhere… She would always say, ‘Let’s do something more fun. That sounds like a sad topic.’”

Students conduct Google searches to uncover hidden racial logics that become encoded within search algorithms to create biased results. Here, students are researching “professional hairstyles” and “ghetto hairstyles.”

Then, they use Teachable Machine to “retrain” data in race-conscious and intersectional ways. Here a student creates a data set that reimagines what counts as “unprofessional” vs. “professional,” making sure to include diverse representations of age, hair style and texture, skin color, etc.

In response, the teens came up with “Jordan,” an alternative concept for a consumer product. Jordan was a deliberately race-conscious and justice-oriented robot, modeled after Moxie but with new innovations. Jordan would talk to children about Black history and could provide instructions on how the user could do their hair, with various configurations depending on the user’s hair type. Jordan would give affirmations and lead breathing exercises to support the user’s emotional regulation. Jordan would have customizable skin tones and be able to play music by Black artists.

Research Background: Equity through co-design

Involving diverse users of technologies and programs in design is essential for correcting dominant biases and achieving equity. This commitment is particularly important when serving youth and other marginalized groups whose backgrounds differ from the perspectives of designers or program providers. Power dynamics and differences in background between adults and youth must also be addressed for solutions to effectively serve youth needs and achieve equity goals. Research has documented many cases where limited stakeholder input meant that new technologies intended to broaden access instead delivered more benefits to already privileged groups. RAAI reflects methods from youth-adult co-design and participatory action research that have resulted in solutions reflecting the genuine needs and culture of diverse groups of young people.

The teens were not just creating a wish list: they understood some of the technical implications of the features they were designing. Teens stipulated where the data for AI algorithms would be sourced, and that the computer vision models would need to adequately recognize Black faces. (The data sets used to train both AI and computer vision models are well-documented sources of algorithmic anti-Blackness.) The teens experimented with AI training models online and understood that their own interactions, too, would be a data source that influences how the AI bot would engage. In this process, teens developed technical know-how that blurred the boundary between the program’s official “social science” label and the “STEM” label that participants had spurned on the first day.
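
Under the hood, that kind of experimentation resembles transfer learning: starting from a general-purpose vision model and retraining only a small classification head on data the learners curate themselves, which is roughly what point-and-click tools like Teachable Machine do behind the scenes. The sketch below is our rough illustration in Python with TensorFlow/Keras, not the program’s actual materials; the folder name, class labels, and training settings are hypothetical placeholders.

```python
# A minimal sketch, assuming TensorFlow is installed and that students have assembled
# an image folder such as curated_hairstyles/professional/ and
# curated_hairstyles/unprofessional/. Paths, labels, and settings are hypothetical.
import tensorflow as tf

IMG_SIZE = (224, 224)

# The curated data set, deliberately including diverse ages, hair styles and
# textures, and skin tones.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "curated_hairstyles", image_size=IMG_SIZE, batch_size=16)

# Start from a general-purpose pretrained vision model and freeze it, so the
# curated data, not the original training corpus, determines the final labels.
base = tf.keras.applications.MobileNetV2(
    input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet")
base.trainable = False

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1),  # MobileNetV2 expects inputs in [-1, 1]
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(len(train_ds.class_names), activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# "Retraining": only the small classification head learns, and it learns entirely
# from whose images the students chose to include and how they labeled them.
model.fit(train_ds, epochs=5)
```

The point of such an exercise is less the model itself than the visibility it gives to curation: every choice about whose images are included becomes a choice about what the system will call “professional.”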

Put together, the lessons that teens were learning weren’t simply about developing new intellectual understanding: they were part of a multifaceted experience that interrogated and reimagined complex technologies in order to bolster students’ sense of self and wellbeing.

“What is AI?”

Researching the pedagogy while teaching it

The program is deliberate not just about what topics teen participants learn; it is deeply attentive to how they learn these topics too.

To understand more about this, we can turn to Dr. Tanksley’s research. In addition to designing and leading the program, Dr. Tanksley is also studying her practice and publishing about it. “I’m actually a researcher trying to understand pedagogy, critical literacy, critical algorithmic literacies, STEM equity. All of these things. I’m studying my own practice while I’m studying what youth are learning.”

Dr. Tanksley introduces “algorithms” as a set of instructions, or racial logics, designed to solve a problem, and asks students to write up everyday “algorithms” that shape the lives and schooling experiences of Black youth.
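
To give a flavor of that exercise, here is a minimal sketch of the kind of everyday “algorithm” a student might write. It is our hypothetical illustration in Python, not material from the program; the policy, categories, and consequences are invented for the example.

```python
# Hypothetical illustration: an everyday school dress-code policy expressed as
# explicit if/then statements, so the logic encoded in a vague word like
# "distracting" becomes visible and contestable.

def dress_code_response(hairstyle: str, prior_referrals: int) -> str:
    """Return the consequence a hypothetical policy assigns to a student."""
    # A subjective category that, in practice, is often applied to Black hairstyles.
    flagged_as_distracting = {"locs", "braids with beads", "afro"}

    if hairstyle in flagged_as_distracting:    # IF the style is labeled "distracting"...
        if prior_referrals > 0:                # ...AND the student has been flagged before,
            return "suspension"                # THEN the consequence escalates.
        return "office referral"               # A first offense still produces a referral.
    return "no action"                         # Styles outside the flagged set are never touched.


# Writing the rule out this way invites the program's questions: who defined
# "distracting," whose hair lands in the flagged set, and what material,
# discursive, or spiritual harm does each branch produce?
print(dress_code_response("locs", prior_referrals=1))       # -> suspension
print(dress_code_response("ponytail", prior_referrals=0))   # -> no action
```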

Dr. Tanksley’s pedagogy embraces the cultural affinities she shares with her students, and she uses these commonalities to enrich the learning experience. Her research findings show that a significant part of students’ positive experiences is “because they had this really joyous, playful experience that they don’t feel is typical of STEM. One of the findings from the study is a theme about joy: joy as resistance and joy as a design principle.” But this finding is specific: it is “not just generally joyous — Black joy is centered. [Students are] like, ‘The way you speak in African American English, the stories you tell, like you make us feel like Black joy is important to learning.’” As the program has evolved from students’ surviving to their thriving, the pedagogy that guides the program has remained consistently committed to uplifting the vibrancy and ingenuity of Black culture.

Research Background: Culturally sustaining pedagogy

RAAI employs a “culturally sustaining” pedagogical approach that grows from and sustains the cultural knowledge and lived experience of Black youth and their communities. Research has documented how the emphasis on community cultural wealth is essential for counteracting negative stereotypes about marginalized communities, and fostering a sense of belonging and safety. These approaches counteract deficit beliefs about Black culture and youth, and recognize broader systemic barriers and forms of oppression in the dominant culture and institutions. A growing body of research documents the effectiveness of these approaches in supporting positive ethnic identity, belonging, wellbeing, and self-efficacy for marginalized youth.

Dr. Tanksley is developing new language and models to describe this approach. What is paraphrased above in students’ words, Dr. Tanksley theorizes as “a critical race, abolitionist pedagogy” in her academic research. In one publication, she elaborates:

“In order to truly abolish anti-Blackness as it manifests within schools, STEM+CS [computer science] and digital technologies, CS education needs new pedagogical approaches that are situated in the everyday practices, cultural sensibilities and sociotechnical funds of knowledge of Black youth; that center Black joy, humanity and futurity as central design features of teaching, learning and designing tech; and that leverage the rich histories of Black resistance and radical intellectual thought that enable Black youth to ‘do more than survive’ techno-racial domination.”

As Dr. Tanksley points out, this approach has implications for STEM and computer science education, and it also holds lessons about how to center both critical and joyous understandings of race for the broader community coalescing around the principles of connected wellbeing.

Conclusion

Students use If/Then statements (functions) to describe how racial logics encoded within school policies, pedagogies and practices create material, discursive and spiritual harm for Black students.

As the Race, Abolition, and AI Program moves toward its fourth summer, it continues to evolve. Lessons adapt to emerging technologies in the field and to the changing interests of youth participants. And its evolution may involve expansions to new locations beyond the VIP Scholars program at UCLA too. Educators around the country have contacted Dr. Tanksley to express their interest in bringing the Program to a new context — whether that might mean borrowing curriculum models for existing programs, training teachers to implement similar pedagogy, or developing a local franchise of a summer camp experience. Dr. Tanksley is in no rush to grow, and instead is committed to keeping the youth, their identities, and their wellbeing firmly at the center of whatever comes next.

This case study was produced in partnership with Race, Abolition, and AI and the Connected Learning Lab at UC Irvine, with the support of Brian Cross, research consultant; Mimi Ito, Connected Learning Lab Director at UC Irvine; and Krithika Jagannath, postdoctoral scholar at UC Berkeley’s Institute of Human Development. We also thank the communications and web team at the Connected Learning Alliance for their work on layout and design.