Welcome to my thesis page, which will serve as my mlog (multimedia log) as I navigate and meander through my senior thesis for Media Arts + Practice (MA+P). MA+P is a division within the University of Southern California's School of Cinematic Arts. For our thesis, we’re prompted to create an art research-based project, drawing from everything we’ve learned over the past three years. We are encouraged to reflect, feel, and create anything.
Throughout our time in MA+P, we take classes to hone technical skills across various mediums—creative coding, 3D modeling, film, sound, and more. We also dive into theory, exploring topics like race, class, and gender in media, neoliberalism, and cultural production. Much of our work is rooted in practice: creating projects that tell stories.
Since freshman year, I had dreaded the idea of creating something all-encompassing and super awesome. I thought that was what a thesis had to be—the most amazing thing ever. Then junior year arrived, and I watched the seniors sprint toward their theses, facing problems that now feel all too familiar: indecision and perfectionism. I decided that if anything, I wanted my thesis to feel like a marathon, not a sprint.
I know myself—I tend to turn future goals into weighty obstacles that feel too large to overcome. So, I told myself my thesis should have fun vibes.
I initially envisioned my thesis as a light collage reflecting moments that make me feel most present: summer days in the grass, hanging out with friends, being with family. My vision was to create a visual aesthetic love child of Olafur Eliasson and Sarah Sze.
However, as the semester began, I realized I wanted to shift my focus to love. I wanted to explore love in all its grandness and vagueness. For now, my thesis remains broad and open-ended, but I know that by the end, I want to narrow it into something cohesive. I envision showcasing my work at the annual senior showcase in May 2025, where I’ll have a 10x10 space for an installation. My hope is for people to leave that space thinking deeply about love—bringing love to the forefront of their minds. I want them to talk about love more.
By the end of this semester, I need to present my progress at a critique session with faculty members. I’m both nervous and excited—this will be an important checkpoint. You got this, Katie.
As I continue to develop my thesis, I’ll update this mlog with my progress, setbacks, and thoughts—a sort of thesis diary. Academic writing can feel exhausting, so I’ll embrace this space as a place for stream-of-consciousness reflections… albeit proofread.
This journey is a marathon, and I hope to stay consistent and persistent. I’m inspired by my family, friends, professors, mentors, artists, and the dear people on the internet. Documenting my thoughts, knowing they’ll change and evolve, feels both exciting and grounding.
Feel free to peruse and meander with me. 😊
Why AI
I began my thesis thinking it would center on creating something super aesthetic, working with light and color. I imagined the research as an enjoyable, collaborative practice—conversations over tea as a form of exploration. I had no concrete idea where it would lead beyond an installation, and for the most part, I still don’t. In the past, I worked with AI to create films and smaller experiments. However, I hadn’t deeply engaged with AI in a way that interrogated datasets or the underlying technology. Ethics have always been a concern in my work: Where does the data come from? Whose work is being scraped in the process? What about biases in training models? And the energy required to maintain servers and generate outputs?
I am part of the AI for Media and Storytelling Studio under Professor Holly Willis. In our discussions on ethics, we noted that many ethical questions about AI parallel broader questions in the world: how the energy consumption of generating a chat response compares to that of loading a livestream, for instance, or what rules should govern remixing.
I thought deeply about my issues with AI and reflected on Sarah Ciston’s work. During my freshman year of college, I participated in the Creative Code Collective with Sarah, where we created a Python project using datasets. Back then, I didn’t fully grasp the importance of datasets, but now I increasingly recognize how they underpin everything. If knowledge is power, then the ways we understand and collect that knowledge shape the systems of power in place. Much of the data used for AI is obscured from users: Where is this data coming from? Who is collecting it? Who is labeling it? All these factors influence the biases and perspectives of a model.
What if my soft data collection—like thoughtful conversations about love—became the foundation for some kind of AI? If we want our technologies to reflect our values, how can we feed them data that originates from those values? Thoughtful conversations about love. Thoughtful conversations about love by Gen Z. What are the ethics of using conversation as data—with consent, of course? What is my role as the moderator? What are my intentions? How is moderation itself a performance? What about the emotional labor involved?
I’m also grappling with practical questions: How do I even begin to create a language model, even a small one? The technical aspects—structured vs. unstructured datasets, categorization, labeling, file types—leave me feeling overwhelmed. I recently read Sarah’s A Critical Field Guide for Working with Machine Learning Datasets, which provided clarity but also led to more questions. Sarah critiques the “promise of machine learning to convert data into meaningful, monetizable information.” Can my installation, and the way I use AI, challenge this capitalist framework?
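To make that structured-vs.-unstructured overwhelm a little more concrete for myself, here is a minimal sketch of how one tea night transcript (unstructured text) might become a small structured dataset. Everything in it (the file name, the fields, the labels) is a hypothetical placeholder I am still thinking through, not a real schema.

```python
import json

# A raw tea-night transcript is unstructured: it's just text.
raw_transcript = """Katie: What is love to you?
Guest A: Honestly, it's showing up. Like, consistently.
Guest B: For me it's tied to my parents, even when it's complicated."""

# Structuring it means deciding what counts as a record and what gets labeled.
# These fields are placeholders, not a settled schema.
records = []
for line in raw_transcript.splitlines():
    speaker, _, utterance = line.partition(":")
    records.append({
        "speaker": speaker.strip(),    # who said it (pseudonymize before sharing?)
        "text": utterance.strip(),     # what they said
        "consented": True,             # only keep lines people agreed to share
        "themes": [],                  # hand-applied labels like "family" or "consistency"
        "session": "tea-night-01",     # which gathering it came from
    })

# JSON Lines is one common dataset file type: one JSON record per line.
with open("tea_night_01.jsonl", "w") as f:
    for record in records:
        f.write(json.dumps(record) + "\n")
```

Even this tiny sketch surfaces the questions I actually care about: who gets named, who does the labeling, and where consent lives in the file itself.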
What if the AI were slow and inefficient? What if it created more problems or questions instead of offering solutions? I don’t want my project to produce a digital clone or a replacement for a human. Maybe it could act in playful opposition to such ideas or offer thoughtful assistance instead. I’m inspired by the works of Lauren Lee McCarthy and Stephanie Dinkins, who use AI to create interactive installations and performances—such as entities built from oral histories or moderators for online calls.
How can I use AI to explore love and space-making? What does my project contribute to the larger conversation about AI? What does it mean for my generation as technology integrates ever deeper into our lives? What about the current state of the world—feelings, meaning? These questions blur together for me. I continue to ponder and read.
This past summer, I spent a lot of time thinking about love—not just romantic love, but what it means to live within what bell hooks calls a love ethic.
I reflected on my parents and their complicated relationship. I thought about myself and my own relationships. For three weeks, I was in Alaska participating in a friend’s residency. It felt like living in a small-town Hallmark movie, and I observed the relationships around me. It was fascinating how quickly I became embedded in other people’s lives, contrasting with how my own life had often been defined by observing my parents’ dynamic.
How do our relationships reflect our values in the way we love? Everything we do stems from a place of love. Without love, nothing would exist. These were some of the insights people shared as I began practicing my soft data collection. I started with the question, “What is love?”—a question I thought was both cliché and revolutionary. I hadn’t realized I could simply ask people what love means to them. Through questions, answers, and conversations, I discovered the experiences that shaped their love ethics.
Over the summer and into my senior fall semester (which is now), I found myself asking this question everywhere—on trains with friends, during office hours with professors, and in awkward lulls during conversations. I loved talking about love.
This led me to think more intentionally about these conversations: How can we create space for such a grand topic, giving it the time and attention it deserves? How can we reflect and ponder together—with friends, strangers, anyone? I feel lucky to hear these stories and have these conversations, but what about creating spaces where others can share and connect, too?
If my thesis is rooted in research, why couldn’t dinners or gatherings act as a form of research? I embraced the term soft data. Initially, I hosted casual dinners in my apartment, but I realized this wasn’t the most effective way to collect soft data. To be more intentional, I reframed these gatherings as tea nights: a dedicated two hours to talk about love. I invited small groups of people—some who knew each other and some who didn’t. It was mostly based on availability and vibes~
Here’s how these tea nights unfolded:
On the day of, I’d text participants to confirm the time and set expectations. If someone could only stay for part of the two hours, I’d politely suggest they join a future session instead. Each group was made up of four to seven people. When guests arrived at my apartment, I had a camera for documentation, some Trader Joe’s snacks, animal crackers, and tea (of course). I played some Khruangbin in the background.
Everyone sat around the dinner table and introduced themselves. Then, we began with an icebreaker: I handed out cards to determine pairs. I instructed everyone, “Now look into each other’s eyes for six minutes.” While maintaining eye contact, they were to draw each other without speaking. Hehehe.
If the group had an odd number of people, I sat out. If it was even, I participated too. It was fun!
Once the ice was broken, I guided the group into discussions about presence, well-being, and, ultimately, love.
These tea nights became a way to thoughtfully engage with the topic of love, transforming casual conversations into something more intentional and meaningful.
Reflecting On More Research
Recently, I met with Kim Zumpfe and Carrie Chen to talk about my thesis and how to move forward more intentionally. Two main takeaways stood out. First, I need to better define my research demographic and home in on a specific group. I’ve been reflecting on who I’ve been engaging with so far—not just my friends or USC students interested in talking about love, but something more defined. I initially thought about grouping people by age, but that still feels too broad. I’m now exploring Gen Z individuals with multicultural backgrounds, especially children of immigrants, as a possible focus. Second, Carrie mentioned scoping the research question itself, but I kind of like how broad it is right now because it gives people room to share what love means to them in their own way. That said, it might evolve as the project progresses.
I’ve also been thinking more about how the installation will look and brainstorming step-by-step scenarios for how people might interact with the AI. This has led to deeper research into AI learning models as I try to decide between supervised and unsupervised approaches. Visiting the All Watched Over by Machines of Loving Grace exhibit at Redcat gave me some inspiration, but I still need to work through exactly how to design or integrate the AI in a way that fits the project’s intentions.
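To feel out the difference between the two, I sketched both approaches on a handful of invented answers. This is only a toy comparison with made-up labels, not how the installation will actually work.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

# Invented stand-ins for tea-night answers to "What is love?"
answers = [
    "Love is showing up for people consistently.",
    "Love is how my mom cuts fruit for me without asking.",
    "Love is patience with my family even when it's hard.",
    "Love is choosing the same person every day.",
]

# Both approaches start from the same numeric representation of the text.
vectorizer = TfidfVectorizer()
X = vectorizer.fit_transform(answers)

# Unsupervised: no labels. The model groups answers by similarity,
# and I'd have to interpret afterward what each cluster "means."
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print("clusters:", clusters)

# Supervised: I label each answer first ("family" vs. "partnership" here),
# and the model learns to reproduce my labels on new answers.
labels = ["partnership", "family", "family", "partnership"]
classifier = LogisticRegression().fit(X, labels)
new_answer = vectorizer.transform(["Love is my grandmother's cooking."])
print("prediction:", classifier.predict(new_answer))
```

Writing it out made the tradeoff clearer to me: supervised means my labels (and my biases) are baked in from the start; unsupervised means the model hands me groupings that I then have to make meaning from.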
One of my ongoing struggles has been figuring out how to create a language model for the thesis. To tackle this, I’ve been diving into more research and talking to other artists. At the Flux Festival, I had some really insightful conversations about AI. My discussion with Rashaad Newsome stood out—he told me about creating an AI based on the teachings of Black revolutionaries. When I shared my plans to create a model, he encouraged me to look into wrappers or consider using existing tools like ChatGPT. He pointed out how hard it can be to develop something from scratch and reminded me that many early models were trained on incredible datasets. This advice aligns with some of the ideas I’ve been exploring in Sarah Ciston’s work, especially A Critical Field Guide for Working with Machine Learning Datasets.
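For my own notes, here is roughly what a wrapper could look like: a thin layer of my framing around a model that already exists, instead of training something from scratch. I am sketching it with the openai Python library; the model name, the prompt, and the function are all placeholders, and none of this is decided.

```python
from openai import OpenAI

client = OpenAI()  # reads an OPENAI_API_KEY from the environment

# The "wrapper" is mostly this framing: asking an existing model to hold space
# the way a tea night does, rather than hand out answers or pretend to be human.
SYSTEM_PROMPT = (
    "You are a slow, gentle conversation partner at a tea night about love. "
    "Ask open-ended questions, reflect back what you hear, and never present "
    "yourself as a replacement for a person."
)

def tea_night_companion(message: str) -> str:
    """Send one visitor message through the wrapped model and return its reply."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; any hosted chat model could sit here
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": message},
        ],
    )
    return response.choices[0].message.content

print(tea_night_companion("What is love?"))
```

What the sketch made obvious to me is that the interesting decisions live in the framing and the soft data I bring to it, not in the plumbing.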
Ciston’s writing has been super helpful in reshaping how I think about data and machine learning. For example, she talks about the “promise of machine learning to convert data into meaningful, monetizable information,” while also questioning this framing. She draws on philosopher Sabina Leonelli’s idea of data as a “relational category,” which means that what counts as data depends on who uses it, how, and for what purpose. Leonelli describes data as “‘any product of research activities [...] that is collected, stored, and disseminated in order to be used as evidence for knowledge claims’ [8]. This definition reframes data as contingent on the people who create, use, and interact with them in context.” That reframing really hit me: data is deeply connected to the people and contexts that shape it. These insights feel especially relevant to my goal of creating an installation that humanizes AI and centers on the relational aspects of data and technology.
The more I research, navigate, and talk to people, the more I learn and piece together my thesis. Below, I have compiled a complete list of references and research:
With my family, I spent my summer thinking about love.
With bell hooks, we talked All About Love.
With Ellie Schmidt, we learned about love in Alaska.
With Lauren Lee McCarthy, we delved into her art practice using installation and performance.
With Olafur Eliasson, we talked about openness and beauty.
With DJ Johnson, we considered dinner parties as a serious and thoughtful form of research.
With Zeynep Abes, we talked about the significance and confidence of holding space for love as a topic.
With my friends, we discussed presence, well-being, and love over late-night tea and animal crackers.
With Alice Yuan Zhang, we discussed the breadth of love and the search for scope.
With Johans, we thought about the specificity of an ethnography.
With Stephanie Dinkins, we considered her art practice of using AI critically.
With Carrie Chen, we defined scope and the practice of creating a “happening”.
With Kim Zumpfe, we talked about feedback loops as a throughline and ways to represent that interactively.
With Rashaad Newsome, we considered AI wrappers as a technical solution.
With Sarah Ciston, we shared technical resources on AI and datasets and considered ethnographic contexts.