Communication research will study impact of AI in entertainment
Spend any time online lately and artificial intelligence (AI) will offer to answer your internet search query, summarize lengthy documents, improve the tone of your email, and handle seemingly endless other tasks. But the research around this technology and its uses is still emerging.

From left, School of Communication faculty members Erika Johnson, Rachel Son, and Drew Ashby-King are studying artificial intelligence use in entertainment and how people react to it. (Photos by Steven Mantilla)
Now three East Carolina University School of Communication faculty members plan to study the use of AI in entertainment, and how that affects people’s understanding and perception of the technology.
Drew Ashby-King, Erika Johnson, and Rachel Son will study this aspect of AI with a $10,000 College of Fine Arts and Communication Research and Creative Activity Award. These annual awards support innovative research and creative projects by college faculty members.
The project centers on a social experiment and competition TV program in which participants live in separate apartments and interact only through a social media app, using profiles they create to portray themselves however they choose. This can lead to players presenting themselves differently than they are in reality, a practice known as “catfishing.”
This reaches a new level in one season of the show, when a new participant is an AI chatbot. The audience knows that AI has entered the chat; the other participants do not.
Ashby-King, Johnson, and Son want to study how viewers react to this knowledge, and how their views of AI are affected — or not — by seeing AI used in this way. Son has spent years researching various aspects of AI and entertainment and, to her knowledge, this is the first program to take on this use of AI, in which the chatbot reportedly developed itself to fit the program, and reacts in real time.
“I focus a lot on the entertainment experience and how it affects your well-being,” Son said. “That’s where it started, trying to understand how this alters your entertainment experience.
“But then it extends beyond that to understand how exposure to AI in this way has an effect on how we understand AI or our perceptions of AI. Are we less apprehensive, or do we feel like we have a better sense of AI literacy?”
Alongside this, the trio wants to know how this experience affects users’ well-being.
“Entertainment is often used for things like recovering or relaxation,” Son said. “How does the use of AI embedded within our entertainment affect that? How does this alter your entertainment experience?”
Son’s previous research has found that audiences “were done” when they found out a piece used AI, and would not engage with it, showing negative perceptions of the technology. Those pieces were written stories, and Son is eager to see how the audience will react to AI use in the audio-visual format of a TV program.
Ashby-King said he imagines most people they will survey during the study haven’t thought about this type of AI use; when they send a message online, they don’t think about whether a human or a chatbot will receive and reply to that communication.
“I think most of us are still functioning on the perspective that there are people on the other side of this, but the reality is that’s not the case,” he said. “We’re seeing AI-generated video in particular on social media. I think there are some interesting implications of what we can say about the larger impacts, based on the findings.”
The three have thought about many possible implications and findings, some of which expand beyond this project and could spark additional research. Examples include how someone with psychopathy or narcissism reacts to AI, and different categories of morality, such as whether the audience would consider a human catfish to be more moral than the chatbot.
In addition, Ashby-King said an interesting literacy-related outcome could be the actions ECU and other institutions might take if exposure to AI in entertainment settings increases AI literacy. He said institutions might add an AI module to the safety and other training students must complete when they start their education.
“Do we start to expand that exposure early on, adapting it to the college setting?” he said. “And do we need to try to find interventions to increase this AI literacy, because it’s going to be beneficial across the college experience?”