ECU faculty work to understand how to teach writing in the age of AI

When we try to predict the future, sometimes we are way off base. Flying cars, recreational space flight (well, that’s not such a reach anymore) and world peace all seemed right around the corner in the middle of the 20th century.


Dr. Anna Froula, a professor of film studies in ECU’s English department, pauses during her class, English 1500: Taylor’s Version, in October.

One book – 1964’s “How Things Change (Childcraft: The How and Why Library, Vol. 6)” – went so far as to suggest that students of the future who get stuck while studying and “need some answers” might turn to The Answer Machine. Users of this ultramodern device would need only type requests into it to be instantly given answers to their questions and shown photos and videos that enrich the machine’s response.

“Someday, you might have an answer machine that could do all of those things,” the book promised.

That someday is now, and the answer machines in our pockets can do much, much more.

Research has shown the promise of artificial intelligence (AI) for helping second language learners of English elevate advanced writing skills such as “vocabulary, expressions, grammar, and sentence structures.” A 2023 ACT Research study found that about half of high school students surveyed said they chose not to use AI because it couldn’t yet be trusted, and 90% said they didn’t use AI for help with admissions essays because the AI tools didn’t “reflect students’ skills, abilities and unique writing styles.”

Students at the university level, and those soon to be, don’t seem to trust the technology quite yet, but that doesn’t mean it isn’t being used. That doesn’t seem to worry East Carolina University writing experts.

“I’m still trying to figure things out, but what I’m not doing is overreacting,” said Dr. Will Banks, a professor of English who directs ECU’s University Writing Program. Banks has taught writing for two decades and has seen trends come and go, as well as the incorporation of new technologies.

Banks said generative artificial intelligence is just another in a long line of panics about literacy in American history. Putting erasers on the end of pencils threatened to allow students to revise their writing, calculators would rob students of the ability to do math and typewriters would end penmanship.

While most of the past fears haven’t materialized, that doesn’t mean that there won’t be hiccups along the way with the incorporation of AI into learning, said ECU faculty members.

Dr. Anna Froula teaches film in ECU’s English department, where classes are oftentimes writing intensive. One of her first encounters with students using generative AI for a school project was not auspicious — the AI tool hallucinated a reference for an academic journal that she had been editing.

“That caught my attention. I could look back at my records and see that the essay didn’t exist, so that was a really, really unhappy conversation,” Froula said. What’s worse, Froula said, another student’s attempt at using AI to write a paper for a film class completely made up a new plot and characters for a well-known movie.

Those first interactions spurred Froula to attend classes and presentations about how faculty might start to approach using AI, or at least deal with student usage. The first time she asked students to use AI in the classroom, they didn’t like the result. Another class found it encouraging that an AI summary of the films they were studying reminded them of specific plot points or details in the narrative.

Froula said she feels lucky leading film studies classes because, at least thus far, AI doesn’t seem able to handle the concepts or “clunky” vocabulary that she asks her students to consider.

“If we can’t do what AI cannot do then we are basically irrelevant,” Froula said.

Froula has plans for her next special topics seminar, which includes readings from researchers who focus on the technical, ethical and ecological issues that arise from the use of AI. She will ask students to consider how those concepts shape movies like the “Terminator” franchise and “Her.”

A New Tool, An Old Process


Dr. Will Banks, a professor of English, serves as the director of the University Writing Program.

Dr. Michelle Eble, a professor of English who focuses on technical and professional communication, has always incorporated technology into her teaching — first with email and the early days of learning management systems like Canvas and Blackboard — before harnessing computer servers in the mid-2000s to teach students computer-based communication.

The communicative possibilities of the websites she and her students built 20 years ago have shifted to social media, and now generative AI, but the result remains the same — innovative tools for people to use to communicate with others.

“AI is another technology that is going to influence the writing classroom, the teaching of it, the practice of it. So, figuring out what the affordances and constraints are is necessary,” Eble said. “How can we use that ethically and responsibly?”

Eble and her colleagues are currently exploring ways to revisit having a server in the English department, but this time to explore how a scaled down version of a large language model — the technology at the root of generative AI — can be used by ECU students for technical and professional communication.

Eble has taught online, asynchronously, since she first arrived at ECU in 2003. She’s watched as a nascent technology went from being a novelty that most faculty didn’t understand to an indispensable tool that the university couldn’t function without during the COVID-19 pandemic. She sees the same potential in generative AI.

“We’ve done this in the past, right? What can we learn? How can we be curious with our students?” Eble wondered.

Teachers, Eble said, needn’t fear AI, but should learn how to work with the technology to benefit students, and to understand the affordances and constraints of the technology for student success.

“It feels like this new thing that’s going to take over our lives. If you’re going to incorporate it, what’s the learning outcome?” said Eble, stressing that AI is no different than using other technologies to expose students to new ideas they might use after graduation.

Eble believes that there is no real way to enforce a “no AI” policy in the classroom. She’s heard of some instructors forcing students to write in class with a pen and paper, but “what learning outcome is that teaching them?”

Banks believes that most students are overwhelmed by the idea that using AI is illicit, at least when talking to their instructors. But he also believes that when teachers talk to students about appropriate ways to use AI in their writing process, it opens opportunities for growth with a technology that isn’t going anywhere — and one that future employers will expect graduates to know how to use in their work lives.

Students often feel like they are being buried under an avalanche of reading and writing in ways they don’t have experience with, Banks said, and the requirements levied upon students with learning differences can be even more paralyzing. Why shouldn’t students use a tool to help them take their first few steps with a writing assignment by offering an outline or a writing prompt? Especially if the process is as much about revising drafts of writing assignments as developing the first draft.

“In writing studies, we have spent 40 years of research and scholarship helping students to get the first draft. All that this field knows, AI wipes all of that out,” Banks said. “Most of us who teach writing really want to be teaching critical thinking and how to engage a text, audience, purpose and context. How do you craft language to have an impact and not just be a hoop you jump through?”

Banks gave an example of how disruptive technology can be leveraged to solve real problems, which he believes will occur with generative AI, as students and teachers become more comfortable with its use.

In the early days of Wikipedia, most faculty had significant anxiety about students using it, Banks said. As a new faculty member, he sat in on a Native American literature class taught by now-retired professor Ellen Arnold and turned to the internet to learn more about the real people in the literature he was reading.

“I found myself going to the internet every five pages trying to figure out who these people were. What I noticed was none of the authors and events were on Wikipedia because it was so white and so male,” Banks said.

Arnold took him up on an offer to put her students to work to plug holes where information about Indigenous authors was missing on Wikipedia. Banks believes AI will ultimately be used in similar ways to increase access to information, especially for and about marginalized groups.


Dr. Michelle Eble is a professor of English and the coordinator of the rhetoric, writing and professional communication doctoral program.

When they leave ECU, Banks feels students should be comfortable working with AI, but he wonders whether overreliance might shortchange members of a future workforce who use a “shortcut all the time. Will they be able, when the situation requires it, to do the thing they need to do?”

Dr. Desiree Dighton, an assistant professor in the English department, said a major concern for her is building student confidence so that they feel they have agency when using new generative AI tools.

“We have to dispel them of that and show them the critical thinking and writing skills that they may have,” Dighton said. “And teach them to lean into their judgment. They don’t have to take all of the AI’s advice. If they tried to enact every suggestion, they would be exhausted.”

Along with Froula, Dighton is concerned with the environmental harm that comes from the amount of energy required to run AI server farms.

“We also have to teach them a wider perspective — it’s not just about what a technology does to benefit them, but the tool’s effect on the planet and unfair labor practices in other countries. I think that a lot of our undergraduates really care about the environment, and they may choose to use it less,” she said.

Banks is thankful that the university is hosting professional development opportunities to demonstrate ways that AI has been used by other faculty as he has limited understanding of how AI functions. The technology is still in its relative infancy and clumsy with the responses it provides.

Banks said peers across the nation are developing grassroots solutions to fill in the gaps in teaching strategies for writing in academic spaces with AI.

Dighton believes that the compassion faculty show students learning to navigate AI should extend to fellow faculty members as well. Teachers who haven’t yet learned how to use AI might not be in the best position to model responsible use of it in the classroom.

“We need to embrace the responsibility that we can’t just stick our heads in the sand and not expect students to misuse it because they haven’t been shown or given firm boundaries about how we expect them to use AI in the classroom,” Dighton said.
