Students come to terms with how to use AI in the classroom
In classrooms across East Carolina University’s campuses, faculty and students alike are grappling with how to use artificial intelligence (AI) tools in the classroom — how to balance new ways of teaching and learning with the imperatives of ethical behavior.
Dr. Will Banks, a professor of English who directs ECU’s University Writing Program, believes that AI will disrupt education, but it’s not all doom and gloom. In the composition classes he teaches, the amount of revision — the rethinking and reworking of the ideas that give life to writing — is so great that even if students begin from AI-generated text, the end result will have changed so radically that it no longer resembles the original output.
Besides, Banks said, students have always received help with their schoolwork outside of the classroom.
“Roommates, boyfriends and girlfriends — the frat house news file. There has always been a space for borrowing work,” Banks said. “Nothing students are going to write in a first-year writing course means anything in and of itself. The purpose isn’t to create a beautiful text, it’s to learn about how to ask questions, how to do inquiry.”
Banks stressed that he isn’t going to police anyone; if students want to shortchange themselves, that is their choice. But in recent classes he’s taught, none of the writing students have turned in appears to have been composed by generative AI.
“That may be how I construct the prompts. I require students to use texts from the class. They can take it all out and plug it into AI and do that work, but for a 200-word response, the juice isn’t worth the squeeze,” Banks said.
Dr. Anna Froula, who teaches film in the English department, wondered how reasonable it is to ask teachers in classrooms from middle schools to the university level, who are already overburdened with assessments, the psychological health of their students and never-ending reports, to take on the challenge of effectively teaching themselves how to integrate AI into their courses.
“The job that I’m doing now has so little to do with the job I trained for,” Froula said.
Dr. Desiree Dighton teaches English at ECU with a focus on how technology can be used to extend social capital to marginalized groups. When ChatGPT and other AI technologies were emerging into public awareness a few years ago, her initial response was to disallow their use by students. Her position has changed in recent years, as AI’s usefulness is becoming better understood.
“This year, my policy is that they can use it with permission, they just need to communicate to me what they’re doing and realize that it may involve extra steps on their part in terms of being transparent,” Dighton said. “It also comes with the responsibility — I have to teach them and model what that could mean in the classroom. My graduate students are largely working adults and many of them are already using ChatGPT casually in their workplace.”
Compassion is a big part of the solution, Dighton said, both for students and faculty. She knows the multitude of pressures students are under and, like Banks, sees AI as a tool that might be able to help in the composition process. The faculty leading those students also need support, Dighton said, with discipline-specific professional development opportunities so teachers aren’t forced to teach themselves on the fly.
“Do we care if students use ChatGPT period, or do we want them to use it assistively to get past writer’s block or to see if the forum post that they originally wrote is going to meet the criteria in our rubric?” Dighton said.
Dr. Michelle Eble, a professor of English who focuses on technical and professional communication, wonders if AI will help educators and students alike realize that when writing is done properly, there is no single author of a text. Each paper has a primary writer, of course, but also editors, revision partners and, in professional communications, ghostwriters. Where is the dividing line between violating academic integrity — plagiarism — and using available tools to create the best possible end result?
“It’s not as clear cut as I think we like to believe,” Eble said.
What Do Students Think?
Dakota Hamm, a junior from Mayodan, north of Greensboro, shares Dighton’s concern that the energy required to power AI computing is bad for the environment, but he also recognizes that machine learning tools are here to stay and can be helpful for learning if used ethically.
Taking notes and summarizing texts are two ways that Hamm uses tools like ChatGPT, along with getting a jumpstart on organizing his ideas and shaking off creative blocks.
Hamm said he wouldn’t consider cutting and pasting schoolwork from AI tools because it would be obvious he had done so: “it talks very robotically and loses all kinds of personality.” For a recent assignment in a mass media class, he asked ChatGPT for ideas for a unique, competitive business. The responses were helpful, but only as idea germination rather than a polished submission for the assignment.
Abby Trzepacz moved to Greenville from New Hampshire for school. She didn’t start using AI tools until last year because she felt guilty, unsure if having ChatGPT summarize overly complicated academic articles or generate ideas would be seen as cheating. But she has since become more comfortable wading through the ambiguities of the tool.
Trzepacz sees things changing for the better. Where her professors once banned AI use outright, most are now more comfortable with students exploring the technologies, as long as the students are open about how they are using AI. Her peers are also growing more comfortable with it.
“It wasn’t until recently that I, and a lot of my friends, are also using it and it’s actually helpful to help us, to evolve ideas,” Trzepacz said. “As long as you’re not using it to cheat and get off easy, I think it’s totally OK to use.”
Garrett Moore, Dighton’s teaching assistant, has significant experience with AI gained through his coursework in digital media, specifically game development and 3D animation. In writing classes, he primarily uses AI tools to double-check grammar and sentence structure — not unlike many students — but his involvement with AI tools extends past polishing the final draft of an essay.
The Greensboro native is involved in serious game development with the College of Nursing, where he has used AI tools to refine programming scripts and support quality-assurance testing.
“There are a lot of efficiencies that can be utilized with it,” Moore said.
Moore said he checks the syllabus at the start of each course to learn the individual instructor’s rules, and he feels that most of the professors he has studied under are weighing the ethical implementation of AI technologies.
“They are considering how programs need to adapt to the usage and implementation of AI and consider the career goals of their students,” Moore said.