As generative artificial intelligence becomes increasingly prevalent in education, the debate over its role in academic writing intensifies. College professors are grappling with the implications of this technology and its impact on student learning.
For English professor Dan Cryer, using AI to write college essays is like bringing a forklift into a gym.
“If all we needed was to move weights, then that would be fantastic,” says Cryer, who teaches at Johnson County Community College near Kansas City, Kansas.
“However, we need to develop our muscles, and the writing process is crucial for students to strengthen those skills.”
Cryer says AI has created a new challenge for educators like him: determining whether a student’s work genuinely reflects the student’s own effort. The problem is compounded by the fact that his community college, like many others nationwide, gives students access to AI tools.
“These tools create an unfair burden for students, as they must navigate the line between responsible and irresponsible use of AI,” Cryer explains.
In the three years since ChatGPT’s introduction, generative AI has woven itself into daily life, compelling both professors and students to explore its appropriate applications, particularly in humanities courses.
Recent data indicates that many students are embracing AI: a survey conducted last July by Inside Higher Ed and Generation Lab found that roughly 85% of undergraduates use AI for academic tasks such as brainstorming ideas, outlining papers, and studying for exams. About 19% reported using AI to write entire essays.
Many students who experimented with AI expressed mixed feelings, acknowledging its utility while also feeling it hindered deep thinking.
Aysa Tarana, a recent graduate, recalls her first year at the University of Minnesota Twin Cities coinciding with ChatGPT’s launch. Initially, she turned to the AI for minor tasks like topic suggestions.
However, Tarana eventually stopped using it, feeling that it led her to “outsource my thinking, which was uncomfortable.”
This concern aligns with Cryer’s worries.
After dedicating a sabbatical to studying generative AI, Cryer reached a conclusion of his own: he advocates for minimal use of AI tools in educational settings.
“These tools seem primarily designed to alleviate the mental effort required,” he observes.
Cryer now focuses more on convincing students of the importance of engaging deeply in the writing process. He emphasizes that the aim of education is about the journey, not just the outcome. “We don’t need more college essays; we need students to engage in writing research papers to enhance their critical thinking, construct coherent arguments, and discern good sources from bad,” he states.
If students rely on AI to complete their assignments, Cryer warns, they may miss out on the educational experience they signed up for.
A Professor Who Embraces Generative AI
In Charlotte, N.C., Leslie Clement has come to regard generative AI as a valuable tool that can bolster student learning.
“We encourage our students to use it responsibly because we know they will utilize these tools,” explains Clement, a professor of English, Spanish, and African Studies at the historically Black Johnson C. Smith University.
Clement permits students to use AI for crafting outlines, gathering feedback on concepts, and comparing diverse sources of information.
She also co-developed a course titled “African Diaspora and AI,” exploring how AI affects individuals of African descent worldwide, including the hazardous mining in the Democratic Republic of Congo of cobalt, a critical component in AI hardware. The course delves into AI’s potential benefits and celebrates the contributions of Black researchers and scientists.
“We’re exploring Afrofuturism and how students can leverage these tools to envision their futures,” Clement asserts.
Her primary goal remains to cultivate critical, ethical, and inclusive thinking, encouraging students to apply these principles when engaging with AI technologies.
“I want my students to not only use these tools responsibly but also to interrogate their implications,” Clement emphasizes.
The AI Study Buddy
Several hours northeast, in Durham, N.C., pre-med student Anjali Tatini has found her own ways to put AI to use. As a double major in global health and neuroscience, she finds AI tools invaluable for grasping complex subjects.
For instance, during a particularly challenging biology course last semester, Tatini turned to Gemini, Google’s AI chatbot, seeking clarification.
“I’d say, ‘Here’s the concept—can you explain it?'” Tatini remembers. “If the explanation was too advanced, I could ask it to simplify, which was incredibly beneficial.”
In courses such as chemistry, she uses AI to generate practice problems for exam preparation; in marketing, she uses it for brainstorming; in statistics, she uses it to help with coding for data analyses.
Having an on-demand tutor is a game changer for Tatini, especially given her busy schedule. “I have jobs, classes, and clubs. I can’t always attend office hours,” she states. “It’s refreshing to have a resource that is accessible on my timeline and responds as a person might.”
However, Tatini draws the line at having AI draft her assignments for her. While she employs AI for outlining and organizing her ideas, the actual writing remains her responsibility.
“If I’m putting something out there, I want to stand behind it. I would never let AI write it because it wouldn’t represent my style,” she explains.
‘Your Output is Like a Fingerprint’
Nearby, in Chapel Hill, junior Hannah Elder, a pre-law student at the University of North Carolina, takes pride in her writing.
“I passionately believe in forming and articulating your own thoughts,” she asserts.
Elder, who takes a mix of courses including public policy and philosophy, utilizes generative AI mainly for proofreading and aligning her work with course rubrics.
However, she firmly believes in crafting her ideas without AI’s assistance. Learning to articulate her beliefs through writing is, in her view, one of the most valuable aspects of her educational journey. She worries that excessive reliance on AI could hinder students from developing independent critical thinking.
“I still use paper for my notes because I believe what you write is like a fingerprint in the world. There’s a risk of that authenticity being lost,” Elder remarks.
Nonetheless, she doesn’t advocate for a complete ban on AI.
“We cannot ignore that AI will be a facet of college life,” she says.
Elder hopes educators will weave AI training into the curriculum, empowering students to discern between beneficial and detrimental uses of the technology.
“If teachers incorporate it responsibly in academic settings,” she suggests, “it will be viewed less as a shortcut and more as a reality—an opportunity to learn how to use it effectively.”
This reporting was supported by a grant from the Tarbell Center for AI Journalism and the Omidyar Network’s Reporters in Residence program.