In an age when technology permeates every aspect of our lives, the conversation around artificial intelligence (AI) and academic integrity has become increasingly pertinent. A notable New York magazine piece titled “Everyone Is Cheating Their Way Through College” sparked debate in mid-2025 about the implications of AI in education. The alarming trend of students using AI to cheat rather than to learn reflects deeper issues within our academic systems.
The issue, however, isn’t merely that students are using AI; it’s how they use it. I engage with AI regularly; it’s an essential skill for today’s learners. The crux of the matter is students claiming AI-generated work as their own. Ultimately, the debate over academic integrity and AI isn’t a technological one; it’s fundamentally ethical.
I am an avid supporter of generative artificial intelligence and use it for all kinds of tasks: planning workouts, finding recipes, outlining articles, and writing scripts that turn data into visuals. The range of applications is vast. Employed judiciously, AI enhances productivity; misused, it magnifies mistakes. The debates surrounding academic integrity compel us to confront our true selves and our intentions.
This discourse often splits into two counterproductive camps. One faction likens AI to a calculator; another views it as a threat to human cognition. Both oversimplify the complexities at play. The “just a calculator” narrative overlooks how thoroughly calculators and similar tools already relieve us of quantitative work, and knowing how to operate a tool is not the same as understanding what it does or why it matters. Conversely, the view that AI marks the end of thought disregards its potential as a valuable aid. If AI serves as a collaborator, that’s beneficial; if it functions as a mere substitute, that’s detrimental.
At the heart of the issue lies not the tool but the user. Like any other instrument, AI can be wielded effectively or harmfully. In the hands of a skilled craftsman, an axe builds a shelter; in the hands of a villain, it becomes a weapon. This distinction is essential.
In 2023, when most of us were just beginning to interact with AI, I wrote an article aimed at the skeptical student who asks of the humanities and other non-vocational studies, “when am I ever gonna use this?” My response remains the same: “every time you make a decision.” The choices we make are reflections of our identities, shaped by the influences around us. Engaging with history, philosophy, literature, economics, and the liberal arts is about enriching our lives and honing our judgment, allowing us to navigate critical decisions with wisdom.
That art of judgment is easy to neglect in our era, when it’s all too simple to outsource our thinking to AI models like ChatGPT and Gemini. To illustrate, consider the iconic film Aliens. If you haven’t watched it, I urge you to do so. If you have, recall the climax, when Sigourney Weaver’s character, Ellen Ripley, dons a P-5000 power loader suit to confront the alien queen. The suit amplifies her strength, enabling her to do what would otherwise be impossible.
Regrettably, many students approach AI the way someone might use Ripley’s power loader at the gym. You might manage to “lift” 5,000 pounds in the suit, but believing the feat reflects your own strength is self-deception. Submitting work largely created by AI doesn’t build your intellectual muscles; it inflates the numbers while your actual skills atrophy.
Occasionally, using AI resembles having a spotter during squats or bench presses: I find value in AI as a guide that suggests my next exercise. Sadly, far too many students lean on AI as a crutch, letting it carry the load for them.
Tools like ChatGPT, Gemini, Grok, and Claude should free our time for higher-level work rather than conceal our inadequacies. Technology has greatly enhanced my productivity; I dictated the draft of this essay on my phone using wireless earbuds and revised it with tools like Gemini and Grammarly. There’s a clear distinction between that process and submitting AI-generated text. Using dictation and AI to compose and refine is akin to using Ripley’s power loader to move heavy cargo efficiently; passing off AI-generated content as one’s own is like pretending to have had a rigorous workout while relying solely on the suit.
My gratitude extends to ChatGPT, Gemini, Grammarly Pro, and GPTZero.me for their editorial support.