Artificial intelligence (AI) is becoming a prominent element in the educational landscape of Australia.
By 2025, nearly 80% of Australian university students reported using AI in their studies. Figures from abroad are even more striking: a recent UK survey found that 94% of undergraduates had used AI for assessed coursework.
This has sparked considerable concern that students will use AI to cheat on their assessments. However, a new report co-authored with my colleague Leslie Loble suggests the true risk may go much deeper.
Research indicates that relying on AI can short-circuit the effort needed for deep, durable learning. This phenomenon, known as “cognitive offloading,” is particularly risky for younger students who are still building foundational knowledge and skills.
The ‘performance paradox’
Our report introduces the concept of the “performance paradox.” This phenomenon occurs when students experience improved short-term performance on tasks with AI assistance, yet their long-term retention and understanding suffer.
An illustrative case comes from a 2025 randomized experiment involving high school students in Turkey who used an AI assistant for tutoring. While these students performed better on classroom tasks involving math problems, their comprehension plummeted when the AI was no longer available during assessments.
This evidence suggests that although AI may enhance immediate results, it simultaneously erodes the durable knowledge that is essential for genuine education. Consequently, students may overestimate their understanding, as AI creates an illusion of competence.
The simplicity of AI
Generative AI can produce clear, well-articulated responses, leading students to believe that deep mental engagement is unnecessary. Research shows this can create a disincentive for students to engage in planning, monitoring, and revising their work, as the AI handles those aspects.
This dynamic fosters a cycle wherein the convenience of AI-generated answers diminishes a student’s actual knowledge base, increasing their dependency on the tool while reducing their ability to assess its accuracy over time.
Critical thinking is not merely a generic skill; it is deeply connected to knowledge.
For instance, without a solid understanding of the various participants and perspectives in World War II, a student may struggle to critically evaluate a response about the conflict’s history (e.g., questioning biases or factual inaccuracies).
What steps can we take?
To address these challenges, universities and educators must shift their perspective on AI—from viewing it as an “answer oracle” to utilizing it as a collaborative tool in thinking and learning. There are two primary approaches to achieve this.
- Leverage AI to offload non-essential tasks, such as grammar checks or citation formatting. This allows students to focus on deeper learning without becoming overly reliant on the AI for analytical thought.
- Employ AI as a “cognitive mirror.” Instead of giving answers outright, the AI can pose clarifying questions, encouraging students to elaborate on their arguments and thereby fostering lasting understanding. For instance, if a student presents a vague claim in an essay, the AI could prompt them to specify their core assumptions.
Crucially, the development of AI tools should prioritize enhancing the capabilities of teachers rather than solely focusing on students’ immediate performance. Despite AI’s power, humans learn best through interaction with other humans.
By providing AI tools that empower experienced educators to enhance their effectiveness, we can ensure that technology supports student learning. For example, AI could analyze student performance data in real time to identify those individuals or small groups needing the most immediate human intervention.
The ultimate goal
Educational systems must help students understand and embrace the idea that genuine learning is gradual and requires effort. If AI becomes a substitute for the productive struggle of learning, we risk diminishing students’ cognitive skills.
The objective is not to shield students from AI but to equip them to coexist and thrive alongside it in their future careers.