After another year of AI being mainstream in Australian education, what do those on the ground really think? We asked tutors from across Learnmate to reflect on what's changed, and whether AI is a helpful tool or a harmful shortcut.
AI in mainstream education
When ChatGPT exploded onto the scene in late 2022, schools and universities scrambled to adapt. By 2024, debates around its place in education were widespread. Now, with the 2025 school year drawing to a close, we asked a handful of tutors on Learnmate, who work directly with students every week, to reflect on the real-world impact of AI in Australian classrooms.
What we heard was sobering, insightful, and thought-provoking.
Here's what they had to say.
“The State of Learning Is Already Appalling — AI Is Just the Latest Shortcut”
Nicholas C, who has worked in education for more than a decade, warns that the AI debate may be distracting us from a larger crisis:
“The current state of learning, attention, literacy and basic numeracy in students aged 10–16 is absolutely appalling — and it was well before AI emerged.”
He describes a generation of students struggling to focus, think independently, or complete basic academic tasks without assistance, a challenge he attributes to years of digital overstimulation from phones and screens.
For Nicholas, AI is simply another tool that reinforces shortcuts rather than skills:
“Some students can’t even interpret what the bot gives them — they lack the literacy or logic to understand the answer. How is that helping?”
He has implemented a full AI ban for students under Year 12, instead focusing on handwritten work and close supervision to rebuild independent thinking and core skills.
“Students No Longer Have to Think for Themselves — It’s Obvious When They Don’t”
Another tutor on Learnmate, who supports students in Maths and Science but asked not to be named, offered a blunt assessment:
“I absolutely hate AI. It is actively making our kids dumber.”
They report seeing clear signs when a student has relied on AI: language that doesn’t match their ability, wrong answers using unfamiliar methods, and most concerningly, a refusal to even attempt thinking through a problem.
“If kids are not taught to think critically, they never will.”
“It’s Faster — But Often Wrong, Incomplete, and Unreliable”
Cheryl G, an English tutor, is similarly critical. She cautions that AI platforms too often replicate the very unreliability we try to steer students away from; while they are improving, in their current state they still lead students into error:
“We’ve told them Wikipedia isn’t reliable — and now we tell them to use AI, which draws from Wikipedia. It’s contradictory.”
Her bigger concern, though, is ethical: AI platforms rarely attribute sources, making them — in her view — a kind of plagiarism engine.
“Just because something exists doesn’t mean it should be used.”
“Let’s Face It — AI Is Here. The Question Is: Can We Teach Students to Use It Ethically?”
Magdalena F, an English tutor with over 6 years’ experience, takes a more balanced view.
“AI exists, and students will use it whether parents or teachers want them to or not. So the best thing we can do is teach them how it works and how to use it ethically.”
Magdalena has seen firsthand the risks of AI misuse: students failing to cite sources, over-relying on Copilot or ChatGPT without critical thinking, and blindly adopting American conventions (like Oxford commas) without understanding them.
But she’s also seen the benefits when AI is used as a revision tool — not a solution machine.
Positive Use Cases Magdalena Encourages:
- Thesaurus assistant: Helping students rephrase their own sentences to improve clarity and vocabulary.
- Punctuation awareness: Advanced punctuation from AI has sparked curiosity in some students, leading to deeper learning.
- Generating practice prompts: Students can feed in their notes and receive custom essay questions or revision cues, which encourages deeper study.
“There’s Less Original Voice, and More Copying Without Realising It”
Carina C, an English and Maths Methods tutor, has noticed subtle shifts:
“Students are using very similar vocabulary choices. Their writing has less personal voice.”
She also highlights a less visible risk: proofreading software and AI tools like Grammarly can silently ‘fix’ errors, giving students an inflated sense of their actual writing ability.
“When I had my student switch off spellcheck and write by hand, we finally saw the real areas we needed to work on.”
For Carina, AI isn't inherently bad — but without guided awareness, it leads to unearned confidence and surface-level improvements that mask deeper skill gaps.
Are We Building a Generation of Cognitive Borrowers?
Learnmate’s earlier blog explored the concept of cognitive debt — a term coined by MIT researchers to describe the long-term risks of outsourcing our thinking to machines.
As the tutors in this blog suggest, many students are veering in that direction or may already be deep in that debt.
Rather than building mental stamina, students are defaulting to AI before engaging their own brains and independent thought. The result? Homogenised writing. Surface-level understanding. Critical thinking left at the door.
As one tutor put it:
“AI is a tool — not a cheat code. As long as VCE exams remain paper-and-pen, students need to first understand how to do the skills themselves.”
Learnmate’s View: AI Should Boost Learning, Not Replace It
After another full year of teaching and tutoring alongside AI, one thing is increasingly clear to us: AI presents immense opportunity, but equal risk when its use is unguided or deliberately misused. As we highlighted in our earlier blog, "AI, Cognitive Debt, and MIT's Study on Student Learning", the challenge isn't AI itself, but how it's used.
At Learnmate, we see the great promise of AI in augmenting teaching and learning and in automating the administrative tasks that don't meaningfully engage teachers and students. That is to say, when used the right way, AI can be a phenomenal unlock for both students and educators.
For students
AI can be a game-changing study partner:
- It can summarise long notes, create flashcards, or generate essay prompts, saving time while reinforcing knowledge.
- It can serve as a 24/7 Socratic tutor: asking questions, checking understanding, offering examples, or triggering deeper thinking.
- It can enhance vocabulary, support structured writing, and spark curiosity about advanced grammar or punctuation.
For teachers and tutors
AI can alleviate the burden of out-of-lesson hours:
- Preparing lessons, generating differentiated practice questions, writing reports, and creating revision materials can now take minutes instead of hours.
- This allows educators to reclaim their time and focus more on personalised teaching and student relationships.
But here's the catch: AI must not replace cognitive effort.
The calculator is an analogous disruptive technology that posed a similar risk to students' ability to complete calculations and understand arithmetic. But we managed that transition by teaching the fundamentals of mathematics in schools and, above all, by ensuring students could do calculations manually before they were allowed to use a calculator.
We taught, and continue to teach, students not only the principles behind the calculations but also how to operate the calculator itself. That is to say, we ensure understanding and set the framework for when a calculator can be used before allowing it. We teach students to be good at using the calculator, not to be reliant on it.
The use of AI in classrooms should be no different.
Final Thoughts: What Now?
What’s clear from speaking to tutors and reviewing emerging data is this: AI can be a double-edged sword in the classroom. Used properly, it can support learning. Used improperly, it can sabotage it.
Learnmate's view remains cautiously optimistic. The onus is on us to ensure AI is used properly, that it leads to improved educational outcomes, and that it does not accelerate the decline. It is there to support learning, not to substitute for it.
The challenge now — for educators, parents, and students alike — is to ensure we achieve this end.
So what should we do?
- Acknowledge it: Ignoring it or blanket banning it won't stop students from using it.
- Teach ethical and thoughtful use: Help students become critical users who know how and when to rely on AI, and set guardrails for its use. This might include restricting its use to reinforcing concepts or content already learned, or using AI purely for ideation but not implementation (e.g. helping to come up with oral presentation ideas or essay topics).
- Reconsider classroom formats: Consider less digitised, unplugged learning environments. This might include:
  - For older students, using the Socratic/cold-call method and randomly selecting students to answer questions on their feet rather than asking for volunteers.
  - Moving away from online or take-home assignments and back to in-class work under supervision.
- Ensure understanding first: Teachers and tutors must ensure students understand concepts, topics and content before allowing deferral to AI.
AI is here, and it is here to stay. Whether that's in the classroom or the workforce, it is incumbent on all of us to ensure it is used to lift learning, not lower it.
If you are a student, parent or teacher/tutor and have your own thoughts you would like to share, please reach out at hello@learnmate.com.au. We'd love to hear them.



