Educators across Canada are experimenting with new ways to integrate AI into learning.
Artificial intelligence has gone from novelty to necessity in Canadian classrooms, and professors are learning that ignoring it is no longer an option.
Research from KPMG in Canada shows that nearly 60 per cent of Canadian students now use AI for schoolwork, up from just over 50 per cent a year earlier. Globally, the numbers are even starker: an empirical study led by Brock University professors Rahul Kumar and Robert McGray found that more than 72 per cent of postsecondary students used the technology in 2022-23, with self-reported usage climbing to more than 94 per cent this year.
That surge is forcing professors to make quick choices about whether to resist, regulate or embrace the technology. Some worry that tools such as ChatGPT will weaken students’ ability to think critically and creatively. Others argue that ignoring AI risks leaving graduates unprepared for the workplace. And across campuses, faculty are experimenting with new ways to integrate AI into teaching and assessment.
For Prof. Kumar, the scale of adoption is impossible to ignore. Policy, he argues, will always lag behind practice, which means professors must learn to guide students through responsible use rather than ban the tools outright.
“Policy development and guidelines can never keep up with the speed with which AI is evolving and transforming practices,” he says. “There’s tremendous freedom that is both enabling and disheartening to some.”
At the University of Toronto, Prof. Sophia Bello has seen that tension firsthand. Her students were initially reluctant to touch AI tools at all, worried about academic integrity. But when the university introduced an institutional enterprise version of Microsoft’s AI tool, Copilot, she used that moment to encourage experimentation.
“It’s not about using AI to do the work for you,” she says. “It’s all about the reflection of the writing process and teaching them how to use these tools responsibly.”
Her exercises push students to refine and personalize AI-generated drafts and to spot mistakes. This means learning to ask the right questions.
“What does AI bring to the table with this first-drafted prompt response? What do you do to personalize that paragraph? What do you do to correct that grammar using a tool?” she says. The exercise doubles as training in prompt writing and in recognizing hallucinations, instances in which AI presents false information as fact.
That mirrors Prof. Kumar’s concern about critical thinking. He encourages students to use AI openly but insists they must “put their own signature on an AI tool’s output, as opposed to just submitting what the AI has produced.” As he puts it: “AI lowers the effort of variety, but education’s task is to raise the standard of thinking.”
His advice to colleagues: run your assignments through AI, then redesign them.
“Pose assignments you have through AI and see what kind of output it produces. Try to change them in such a way that what students have to submit is not exactly what AI produces,” he says. “We have to ensure that students are learning and the grade I’m awarding and the degrees the university is conferring are not to AI but to the people.”
At the University of Guelph, tax and accounting professor Sonia Dhaliwal has already seen how that adaptation works in practice. Researching cases used to require combing through databases; now AI tools do the heavy lifting.
“It basically turns components of the Income Tax Act into bite-sized portions that students can understand,” she says.
Some educators hesitate to bring such tools into the classroom. Ms. Dhaliwal takes the opposite approach.
“I know a lot of educators who are hesitant and haven’t embraced the use of AI,” she says. “But it is our job as educators to get our students job-ready. And part of that in today’s world does entail getting them up to speed with AI.”
For her, workplace readiness is inseparable from the critical-thinking skills Prof. Bello and Prof. Kumar emphasize. Prof. Bello draws on the STRIVE model, developed at the University of Calgary, to frame AI as part of student-centred learning.
“If we’re putting students at the centre of their learning, learning how to use these tools is going to benefit them in the workplace,” she says. “I believe that whatever we do as instructors starts with learning it ourselves and having fun with these tools, and that can resonate into what our students will learn.”
Prof. Kumar has adopted a personal approach to assessment. To evaluate his graduate students’ depth of knowledge, he schedules unscripted 15-minute one-on-one conversations, tied loosely to the course material. These discussions test their ability to analyze, debate and challenge.
“We must prepare students for a world where the question is not, ‘Can you use AI?’ But rather, ‘Can I trust your thinking without using AI?’” he says.
Resources are emerging to support faculty in this shift: an AI assessment repository compiled by a postdoctoral fellow at McMaster University, teaching examples from the University of Toronto’s Centre for Teaching Support and Innovation and frameworks such as the aforementioned STRIVE. But the professors agree that guidelines will never be enough on their own.
“Students are already using AI, so now it’s about how to get them to use it more ethically, responsibly, such that it enhances rather than diminishes their education,” Prof. Kumar says.