Students will soon be allowed to use generative AI at UGDSB
A PA day on April 24 will focus on AI literacy, guidelines and student use
GUELPH – Certain artificial intelligence (AI) tools are set to be approved for widespread student use at the Upper Grand District School Board (UGDSB) later this month.
That includes Google Gemini, Notebook LM and Microsoft Copilot – generative AI tools powered by large language models.
The board is expected to approve student use of the tools following a professional activity day on April 24, during which teachers will learn about AI literacy and the board’s AI guidelines.
Increasing AI literacy is key, board officials say, to teaching students to think critically and to use AI in safe, ethical ways that add value rather than replace thinking.
“Once you learn how to read, then you can read anything. Once you learn about AI – how it works, the pitfalls, the potentials, the implications for human rights … you can use any tool effectively,” said superintendent Pat Hamilton.
The UGDSB has been slower than some boards to release AI guidelines, a delay he said is partly intended to avoid repeating society’s mistakes with social media.
“We didn’t spend a lot of time thinking about the impact,” or the critical thinking needed to use social media effectively, he said.
The board is planning to host parent engagement evenings about AI before the end of the school year, Hamilton said, and officials are working to be transparent about its use at the board.
Concerns about AI
Hamilton’s biggest concern is that AI “will become a weapon instead of a tool,” perpetuating misinformation and human rights issues, which he said is already happening in some ways.
“But it’s too late to turn back, so we better figure out how to use it and understand it,” he added.
Online learning and AI literacy implementation principal Keith Coutu is most concerned about students misusing or over-relying on AI. It’s essential not to take AI at face value, to understand its inherent biases and inaccuracies, and to analyze and fact-check responses, he said.
Large language model responses are just averages based on available data, Coutu said, so if the data contains bias, the AI will, too.
AI also mirrors the user’s inherent bias, said Centre Wellington District High School English teacher and department head Alex Kempenaar.
“It seems more likely to push you towards sources and things that confirm ideas you already hold rather than challenging those ideas,” said Kempenaar.
Hamilton said human rights are central in the board’s AI approach.
That includes privacy and security, which is why only Google’s generative AI tools have been approved. Board officials aren’t confident other companies meet the same level of protection.
“If we are going to make something available to students and educators across the system, we need to be confident that the privacy and cybersecurity things are in place,” Hamilton said.
UGDSB officials are working on creating their own large language model, or chatbot, said UGDSB program lead Adam Barnard.
That should ensure “it’s drawing from stuff that’s safe and has good practices embedded in there,” he added.
Potential benefits
Coutu believes AI will “revolutionize” how curriculum and materials are accessed and how learning is demonstrated.
AI can increase accessibility with speech-to-text and translation programs, and teachers can use it to personalize materials.
Coutu uses AI to help his son, who is in Grade 2 and has ADHD and autism, learn to read.
Generic phonics reading materials can be pretty dry, so Coutu pops a few prompts into Google Gemini to create text that meets his son’s interests and reading level.
AI can also be used to change the reading level of a particular text, Kempenaar explained, and learning activities can be diversified to suit a range of students.
Plus, teachers can use AI to streamline research on best practices, he added.
AI can also help with simple, time-consuming tasks such as formatting documents, added Barnard.
And it can help teachers analyze student struggles and suggest secondary work to support them, Hamilton said.
Coutu said he thinks there’s a misconception that teachers are using AI to grade assignments or to make decisions, which he stressed is not the case.
“Decisions about student programming, IEPs or placements are never made by AI,” he said. “Professional judgement must always be the final arbiter, with a lens of equity and human rights."
The student experience
Kempenaar said in many ways, students are more aware than teachers about how AI impacts them, and are often able to point out limitations and identify misinformation.
While students are interested in AI, they also appreciate tech-free teaching, Barnard said. “They don’t see it as something that would replace a teacher.”
“It will never replace humans – it can’t – and that’s obvious when you use it,” Hamilton added.
Students’ concerns often focus on environmental impacts, such as how much water AI uses, Barnard said.
“But we have industries that have been using water for hundreds of years, like concrete and steel,” he said. “And there’s too often clickbait out there right now that [AI] is using up all our water, when that actually isn’t true.”
The board recently brought in an expert to talk to Grade 7 and 8 students – Kevin Matsui from the Centre for Advanced Research of Ethical Use of Artificial Intelligence at the University of Guelph.
Barnard said Matsui challenged misinformation and outlined AI’s environmental impacts as well as how it can be used to mitigate climate change and support sustainability.
“There are two sides to this, it’s not a doom-and-gloom thing,” Barnard said. “We’ve tried to shy away from that fear-based learning and look at it with more of a hopeful approach.”
Classroom use
AI in UGDSB classrooms isn’t new. Students and teachers have already been using Google Read&Write, OrbitNote, Google Translate, Immersive Reader, Lexia and Knowledgebook to increase accessibility and learning, Coutu said.
And teachers have been encouraged to use board-approved AI tools to diversify teaching and better serve student needs, Kempenaar said.
He noted he’s heard a range of responses from teachers about AI, from cautious enthusiasm to a belief it doesn’t belong in classrooms at all.
In the earlier days of AI, he said, “there was a lot of fear about what that meant for the human side of things.” But it seems many teachers, including himself, are “coming around to the idea that AI is a tool like any other,” with benefits and limitations, he added.
Kempenaar said teachers are working to wrap their heads around how students will be required to use AI after high school, so they can adequately prepare youth for future career paths.
AI use is up to individual teachers’ discretion, Barnard said.
If teachers are concerned a student may have used AI inappropriately, they ask about the process, Coutu said.
AI detection tools are not permitted, Coutu noted, “as they are notoriously unreliable and biased, particularly against multilingual or neurodivergent learners.”
Kempenaar said after a few years of exposure to AI-generated text, it has become “clear to teachers … when something is computer-generated, and when it’s missing those human elements of critical thinking.”
He explained he’s teaching students that while AI is helpful in some ways, such as increasing efficiency, “it can’t replace human elements that make some of the things we produce really valuable.”
If a student uses AI dishonestly, it is treated as an academic integrity issue, the same as if a student claims another person’s words are their own, said Kempenaar.
But the board’s approach to AI isn’t focused on policing academic dishonesty; it’s about teaching ethical and responsible use, he said.
“Critical thinking has always been a goal in education, and AI makes that even more necessary,” Hamilton said.
He added it’s a reminder to “always question what you read.”