On March 28, 2026, I had the pleasure of joining educators from across Canada for the National Day of Learning, hosted by Let’s Talk Science. This one-day, nationwide professional learning event brought together K–12 teachers, post-secondary educators, and policy leaders to explore some of the most pressing issues shaping education today, with artificial intelligence high on the agenda.
I was invited to deliver a session titled “Interfacing with the Future: Wearable AI and Academic Integrity for K–12 and Higher Ed.” What follows are a few reflections and key ideas from that conversation, hosted by Dr. Alec Couros.
Moving into the Postplagiarism Era
One of the central ideas framing my talk was postplagiarism: a reality in which artificial intelligence is no longer an external tool that students occasionally use, but one that is embedded in everyday life and learning.
Students are already engaging with AI in ways that challenge traditional notions of authorship, originality, and academic work. The question is no longer if students will use AI, but how.
This shift requires a corresponding change in how we think about academic integrity. Detection and surveillance, long relied upon as primary strategies, are no longer sufficient. Instead, we must rethink how we design learning environments that foster integrity from the ground up.
From Tools to Wearables: How AI is Advancing
A key focus of my presentation was the rapid evolution from AI tools to AI wearables — particularly smart glasses and other forms of cosmetically invisible interfaces. The talk was based, in part, on our recent article in Canadian Perspectives on Academic Integrity.
Wearable technologies integrate AI directly into our physical experience of the world. Rather than pulling out a device, users can access real-time information, transcription, and prompts seamlessly through their field of vision.
This shift introduces both opportunities and tensions:
- Cognitive offloading: Learners can reduce mental load by accessing information instantly. (Phill Dawson has done some great work on cognitive offloading that I recommend reading.)
- Enhanced presence: Wearables allow users to maintain eye contact and engagement without device distraction.
- Efficiency gains: Tasks such as note-taking or translation can be automated in real time.
At the same time, these benefits come with real challenges, including information overload, privacy concerns, and technical limitations. More importantly for educators, they fundamentally disrupt assumptions about what it means to “know” something independently.
New Technology ≠ Cheating
One of the most important messages I emphasized is this: new technology does not automatically equal academic misconduct.
If a tool is permitted, then its use is not cheating. The real issue lies in unauthorized use or misuse in ways that create unfair advantage.
We must also remain attentive to equity and accessibility. Some wearable technologies may be used as accommodations, making it essential that our integrity policies are inclusive and nuanced rather than rigid and punitive.
Designing for Integrity (Not Surveillance)
Rather than doubling down on detection, I encourage educators to shift their focus toward designing for integrity.
This means:
- Prioritizing assessment validity: If an AI system can complete a task without genuine understanding, then the task itself needs to be rethought.
- Moving beyond “gotcha” approaches: Surveillance-based strategies erode trust and are increasingly ineffective.
- Supporting diverse learners: Students bring different technological access, needs, and experiences. Our designs must reflect that.
- Building a culture of integrity: Integrity is not enforced; it is cultivated through meaningful learning experiences.
Bridging K–12 and Post-Secondary Education
Another key theme was the gap between K–12 and post-secondary expectations.
In K–12 environments, students are often encouraged to explore technology as part of their learning. In contrast, post-secondary institutions frequently operate under the assumption that students already understand complex academic integrity rules.
As AI continues to evolve, this gap becomes more pronounced. We need stronger alignment across educational sectors to ensure that students are supported, rather than being set up for failure, as they transition between systems. (Myke Healy has a great paper on the topic of GenAI in the K–12 context that is worth reading.)
Looking Ahead
If there is one takeaway from this experience, it is this: wearable AI is not a future scenario. It is already here.
As educators, we are being called to respond not with fear, but with thoughtful, research-informed approaches. The challenge is not simply to manage technology, but to reimagine teaching, learning, and assessment in ways that remain meaningful in an AI-integrated world.
Events like the National Day of Learning remind me of the power of community. Bringing educators together to share ideas, ask difficult questions, and explore new possibilities is essential as we navigate this rapidly changing landscape.
Thank you to Let’s Talk Science and to Dr. Alec Couros for the opportunity to be part of this important conversation, and to all the educators who continue to lead with curiosity, courage, and care.
______________
Share this post: Interfacing with the Future: Reflections on the National Day of Learning 2026 – https://drsaraheaton.com/2026/04/01/interfacing-with-the-future-reflections-on-the-national-day-of-learning-2026/
Sarah Elaine Eaton, PhD, is a Professor and Research Chair in the Werklund School of Education at the University of Calgary, Canada. Opinions are my own and do not represent those of my employer.
