Research Integrity Oversight in Canada: A Postplagiarism Perspective

April 11, 2026

The Canadian Panel on Responsible Conduct of Research (PRCR) is proposing substantive changes to Canada’s research integrity framework, and the public comment window closes April 17, 2026. If you care about research ethics in this country, you have days left to weigh in.

I want to flag a few things about these proposed changes and why they matter to those of us working in postplagiarism research.

The most consequential proposal is the removal of any statute of limitations on allegations of research misconduct. As attorney Minal Caron told Retraction Watch, the existing policy is silent on this question. The proposed language would require institutions to review allegations regardless of how much time has passed since the work was published, which would be a significant shift. It’s also a long-overdue one. Complainants often delay coming forward out of fear of retaliation, and a policy that turns away allegations on procedural grounds protects no one except those who benefit from institutional inaction.

The PRCR also proposes to require institutions to hold respondents accountable even after they have left, and to accept anonymous allegations and allegations already circulating in the public domain as grounds for review. These aren’t radical ideas. They’re basic conditions for a credible oversight system.

I’ve written and spoken at length about how postplagiarism requires us to rethink accountability in an age of AI. But accountability without enforcement infrastructure is a philosophical position, not a policy. These proposed changes represent a concrete attempt to build infrastructure. They will not resolve every tension in Canadian research oversight, and the critics quoted in the article are right to flag gaps, particularly around the vagueness of institutional RCR education requirements.

One of the scholars quoted in the Retraction Watch piece is Gengyan Tang, a PhD candidate and a member of our Postplagiarism Research Lab, who studies research integrity policy. His observation that the proposed language around RCR education is too ambiguous is precise and fair. Institutions can host an ‘Academic Integrity Week’ and check a compliance box without delivering anything substantive. Policies that do not specify how education is to be delivered or evaluated leave too much room for performative compliance.

The Pruitt case, cited in the article as a catalyst for some of this reform momentum, is worth naming directly. Jonathan Pruitt was found to have fabricated and falsified data. The case exposed how the 2011 framework’s absence of relevant procedures allowed institutions to deflect rather than investigate. Requiring institutions to act regardless of elapsed time or an individual’s current affiliation is a direct response to that failure.

Postplagiarism, as a framework, asks us to think past the categories we have inherited. The academic integrity arms race that I have discussed in my research applies just as much to research misconduct oversight as it does to student cheating. Detection tools, policies, and procedures are only as good as the institutional will to apply them rigorously. These proposed changes push toward compulsion rather than discretion, which warrants close attention.

The comment period is open until April 17, 2026. If you work in research integrity, this is your chance: read the proposed revisions and submit feedback.

__________

Reposted from: Research Integrity Oversight in Canada: A Postplagiarism Perspective – https://postplagiarism.com/2026/04/11/research-integrity-oversight-in-canada-a-postplagiarism-perspective/


Interfacing with the Future: Reflections on the National Day of Learning 2026

April 1, 2026

On March 28, 2026, I had the pleasure of joining educators from across Canada for the National Day of Learning, hosted by Let’s Talk Science. This one-day, nation-wide professional learning event brought together K–12 teachers, post-secondary educators, and policy leaders to explore some of the most pressing issues shaping education today, with artificial intelligence high on the agenda.

I was invited to deliver a session titled “Interfacing with the Future: Wearable AI and Academic Integrity for K–12 and Higher Ed.” What follows are a few reflections and key ideas from that conversation, hosted by Dr. Alec Couros.

Moving into the Postplagiarism Era

One of the central ideas framing my talk was postplagiarism. In the postplagiarism era, artificial intelligence is no longer an external tool that students occasionally use; rather, it is embedded into everyday life and learning.

Students are already engaging with AI in ways that challenge traditional notions of authorship, originality, and academic work. The question is no longer if students will use AI, but how.

This shift requires a corresponding change in how we think about academic integrity. Detection and surveillance, long relied upon as primary strategies, are no longer sufficient. Instead, we must rethink how we design learning environments that foster integrity from the ground up.

From Tools to Wearables: How AI is Advancing

A key focus of my presentation was the rapid evolution from AI tools to AI wearables — particularly smart glasses and other forms of cosmetically invisible interfaces. The talk was based, in part, on our recent article in Canadian Perspectives on Academic Integrity.

Wearable technologies integrate AI directly into our physical experience of the world. Rather than pulling out a device, users can access real-time information, transcription, and prompts seamlessly through their field of vision.

This shift introduces both opportunities and tensions:

  • Cognitive offloading: Learners can reduce mental load by accessing information instantly. (Phill Dawson has done some great work on cognitive offloading that I recommend reading.)
  • Enhanced presence: Wearables allow users to maintain eye contact and engagement without device distraction.
  • Efficiency gains: Tasks such as note-taking or translation can be automated in real time.

At the same time, these benefits come with real challenges including information overload, privacy concerns, and technical limitations. More importantly for educators, they fundamentally disrupt assumptions about what it means to “know” something independently.

New Technology ≠ Cheating

One of the most important messages I emphasized is this: new technology does not automatically equal academic misconduct.

If a tool is permitted, then its use is not cheating. The real issue lies in unauthorized use, or misuse in ways that create unfair advantage.

We must also remain attentive to equity and accessibility. Some wearable technologies may be used as accommodations, making it essential that our integrity policies are inclusive and nuanced rather than rigid and punitive.

Designing for Integrity (Not Surveillance)

Rather than doubling down on detection, I encourage educators to shift their focus toward designing for integrity.

This means:

  • Prioritizing assessment validity: If an AI system can complete a task without genuine understanding, then the task itself needs to be rethought.
  • Moving beyond “gotcha” approaches: Surveillance-based strategies erode trust and are increasingly ineffective.
  • Supporting diverse learners: Students bring different technological access, needs, and experiences. Our designs must reflect that.
  • Building a culture of integrity: Integrity is not enforced; it is cultivated through meaningful learning experiences.

Bridging K–12 and Post-Secondary Education

Another key theme was the gap between K–12 and post-secondary expectations.

In K–12 environments, students are often encouraged to explore technology as part of their learning. In contrast, post-secondary institutions frequently operate under the assumption that students already understand complex academic integrity rules.

As AI continues to evolve, this gap becomes more pronounced. We need stronger alignment across educational sectors to ensure that students are supported, rather than being set up for failure, as they transition between systems. (Myke Healy has a great paper on the topic of GenAI in the K–12 context that is worth reading.)

Looking Ahead

If there is one takeaway from this experience, it is this: wearable AI is not a future scenario. It is already here.

As educators, we are being called to respond not with fear, but with thoughtful, research-informed approaches. The challenge is not simply to manage technology, but to reimagine teaching, learning, and assessment in ways that remain meaningful in an AI-integrated world.

Events like the National Day of Learning remind me of the power of community. Bringing educators together to share ideas, ask difficult questions, and explore new possibilities is essential as we navigate this rapidly changing landscape.

Thank you to Let’s Talk Science and to Dr. Alec Couros for the opportunity to be part of this important conversation, and to all the educators who continue to lead with curiosity, courage, and care.

______________

Share this post: Interfacing with the Future: Reflections on the National Day of Learning 2026 – https://drsaraheaton.com/2026/04/01/interfacing-with-the-future-reflections-on-the-national-day-of-learning-2026/

Sarah Elaine Eaton, PhD, is a Professor and Research Chair in the Werklund School of Education at the University of Calgary, Canada. Opinions are my own and do not represent those of my employer.