Embracing AI as a Teaching Tool: Practical Approaches for the Post-plagiarism Classroom

March 23, 2025

Artificial intelligence (AI) has moved from a futuristic concept to an everyday reality. Rather than viewing AI tools like ChatGPT as threats to academic integrity, forward-thinking educators are discovering their potential as powerful teaching instruments. Here’s how you can meaningfully incorporate AI into your classroom while promoting critical thinking and ethical technology use.

Making AI Visible in the Learning Process

One of the most effective approaches to teaching with AI is to bring it into the open. When we demystify these tools, students develop a more nuanced understanding of their capabilities and limitations.

Start by dedicating class time to explore AI tools together. You might begin with a demonstration of how ChatGPT or similar tools respond to different types of prompts. Ask students to compare the quality of responses when the tool is asked to:

  • Summarize factual information
  • Analyze a complex concept
  • Solve a problem in your discipline
An infographic titled "Postplagiarism Teaching Tip by Sarah Elaine Eaton: Make AI Visible in the Learning Process," illustrating the three tasks above: summarizing factual information (understanding basic facts and data handling), analyzing complex concepts (critical thinking and deep analysis), and solving discipline-specific problems (problem-solving in specific subjects). Licensed CC BY-NC.

Have students identify where the AI excels and where it falls short. Hands-on experience supervised by an educator helps students understand that while AI can be impressive and capable, it has clear boundaries and weaknesses.

From AI Drafts to Critical Analysis

AI tools can quickly generate content that serves as a starting point for deeper learning. Here is a step-by-step approach for using AI-generated drafts as teaching material:

  1. Assignment Preparation: Choose a topic relevant to your course and generate a draft response using an AI tool such as ChatGPT.
  2. Collaborative Analysis: Share the AI-generated draft with students and facilitate a discussion about its strengths and weaknesses. Prompt students with questions such as:
    • What perspectives are missing from this response?
    • How could the structure be improved?
    • What claims require additional evidence?
    • How might we make this content more engaging or relevant?

The idea is to bring students into conversations about AI, building their critical thinking as they puzzle through the strengths and weaknesses of current AI tools.

  3. Revision Workshop: Have students work individually or in groups to revise the AI draft into a more nuanced, complete response. This process teaches students that the value lies not in generating initial content (which AI can do) but in refining, expanding, and critically evaluating information (which requires human judgment).
  4. Reflection: Ask students to document what they learned through the revision process. What gaps did they identify in the AI's understanding? How did their human perspective enhance the work? Building in metacognitive awareness is one of the skills that assessment experts such as Bearman and Luckin (2020) emphasize in their work.

This approach shifts the educational focus from content creation to content evaluation and refinement—skills that will remain valuable regardless of technological advancement.

Teaching Fact-Checking Through Deliberate Errors

AI systems often present information confidently, even when that information is incorrect or fabricated. This characteristic makes AI-generated content perfect for teaching fact-checking skills.

Try this classroom activity:

  1. Generate Content with Errors: Use an AI tool to create content in your subject area, either by requesting information you know contains errors or by asking about obscure topics where the AI might fabricate details.
  2. Fact-Finding Mission: Provide this content to students with the explicit instruction to identify potential errors and verify information. You might structure this as:
    • Individual verification of specific claims
    • Small group investigation with different sections assigned to each group
    • A whole-class collaborative fact-checking document
  3. Source Evaluation: Have students document not just whether information is correct, but how they determined its accuracy. This reinforces the importance of consulting authoritative sources and cross-referencing information.
  4. Meta-Discussion: Use this opportunity to discuss why AI systems make these kinds of errors. Topics might include:
    • How large language models are trained
    • The concept of ‘hallucination’ in AI
    • The difference between pattern recognition and understanding
    • Why AI might present incorrect information with high confidence

These activities teach students not just to be skeptical of AI outputs but to develop systematic approaches to information verification—an essential skill in our information-saturated world.

Case Studies in AI Ethics

Ethical considerations around AI use should be explicit rather than implicit in education. Develop case studies that prompt students to engage with real ethical dilemmas:

  1. Attribution Discussions: Present scenarios where students must decide how to properly attribute AI contributions to their work. For example, if an AI helps to brainstorm ideas or provides an outline that a student substantially revises, how could this be acknowledged?
  2. Equity Considerations: Explore cases highlighting AI’s accessibility implications. Who benefits from these tools? Who might be disadvantaged? How might different cultural perspectives be underrepresented in AI outputs?
  3. Professional Standards: Discuss how different fields are developing guidelines for AI use. Medical students might examine how AI diagnostic tools should be used alongside human expertise, while creative writing students could debate the role of AI in authorship.
  4. Decision-Making Frameworks: Help students develop personal guidelines for when and how to use AI tools. What types of tasks might benefit from AI assistance? Where is independent human work essential?

These discussions help students develop thoughtful approaches to technology use that will serve them well beyond the classroom.

Implementation Tips for Educators

As you incorporate these approaches into your teaching, consider these practical suggestions:

  • Start small with one AI-focused activity before expanding to broader integration
  • Be transparent with students about your own learning curve with these technologies
  • Update your syllabus to clearly outline expectations for appropriate AI use
  • Document successes and challenges to refine your approach over time
  • Share experiences with colleagues to build institutional knowledge

Moving Beyond the AI Panic

The concept of postplagiarism does not mean abandoning academic integrity—rather, it calls for reimagining how we teach integrity in a technologically integrated world. By bringing AI tools directly into our teaching practices, we help students develop the critical thinking, evaluation skills, and ethical awareness needed to use these technologies responsibly.

When we shift our focus from preventing AI use to teaching with and about AI, we prepare students not just for academic success, but for thoughtful engagement with technology throughout their lives and careers.

References

Bearman, M., & Luckin, R. (2020). Preparing university assessment for a world with AI: Tasks for human intelligence. In M. Bearman, P. Dawson, R. Ajjawi, J. Tai, & D. Boud (Eds.), Re-imagining University Assessment in a Digital World (pp. 49-63). Springer International Publishing. https://doi.org/10.1007/978-3-030-41956-1_5 

Eaton, S. E. (2023). Postplagiarism: Transdisciplinary ethics and integrity in the age of artificial intelligence and neurotechnology. International Journal for Educational Integrity, 19(1), 1-10. https://doi.org/10.1007/s40979-023-00144-1

Edwards, B. (2023, April 6). Why ChatGPT and Bing Chat are so good at making things up. Ars Technica. https://arstechnica.com/information-technology/2023/04/why-ai-chatbots-are-the-ultimate-bs-machines-and-how-people-hope-to-fix-them/

________________________

Share this post: Embracing AI as a Teaching Tool: Practical Approaches for the Postplagiarism Classroom – https://drsaraheaton.com/2025/03/23/embracing-ai-as-a-teaching-tool-practical-approaches-for-the-post-plagiarism-classroom/

This blog has had over 3.7 million views thanks to readers like you. If you enjoyed this post, please ‘Like’ it using the button below or share it on social media. Thanks!

Sarah Elaine Eaton, PhD, is a Professor and Research Chair in the Werklund School of Education at the University of Calgary, Canada. Opinions are my own and do not represent those of my employer.


Neuralink’s Clinical Trials in Canada

January 11, 2025

Last month, CBC’s Geoff Leo published a great article called ‘No consequences’ for violating human rights in privately funded research in Canada. It was a bit of an eye-opener, even for me.

He writes that, “Roughly 85 per cent of clinical trials in Canada are privately funded” and that research undergoes very little scrutiny by anyone.

One of the cases Geoff wrote about involved a study that ran from 2014 to 2016 in which Indigenous children in Saskatchewan, aged 12 to 15, had their brainwaves monitored. Student participants were recruited with the help of a Canadian school board.

The study was led by James Hardt, who runs something called the Biocybernaut Institute, a privately run business. According to Leo, James Hardt claims that “brainwave training can make participants smarter, happier and enable them to overcome trauma. He said it can also allow them to levitate, walk on water and visit angels.”

Geoff Leo digs deep into some of the ethical issues and I recommend reading his article.

So, that was last month. This month, I happened to notice that, according to Elon Musk’s Neuralink website, the company has now been approved by Health Canada to recruit research participants. There’s a bright purple banner at the top of the Neuralink home page, showing a Canadian flag, that says, “We’ve received approval from Health Canada to begin recruitment for our first clinical trial in Canada”.

A screenshot of the Neuralink.com home page. On the bottom right is a blurred photo of a man wearing a ball cap, who appears to be in a wheelchair and using tubes as medical assistance. There is white text on the right-hand side. At the top is a purple banner with white text and a small Canadian flag.

When you click on the link, you get to another page that shows the flags for the US, Canada, and the UK, where clinical trials are either underway or planned, it seems.

A screenshot of a webpage from the Neuralink web site. It has a white background with black text. In the upper left-hand corner there are three small flags, one each for the USA, Canada, and the UK.

The Canadian version is called CAN-PRIME. There’s a promotional YouTube recruitment video for patients interested in joining “this revolutionary journey”.

According to the website, “This study involves placing a small, cosmetically invisible implant in a part of the brain that plans movements. The device is designed to interpret a person’s neural activity, so they can operate a computer or smartphone by simply intending to move – no wires or physical movement are required.”

A screenshot from the Neuralink web page. The background is grey with black text.

So, just to connect the dots here… ten years ago in Canada there was a study involving neurotechnology that “exploited the hell out of” Indigenous kids, according to Janice Parente, who leads the Human Research Standards Organization.

Now we have Elon Musk’s company actively recruiting people from across Canada, the US, and the UK, for research that would involve implanting experimental technology into people’s brains without, it seems, much research ethics oversight at all.

What could possibly go wrong?

Reference

Leo, G. (2024, December 2). ‘No consequences’ for violating human rights in privately funded research in Canada, says ethics expert. https://www.cbc.ca/news/canada/saskatchewan/ethics-research-canada-privately-funded-1.7393063

________________________

Share this post: Neuralink’s Clinical Trials in Canada – https://drsaraheaton.com/2025/01/11/neuralinks-clinical-trials-in-canada/



The GenAI Gender Gap

January 10, 2025

There is a gender gap when it comes to GenAI.

Just 26.3% of the European Union’s artificial intelligence (AI) professionals are women, according to a report from LinkedIn.

In my work with UNESCO’s Women for Ethical AI (W4EAI) platform, we had similar findings in our gender outlook study.

An AI-generated image of a group of women.

There are no easy solutions to this gap, but for those working in this area, here are some concrete things you can do to promote gender inclusion (and equity in general):

  • Invite women into leadership roles and into strategic planning for artificial intelligence and advanced technology.
  • Ensure that policies explicitly include women, girls, and other equity-deserving groups.
  • Invite women (and in particular, early career women and those who are precariously employed) to share and showcase their expertise and knowledge (and compensate them for their contributions).
  • Create formal sponsorship programs for women and girls who want to develop their knowledge and competencies related to AI, with ongoing opportunities for learning and skill development.

There are myriad ethical complexities when it comes to artificial intelligence, and gender is only one of them. Acknowledging inequalities and then working to support equity, fairness, and justice will remain ongoing work in the years to come.

References

LinkedIn. (2024). AI in the EU: 2024 trends and insights from LinkedIn. https://economicgraph.linkedin.com/content/dam/me/economicgraph/en-us/PDF/AI-in-the-EU-Report.pdf

United Nations Educational Scientific and Cultural Organization (UNESCO). (2024). UNESCO Women for Ethical AI: Outlook study on artificial intelligence and gender. https://unesdoc.unesco.org/ark:/48223/pf0000391719

________________________

Share this post: The GenAI Gender Gap – https://drsaraheaton.com/2025/01/10/the-genai-gender-gap/



Upcoming Talk: From Plagiarism to Postplagiarism: Navigating the GenAI Revolution in Higher Education

January 3, 2025
A promo announcement on a white background. There is a red stripe down the left-hand side. The University of Calgary logo appears on the top right. The following text is written in black, orange, and red:
From Plagiarism to Postplagiarism: Navigating the GenAI Revolution in Higher Education
The first 2025 public presentation about #Postplagiarism
is now open for registration!

Free and open to the public.
Join us in person or via webinar.
January 29, 2025 | 12:00 – 13:00 Mountain time

https://workrooms.ucalgary.ca/event/3854045

Join us for our first presentation of 2025:

From Plagiarism to Postplagiarism: Navigating the GenAI Revolution in Higher Education

Format: Hybrid (in person or live stream)

I am delighted to kick off a speaker series on GenAI hosted by my colleague, Dr. Soroush Sabbaghan, through the Centre for Artificial Intelligence Ethics, Literacy, and Integrity (CAIELI) at the University of Calgary.

Description

Generative AI (GenAI) is transforming teaching, learning, and assessment in higher education.

Learn to integrate GenAI effectively while maintaining academic integrity and enhancing student agency.

Dr. Sarah Eaton shares innovative strategies that promote critical thinking and original scholarship. Explore how GenAI reshapes academic practices and discover proactive approaches to leverage its potential.

This session equips educators, administrators, and policymakers to lead purposefully in a dynamic academic landscape.

Speaker bio

Sarah Elaine Eaton is a Professor and Research Chair at the Werklund School of Education at the University of Calgary (Canada). She is an award-winning educator, researcher, and leader. She leads transdisciplinary research teams focused on the ethical implications of advanced technology use in educational contexts. Dr. Eaton also holds a concurrent appointment as an Honorary Associate Professor at Deakin University, Australia.

More Details

Date: January 29, 2025

Time: 12:00 – 13:00 Mountain time

This talk is free and open to the public, but there are only 20 seats available to join us in person! We can also accommodate folks online.

Get more details and register here.

________________________

Share this post: Upcoming talk: From Plagiarism to Postplagiarism: Navigating the GenAI Revolution in Higher Education – https://drsaraheaton.com/2025/01/03/upcoming-talk-from-plagiarism-to-postplagiarism-navigating-the-genai-revolution-in-higher-education/



Accreditation and Certification Fraud in IT

December 27, 2024

Many people in the academic integrity world are already familiar with contract cheating websites: those in the business of buying and selling bespoke term papers, theses, and exam questions and answers.

But this business isn’t limited to the millions of K-12 and post-secondary students. The exam fraud business is alive and well in professional accreditation, serving folks who want either to bypass a formal university degree or to supplement existing credentials.

For example, in the IT industry credentials are often the golden key to new opportunities. Certifications and accreditations (allegedly) validate technical skills, offering (so-called) proof that a candidate has the expertise needed for a professional role. 

Taking supply and demand into account, with job candidates in high supply and well-paying jobs in high demand, certification and accreditation fraud is alive and well in the IT industry, as well as in other industries. This is a trend employers cannot afford to ignore.

Exposing the Fraud: How Buying and Selling of Certification Exam Questions Works

Certification fraud occurs when individuals falsify credentials, purchase counterfeit certifications, or misuse legitimate certifications obtained by others. But there’s this sneaky grey area that exists when a person actually sits a professional exam themselves, but they’ve prepared by buying the exam questions and/or the answers from an online vendor.

I won’t name specific companies that do this in this post, because I’m not in the habit of advertising for these fraudsters, but I want to show you how they work, so here are some screenshots:

Screenshot #1: Home page

A website screenshot. Black background, with text in white and blue. Some of the text is quoted in the narrative that follows.

At the top of this website, the company claims that 94% of the exam questions they sold were “almost the same” as the real exam questions, and that 97% of customers passed the exam using their materials. (Who knows what happened to the other 3%…?) Finally, 98% of customers found the “study guides” effective and helpful.

There’s that phrase that we commonly see on contract cheating websites, “study guide”. For the uninitiated, this is a euphemism for “exam questions”. 

Screenshot #2: Saying it like it is: Not affiliated with any certification provider

A website screenshot. White background with black text.

In this screenshot the company states plainly that they are not affiliated or certified by any certification provider. Reading between the lines, the message is ‘caveat emptor’ or ‘buyer beware’. They are telling you upfront that they are in the business of selling exam questions and make no guarantees about their products.

Screenshot #3: Samples of accreditation exam questions for sale

A website screenshot. White background. Black task bar with light grey text. There are lists of texts written in blue.

Look at all the options: you can buy exam questions for certifications offered by Dell, Citrix, Adobe, Amazon, and Google, as well as English language proficiency exams, just to name a few.

Screenshot #4: More samples of certification provider exam questions for sale.

A website screenshot. White background. Black task bar with light grey text. There are lists of texts written in blue.

But wait! There’s more! You can buy exam questions for certifications offered by Oracle, IBM, SAP, and others.

Assessment Security

Sites like this compromise the assessment security of certification exams that are meant to qualify individuals to do a particular job. If this term is new to you, I recommend Professor Phill Dawson’s book, Defending Assessment Security in a Digital World. For a quick (and free) overview of Phill’s work, this slide deck from one of his presentations is worth checking out.

Businesses that buy and sell certification exam questions engage in fraudulent practices that undermine trust in the certification system and create significant risks for employers.

Consequences for Employers

Hiring someone with counterfeit credentials can have dire consequences. Unqualified employees may lack the technical skills to handle complex tasks, leading to project delays, costly errors, or even security breaches. Beyond the financial impact, fraudulent certifications can erode team morale, as employees with genuine qualifications may feel undervalued when working alongside those who faked their way in.

What Employers and Hiring Managers Can Do

Employers, and especially hiring managers and those working in HR, must take proactive steps to safeguard their hiring processes. Some of you may be asking whether this kind of practice is actually illegal. I’m not a lawyer, but what I can say is that although contract cheating for students is illegal in countries like Australia, the UK, and Ireland, if you’re not a student, then you might get to live in a proverbial grey zone. To the best of my knowledge, it is not actually illegal to buy and sell questions for professional certification exams in most countries of the world.

So, what can employers do? First, trust but verify! Verifying certifications directly with issuing organizations is one step. Many certification bodies offer easy online verification tools to confirm a candidate’s credentials. Additionally, employers should stay informed about recognized accreditation standards and avoid unverified institutions. 

Having said this, verification of credentials and certification won’t help if someone has bought exam questions online and then taken the test themselves. Their results could be ‘verifiable’ in a sense, because there’s an assumption that a person who has passed an exam had the knowledge to do so. But when someone buys the exam questions before sitting the test, they have prepared for that particular exam without necessarily internalizing the knowledge or skills that should match the certification they receive. An exam is one measure of knowledge, but it isn’t the only one.

It can also help to have prospective employees demonstrate their skills and respond to technical questions that could only be answered by someone with the knowledge to back up their documentation. One possibility is to give an interviewee a real-world scenario that could happen at your organization and ask them how they would go about solving it. If they struggle or stumble, it could be a sign that they lack the necessary skills for the job. (It could also be a sign that they’re just nervous or that interviewing isn’t their strength. So let me also make a plug here for having an inclusive and equitable interviewing process.)

Investing in robust, inclusive, and equitable hiring practices not only protects an organization from the pitfalls of fraud but also helps to create a culture of accountability and excellence. By placing a premium on authentic certifications combined with demonstrable knowledge and skills and inclusive hiring practices, employers signal their commitment to integrity and ensure they are building a team of qualified professionals.

Bottom line: If you’re hiring someone who says they have an IT certification based on taking exams, it’s worth it to find out if they actually have the knowledge and skills to do the job. 

And this is just one example of one site. Rest assured that it is not the only one out there. Exam cheating companies like this one don’t exist in isolation. They’re in the game to make money, and lots of it. 

In an industry where skills and knowledge drive success, vigilance against certification and accreditation fraud is not optional—it is a driver of success.

Future Outlook

Fraud and corruption are alive and well in both education and industry. There is a growing community of sleuths, scholars, and activists who are ready to sniff out fraud and expose it, and naïveté about these matters is quickly going out of fashion.

There may have been a time when it was acceptable—or even fashionable—to clutch your pearls, proclaim moral outrage, or just refuse to accept that educational and professional fraud are more commonplace than you might have previously thought. GenAI is here to stay, and so are companies whose business is educational, accreditation, scientific, and professional fraud. These companies are profitable because they have customers willing to pay for their goods and services.

Vigilance, sleuthing, and exposing fraud are very much on trend as we move ahead into the new year. And if you’re a hiring manager, taking steps to protect the integrity of your operations is definitely part of the job in 2025 and beyond.

References and Further Reading

Carmichael, J. (2023, June 7). Understanding Fake Degrees and Credential Fraud in Higher Ed. The Evollution: A Modern Campus Illumination. https://evolllution.com/programming/credentials/understanding-fake-degrees-and-credential-fraud-in-higher-ed/

Eaton, S. E., & Carmichael, J. (2022). The ecosystem of commercial academic fraud. University of Calgary. https://dx.doi.org/10.11575/PRISM/40330

Eaton, S. E., Carmichael, J., & Pethrick, H. (Eds.). (2023). Fake degrees and credential fraud in higher education. Springer. https://doi.org/10.1007/978-3-031-21796-8

________________________

Share this post: Accreditation and Certification Fraud in IT – https://drsaraheaton.com/2024/12/27/accreditation-and-certification-fraud-in-the-it-world/
