
Technology & Misinformation: Activities & Teaching Strategies

Active learning helps students confront misinformation head-on by making its mechanics visible. When students analyze viral stories, simulate algorithmic influence, or examine their own echo chambers, they move beyond abstract warnings to hands-on evidence of how digital systems shape belief.

Grade 12 · Canadian & World Studies · 3 activities · 25–75 min

Learning Objectives

  1. Analyze the impact of social media algorithms on the formation of political opinions and civic engagement.
  2. Evaluate the credibility of digital news sources and social media content using established fact-checking methodologies.
  3. Explain the distinction between misinformation and disinformation and their respective effects on democratic processes.
  4. Design a personal digital literacy action plan to mitigate the influence of online propaganda and echo chambers.
  5. Compare and contrast the effectiveness of different strategies for combating the spread of false information online.


50 min·Small Groups

Inquiry Circle: The Anatomy of a Viral Story

Small groups trace the origin and spread of a specific piece of 'fake news' or a viral political claim. They identify the techniques used to make it believable and the platforms that helped it spread, presenting their findings as a 'Misinformation Map.'

Prepare & details

Analyze how social media is changing democratic processes and civic engagement.

Facilitation Tip: During 'The Anatomy of a Viral Story,' assign each group a different viral post to dissect, ensuring coverage of diverse topics and platforms to highlight common misinformation tactics.

Setup: Groups at tables with access to source materials

Materials: Source material collection, Inquiry cycle worksheet, Question generation protocol, Findings presentation template

Analyze · Evaluate · Create · Self-Management · Self-Awareness
75 min·Whole Class

Simulation Game: Regulating the Platforms

Students represent different stakeholders (tech CEOs, government regulators, civil liberties advocates, and journalists). They must negotiate a set of rules for how social media companies should handle hate speech and political advertising.

Prepare & details

Explain the threats that misinformation and disinformation pose to democratic societies.

Facilitation Tip: In 'Regulating the Platforms,' set clear time limits for the simulation to force students to prioritize trade-offs between free speech and harm reduction.

Setup: Flexible space for group stations

Materials: Role cards with goals/resources, Game currency or tokens, Round tracker

Apply · Analyze · Evaluate · Create · Social Awareness · Decision-Making
25 min·Pairs

Think-Pair-Share: The 'Echo Chamber' Effect

Students analyze their own social media feeds or search results. They discuss with a partner how algorithms might be limiting the diversity of viewpoints they see and what strategies they can use to 'break out' of their digital bubble.

Prepare & details

Design strategies for citizens to become critical consumers of digital information.

Facilitation Tip: For the 'Echo Chamber' Effect, have students silently note their own online habits first, then discuss to reveal personal blind spots in their media consumption.

Setup: Standard classroom seating; students turn to a neighbor

Materials: Discussion prompt (projected or printed), Optional: recording sheet for pairs

Understand · Apply · Analyze · Self-Awareness · Relationship Skills

Teaching This Topic

Teachers should avoid lecturing about misinformation and instead use structured activities that reveal its mechanisms. Begin with low-stakes analysis before moving to simulations, as students need to see how algorithms and behavior interact. Avoid assuming students already know how to detect deception; model the process explicitly with examples they recognize.

What to Expect

Students will demonstrate critical digital literacy by identifying misinformation strategies, explaining algorithmic bias, and proposing solutions to reduce echo chambers. They will apply these skills in real-time discussions and simulations, not just recall facts about the topic.


Watch Out for These Misconceptions

Common Misconception: During 'The Anatomy of a Viral Story,' students might claim misinformation only affects people with less education.

What to Teach Instead

Use the group’s analysis of the viral post to highlight how emotional triggers, confirmation bias, and algorithmic amplification affect everyone, regardless of background.

Common Misconception: During 'Regulating the Platforms,' students may argue social media companies are neutral platforms.

What to Teach Instead

Refer to the simulation's role cards to show how engagement metrics and ad revenue shape content distribution, not neutrality.

Assessment Ideas

Discussion Prompt

After the 'Echo Chamber' Effect, facilitate a discussion where students share strategies for resisting echo chambers, citing examples from their own online experiences.

Quick Check

During 'The Anatomy of a Viral Story,' present students with three posts (fact, misinformation, disinformation) and ask them to identify each while writing one sentence explaining their reasoning, using specific credibility indicators.

Exit Ticket

After 'Regulating the Platforms,' ask students to write one specific action they will take within a week to consume digital information more critically, such as fact-checking an article before sharing.

Extensions & Scaffolding

  • Challenge students to create a counter-narrative to a viral misinformation post they encountered, using reliable sources and persuasive design principles.
  • For students struggling with bias awareness, provide a checklist of common misinformation tactics to reference during their analysis.
  • Deeper exploration: Have students research a recent policy debate about social media regulation and present how algorithms influenced the outcome.

Key Vocabulary

Disinformation: False information deliberately created and spread to deceive or manipulate a target audience, often for political or financial gain.
Misinformation: False or inaccurate information, regardless of intent to deceive. It can be spread unintentionally.
Algorithmic Bias: Systematic and repeatable errors in a computer system that create unfair outcomes, such as prioritizing certain types of content or user interactions.
Echo Chamber: A metaphorical description of a situation where information, ideas, or beliefs are amplified or reinforced by communication and repetition inside a defined system, often isolating individuals from differing viewpoints.
Digital Citizenship: The responsible, ethical, and safe use of technology and digital platforms, including understanding rights and responsibilities online.
