Canadian & World Studies · Grade 12

Active learning ideas

Technology & Misinformation

Active learning helps students confront misinformation head-on by making its mechanics visible. When students analyze viral stories, simulate algorithmic influence, or examine their own echo chambers, they move beyond abstract warnings to hands-on evidence of how digital systems shape belief.

Ontario Curriculum Expectations: ON — Ideas, Ideologies, and Culture (Grade 12); ON — Global Issues and Challenges (Grade 12)
25–75 min · Pairs → Whole Class · 3 activities

Activity 01

Inquiry Circle · 50 min · Small Groups

Inquiry Circle: The Anatomy of a Viral Story

Small groups trace the origin and spread of a specific piece of 'fake news' or a viral political claim. They identify the techniques used to make it believable and the platforms that helped it spread, presenting their findings as a 'Misinformation Map.'

Analyze how social media is changing democratic processes and civic engagement.

Facilitation Tip: During 'The Anatomy of a Viral Story,' assign each group a different viral post to dissect, ensuring coverage of diverse topics and platforms to highlight common misinformation tactics.

What to look for: Pose the question: 'How can an individual actively resist the formation of an echo chamber online?' Facilitate a class discussion where students share strategies and provide examples of how they have encountered or avoided echo chambers in their own online experiences.

Analyze · Evaluate · Create · Self-Management · Self-Awareness

Activity 02

Simulation Game · 75 min · Whole Class

Simulation Game: Regulating the Platforms

Students represent different stakeholders (tech CEOs, government regulators, civil liberties advocates, and journalists). They must negotiate a set of rules for how social media companies should handle hate speech and political advertising.

Explain the threats that misinformation and disinformation pose to democratic societies.

Facilitation Tip: In 'Regulating the Platforms,' set clear time limits for the simulation to force students to prioritize trade-offs between free speech and harm reduction.

What to look for: Present students with three short online articles or social media posts: one clearly factual, one containing misinformation, and one containing disinformation. Ask them to identify which is which and write one sentence explaining their reasoning for each, citing specific indicators of credibility or deception.

Apply · Analyze · Evaluate · Create · Social Awareness · Decision-Making

Activity 03

Think-Pair-Share · 25 min · Pairs

Think-Pair-Share: The 'Echo Chamber' Effect

Students analyze their own social media feeds or search results. They discuss with a partner how algorithms might be limiting the diversity of viewpoints they see and what strategies they can use to 'break out' of their digital bubble.

Design strategies for citizens to become critical consumers of digital information.

Facilitation Tip: For 'The Echo Chamber Effect,' have students silently note their own online habits first, then discuss to reveal personal blind spots in their media consumption.

What to look for: Ask students to write down one specific action they will take in the next week to be a more critical consumer of digital information. Examples could include fact-checking a shared article before forwarding it, or seeking out diverse news sources on a particular topic.

Understand · Apply · Analyze · Self-Awareness · Relationship Skills

A few notes on teaching this unit

Teachers should avoid lecturing about misinformation and instead use structured activities that reveal its mechanisms. Begin with low-stakes analysis before moving to simulations, so students first see how algorithms and user behavior interact. Do not assume students already know how to detect deception; model the process explicitly, using examples they recognize.

Students will demonstrate critical digital literacy by identifying misinformation strategies, explaining algorithmic bias, and proposing solutions to reduce echo chambers. They will apply these skills in real-time discussions and simulations, not just recall facts about the topic.


Watch Out for These Misconceptions

  • During 'The Anatomy of a Viral Story,' students might claim misinformation only affects people with less education.

    Use the group’s analysis of the viral post to highlight how emotional triggers, confirmation bias, and algorithmic amplification affect everyone, regardless of background.

  • During 'Regulating the Platforms,' students may argue social media companies are neutral platforms.

    Refer to the simulation’s algorithm cards to show how engagement metrics and ad revenue shape content distribution, not neutrality.


Methods used in this brief