
Social Media and Information Integrity: Activities & Teaching Strategies

Active learning works for this topic because algorithms operate invisibly but shape what students see every day. Hands-on simulations and debates make abstract processes concrete, helping learners transfer these concepts to their own feeds and choices.

JC 1·Computing·4 activities·35–50 min

Learning Objectives

  1. Analyze how algorithmic personalization on social media platforms contributes to the formation of filter bubbles.
  2. Evaluate the ethical responsibilities of technology companies in moderating online content and combating misinformation.
  3. Critique the impact of the attention economy on user mental health and public discourse.
  4. Compare the effectiveness of different digital literacy strategies in identifying and refuting online misinformation.
  5. Synthesize findings on algorithmic bias to propose design improvements for more equitable information dissemination.

Want a complete lesson plan with these objectives? Generate a Mission

45 min·Small Groups

Simulation Game: Algorithm Feed Creator

Small groups receive sample user data and news articles, then define 3-5 rules for a recommendation algorithm focused on engagement. Apply rules to generate personalized feeds and compare results across groups to spot filter bubbles. Conclude with a class chart of emerging biases.
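For groups that want to express their rules precisely, a minimal sketch of an engagement-focused scorer (hypothetical data and weights, assuming Python is used in this Computing class) might look like this:

    # Hypothetical sketch: rank sample articles for one user with simple
    # engagement-focused rules like those groups might invent.
    articles = [
        {"title": "Local election results", "topic": "politics", "clicks": 120, "shares": 10},
        {"title": "Celebrity feud explodes", "topic": "entertainment", "clicks": 900, "shares": 300},
        {"title": "New climate report released", "topic": "science", "clicks": 60, "shares": 5},
    ]
    user = {"liked_topics": {"entertainment", "sports"}}

    # Example rules: weight raw engagement and boost topics the user already likes.
    W_CLICKS, W_SHARES, W_MATCH = 1.0, 3.0, 200.0

    def score(article):
        s = W_CLICKS * article["clicks"] + W_SHARES * article["shares"]
        if article["topic"] in user["liked_topics"]:
            s += W_MATCH  # personalization nudges the feed toward familiar topics
        return s

    feed = sorted(articles, key=score, reverse=True)
    for a in feed:
        print(round(score(a)), a["title"])

Comparing feeds produced by groups that chose different weights makes the filter-bubble effect concrete before the class charts emerging biases.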

Prepare & details

How do recommendation engines shape our perception of reality?

Facilitation Tip: During Algorithm Feed Creator, circulate with a checklist to ensure every group tests at least one extreme rule change, not just minor tweaks.

Setup: Flexible space for group stations

Materials: Role cards with goals/resources, Game currency or tokens, Round tracker

Apply · Analyze · Evaluate · Create · Social Awareness · Decision-Making
50 min·Pairs

Formal Debate: Moderation Responsibilities

Pairs research arguments for and against tech companies aggressively moderating content. Present in a structured debate format with rebuttals, then vote and reflect on how rules affect free speech versus integrity. Teacher facilitates with real platform policy examples.

Prepare & details

What role should tech companies play in moderating online content?

Facilitation Tip: Before the Moderation Responsibilities debate, assign clear roles so students must represent multiple stakeholder perspectives, not just their own.

Setup: Two teams facing each other, audience seating for the rest

Materials: Debate proposition card, Research brief for each side, Judging rubric for audience, Timer

Analyze · Evaluate · Create · Self-Management · Decision-Making
35 min·Small Groups

Fact-Check Relay: Misinformation Hunt

Teams of four work relay-race style: one member finds a viral post, the next verifies its sources, the third assesses its algorithmic boost potential, and the fourth summarizes the risks. Rotate roles twice and share findings in a whole-class gallery walk.

Prepare & details

How does the attention economy affect the mental health of users?

Facilitation Tip: For the Misinformation Hunt, require students to document both the false claim and the specific algorithmic mechanic that spread it.

Setup: Wall space or tables arranged around room perimeter

Materials: Large paper/poster boards, Markers, Sticky notes for feedback

Understand · Apply · Analyze · Create · Relationship Skills · Social Awareness
40 min·Individual

Audit: Personal Bubble Breaker

Individuals log a week's social media feed, categorize content themes, and identify gaps. In pairs, students swap audits to suggest diverse follows, then discuss links to mental health as a whole class.
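If students keep their weekly log as a spreadsheet export, a short tally script (a sketch assuming a hypothetical feed_log.csv with a "theme" column) can make the skew in a feed measurable before the pair swap:

    # Hypothetical sketch: count how often each theme appears in a week's feed log.
    # Assumes a CSV named feed_log.csv with columns: date, source, theme
    import csv
    from collections import Counter

    counts = Counter()
    with open("feed_log.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            counts[row["theme"].strip().lower()] += 1

    total = sum(counts.values())
    for theme, n in counts.most_common():
        print(f"{theme:<20} {n:>4}  ({n / total:.0%})")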

Prepare & details

How do recommendation engines shape our perception of reality?

Facilitation Tip: In the Personal Bubble Breaker audit, ask students to include screenshots and timestamps to make their filter bubbles visible and discussable.

Setup: Wall space or tables arranged around room perimeter

Materials: Large paper/poster boards, Markers, Sticky notes for feedback

Understand · Apply · Analyze · Create · Relationship Skills · Social Awareness

Teaching This Topic

Teach this topic by starting with students' own experiences, then layering technical concepts. Avoid lectures about algorithms until learners have felt their effects firsthand. Research shows that students retain algorithm literacy best when they design, break, and audit systems before studying their mechanics.

What to Expect

Successful learning looks like students articulating how recommendation systems prioritize engagement over truth and proposing platform changes that balance diversity with safety. They should critique their own feeds and defend moderation decisions with evidence.

These activities are a starting point. A full mission is the experience.

  • Complete facilitation script with teacher dialogue
  • Printable student materials, ready for class
  • Differentiation strategies for every learner
Generate a Mission

Watch Out for These Misconceptions

Common Misconception

During Algorithm Feed Creator, some students may assume the algorithm ranks content purely by recency or popularity.

What to Teach Instead

During Algorithm Feed Creator, direct students to set the engagement weight in their rules to zero and observe how the feed still surfaces skewed content, showing that relevance ranking is not neutral.
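One way to make this visible, assuming the same kind of rule-based scorer groups built earlier (hypothetical numbers), is to rank the feed twice, once with the engagement weight on and once at zero, and show that a topic-match rule alone still narrows the results:

    # Hypothetical sketch: even with engagement weighted at zero, a relevance rule
    # that favours topics the user already follows still narrows the feed.
    articles = [
        {"title": "Budget debate recap", "topic": "politics", "engagement": 950},
        {"title": "Team wins weekend derby", "topic": "sports", "engagement": 400},
        {"title": "Vaccine study replicated", "topic": "science", "engagement": 80},
    ]
    user_topics = {"sports"}

    def score(article, w_engagement):
        relevance = 1.0 if article["topic"] in user_topics else 0.0
        return w_engagement * article["engagement"] + 100.0 * relevance

    for w in (1.0, 0.0):  # engagement weight on, then set to zero
        ranked = sorted(articles, key=lambda a: score(a, w), reverse=True)
        print(f"engagement weight = {w}:", [a["title"] for a in ranked])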

Common Misconception

During Personal Bubble Breaker, students might believe their feeds reflect their own choices alone.

What to Teach Instead

During Personal Bubble Breaker, ask students to compare their audit results with peers who share interests but live in different regions, revealing how algorithms curate for location and demographics.

Common Misconception

During Misinformation Hunt, students may think false claims spread only because people share them intentionally.

What to Teach Instead

During Misinformation Hunt, have students trace how the same claim gets pushed to different feeds by the algorithm before any user shares it, showing amplification beyond intent.

Assessment Ideas

Discussion Prompt

After Moderation Responsibilities, pose the question: 'If a social media platform removes a piece of content that some users find valuable but others find harmful, what are the ethical trade-offs?' Use the debate transcript to assess whether students cite specific moderation decisions and their consequences.

Quick Check

During Algorithm Feed Creator, present students with a hypothetical feed designed to create a filter bubble. Ask them to identify three specific types of content missing from the feed and explain why their absence might be problematic, then collect responses to assess understanding of algorithmic narrowing.

Exit Ticket

After Personal Bubble Breaker, ask students to write down one strategy they can use to break out of a filter bubble and one question they would ask a social media platform's CEO about content moderation policies, to assess personalization of learning and critical questioning.

Extensions & Scaffolding

  • Extension: Challenge students who finish early to redesign the algorithm’s engagement rules to reduce polarization while maintaining user satisfaction, then compare designs in a gallery walk.
  • Scaffolding: Provide pre-filtered feeds for students who struggle with the Personal Bubble Breaker to practice identifying missing viewpoints before auditing their own.
  • Deeper exploration: Invite a local journalist or fact-checker to discuss how misinformation targets specific communities, connecting algorithmic amplification to real-world harm.

Key Vocabulary

Algorithm: A set of rules or instructions followed by a computer to solve a problem or perform a task, often used by social media to curate content.
Filter Bubble: A state of intellectual isolation that can result from personalized searches and content feeds, where users are primarily exposed to information that confirms their existing beliefs.
Misinformation: False or inaccurate information, whether or not it is deliberately intended to deceive or mislead.
Attention Economy: A business model that treats human attention as a scarce commodity, optimizing platforms to capture and retain user engagement.
Echo Chamber: A situation in which information, ideas, or beliefs are amplified or reinforced by communication and repetition inside a defined system, often leading to a closed-off perspective.

Ready to teach Social Media and Information Integrity?

Generate a full mission with everything you need

Generate a Mission