Language Arts · Grade 12

Active learning ideas

Algorithms and Filter Bubbles

Active learning makes abstract concepts like algorithms and filter bubbles tangible by having students experience firsthand how digital systems curate content. Role-playing, audits, and debates transform passive observation into active analysis, helping students see bias in the feeds they use every day.

Ontario Curriculum Expectations · CCSS.ELA-LITERACY.RI.11-12.7 · CCSS.ELA-LITERACY.SL.11-12.3
30–50 min · Pairs → Whole Class · 4 activities

Activity 01

Simulation Game · 35 min · Small Groups

Simulation Game: Algorithm Curation Role-Play

Provide students with a pool of 20 articles on a current issue. In small groups, they act as algorithms: the first round selects articles based on 'user likes,' the second reinforces similarities, and the third narrows the pool further. Groups present their evolving 'feeds' and note how the range of viewpoints shifts.
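For teachers who want to show the mechanism concretely, the narrowing the role-play acts out can be sketched in a few lines of Python. This is a hypothetical illustration, not part of the lesson materials: each article carries a viewpoint label, and an 'algorithm' that favors previously liked viewpoints shrinks the feed's diversity round by round.

```python
VIEWPOINTS = ["left", "right", "centrist", "skeptic", "advocate"]
# A pool of 20 articles cycling through five viewpoints (4 of each).
pool = [{"id": i, "viewpoint": VIEWPOINTS[i % 5]} for i in range(20)]

def curate(pool, liked_viewpoints, feed_size=5):
    """Rank articles so those matching liked viewpoints come first."""
    ranked = sorted(pool, key=lambda a: a["viewpoint"] not in liked_viewpoints)
    return ranked[:feed_size]

liked = set()
for round_no in range(1, 4):
    feed = curate(pool, liked)
    # The user "likes" the top two articles, reinforcing those
    # viewpoints in the next round -- the narrowing mechanism.
    liked.update(a["viewpoint"] for a in feed[:2])
    distinct = {a["viewpoint"] for a in feed}
    print(f"Round {round_no}: {sorted(distinct)}")
```

Run as written, the first round's feed spans all five viewpoints, while rounds two and three collapse to the two viewpoints the user 'liked' first, mirroring what groups should observe in their paper feeds.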

Analyze how filter bubbles limit our exposure to diverse perspectives and impact democratic discourse.

Facilitation Tip: During the Algorithm Curation Role-Play, assign students roles like 'algorithm,' 'user,' and 'content creator' to physically demonstrate how engagement metrics narrow feeds over time.

What to look for: Pose the question: 'To what extent are you responsible for seeking out diverse perspectives when your social media feed is designed to keep you engaged with familiar content?' Facilitate a class debate, asking students to cite specific examples of how algorithms might limit or enable their information seeking.

Apply · Analyze · Evaluate · Create · Social Awareness · Decision-Making

Activity 02

Case Study Analysis · 40 min · Pairs

Feed Audit: Personal Bubble Mapping

Students screenshot their social media feeds and categorize content by perspective (agree, oppose, neutral). In pairs, they map patterns and predict algorithmic influences, then share findings in a whole-class gallery walk.

Assess the extent to which individuals are responsible for the information they consume in an automated environment.

Facilitation Tip: For Feed Audit: Personal Bubble Mapping, model how to categorize content types and timestamps before students work independently, so they are not overwhelmed.

What to look for: Ask students to list three specific types of content they frequently see on a chosen social media platform and then identify one type of content they rarely or never see. Have them briefly hypothesize why the algorithm might be prioritizing the former and excluding the latter.

Analyze · Evaluate · Create · Decision-Making · Self-Management

Activity 03

Formal Debate · 50 min · Whole Class

Formal Debate: Responsibility in Automated Feeds

Divide the class into teams to debate the motion: 'Individuals bear full responsibility for diverse consumption despite algorithms.' Teams prepare evidence from the readings, then debate with timed rebuttals. Conclude with personal action plans.

Assess the extent to which individuals are responsible for the information they consume in an automated environment.

Facilitation Tip: During the Formal Debate: Responsibility in Automated Feeds, provide sentence stems for claims and counterclaims to support students who hesitate to articulate complex ideas.

What to look for: On an index card, have students define 'filter bubble' in their own words and then describe one potential consequence of living within one for democratic discourse. Collect these at the end of class to gauge understanding.

Analyze · Evaluate · Create · Self-Management · Decision-Making

Activity 04

Case Study Analysis · 30 min · Pairs

Speed vs. Depth: Info Transmission Challenge

Pairs receive a fast-spreading tweet or meme. One pair fact-checks it quickly (2 min) while another digs deeply (10 min). Pairs compare accuracy and insights in a group debrief, linking back to the unit's key questions.

Explain how the speed of digital information transmission affects the depth of public understanding.

Facilitation Tip: In the Speed vs. Depth: Info Transmission Challenge, time teams strictly to highlight how urgency reduces analysis, and debrief immediately afterward.

What to look for: During the debrief, ask pairs to compare how their time limits shaped the accuracy and depth of their fact-checks, and to explain what that trade-off suggests about fast-moving information and public understanding.

Analyze · Evaluate · Create · Decision-Making · Self-Management

Templates

Templates that pair with these Language Arts activities

Drop them into your lesson, edit them, and print or share.

A few notes on teaching this unit

Teachers should ground this topic in students' lived experiences by starting with their own feeds before introducing theory. Avoid lecturing about algorithms; instead, use guided discovery so students uncover bias through structured activities. Research suggests that when students confront their own filter bubbles directly, they develop deeper skepticism and stronger analytical habits than through abstract discussion alone.

Successful learning looks like students articulating how algorithms prioritize engagement over neutrality, identifying their own filter bubbles, and debating the ethical responsibilities of automated curation. Evidence includes clear examples from simulations, audits, and debates that link behavior to outcomes in digital spaces.


Watch Out for These Misconceptions

  • During Algorithm Curation Role-Play, some students may assume the simulation reflects neutral curation because it uses 'data.'

    Use the debrief to explicitly contrast the role-play’s narrowing feeds with the claim of neutrality, asking students to point to moments when engagement metrics overrode variety in their group’s process.

  • During Feed Audit: Personal Bubble Mapping, students might blame algorithms for their bubble without examining their own behavior.

    Prompt students to reflect on what types of content they seek, save, or ignore in their audit notes, then discuss how platform design amplifies those habits.

  • During Speed vs. Depth: Info Transmission Challenge, students may believe faster information always improves understanding.

    Use the timed results to highlight how speed reduces verification by asking teams to compare their post-challenge fact-checking strategies with their initial reactions.


Methods used in this brief