Citizenship · Year 9

Active learning ideas

Social Media and Algorithms

Active learning turns abstract algorithmic processes into tangible experiences students can critique and shape. By building, debating, and auditing feeds, students move from passive consumption to active design, which strengthens understanding of how bias enters systems and affects real lives.

National Curriculum Attainment Targets: KS3 Citizenship - The Role of the Media; KS3 Citizenship - Information and Communication
30–50 min · Pairs → Whole Class · 4 activities

Activity 01

Simulation Game · 45 min · Pairs

Simulation Game: Build Your Own Algorithm

Students in pairs input sample user data into a simple flowchart template to generate personalised news feeds from a shared pool of articles. They swap feeds with another pair and discuss how choices lead to echo chambers. Conclude with a class vote on biased outcomes.
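For teachers who want a concrete model of what the flowchart is doing, the pair activity can be sketched in a few lines of Python. This is a hedged illustration only: the article pool, topic labels, and click counts are invented sample data, not part of the lesson materials, and the single rule (rank by past clicks) is deliberately simplistic so the echo-chamber effect is easy to see.

```python
# Illustrative sketch of the simulation: a shared pool of articles is ranked
# for one user purely by how often that user clicked each topic before.
# All data below is invented sample data for demonstration.

articles = [
    {"title": "Climate protest grows", "topic": "environment"},
    {"title": "New phone released", "topic": "technology"},
    {"title": "Election debate recap", "topic": "politics"},
    {"title": "Local team wins cup", "topic": "sport"},
]

# Sample user data: past clicks per topic (the "input" students provide).
user_clicks = {"environment": 9, "technology": 4, "politics": 1, "sport": 0}

def build_feed(articles, clicks):
    """Sort articles so topics the user already clicks on come first --
    the same rule that produces echo chambers in the pair discussion."""
    return sorted(articles, key=lambda a: clicks.get(a["topic"], 0), reverse=True)

for article in build_feed(articles, user_clicks):
    print(article["title"])
```

Swapping in a different `user_clicks` dictionary and comparing the resulting orderings mirrors the feed-swap step of the activity: identical articles, very different feeds.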

Analyse the government's role in regulating the algorithms of private tech giants.

Facilitation Tip: During the simulation, circulate with sticky notes labelled ‘engagement’ and ‘fairness’ to help students notice when their algorithm favours one goal over the other.

What to look for: Pose the question: 'Imagine you are a policymaker. What are the top three challenges in creating fair regulations for social media algorithms that protect free speech while combating misinformation?' Allow students to discuss in small groups before sharing key points with the class.

Apply · Analyse · Evaluate · Create · Social Awareness · Decision-Making

Activity 02

Debate Carousel · 50 min · Small Groups

Debate Carousel: Regulation vs Freedom

Divide the class into small groups representing stakeholders such as government, tech firms, and users. Groups rotate to defend or challenge positions on algorithm regulation using prepared evidence cards. Each rotation ends with a 2-minute synthesis statement.

Evaluate whether social media enhances or diminishes the quality of democratic debate.

Facilitation Tip: For the debate carousel, place a large sheet of paper at each station with a blank T-chart so students record points for and against regulation as others rotate.

What to look for: Present students with two hypothetical social media feeds, one designed to create an echo chamber and another to promote diverse viewpoints. Ask students to identify 2–3 specific algorithmic choices that would lead to each feed and explain their reasoning.

Understand · Analyse · Create · Self-Awareness · Self-Management

Activity 03

Policy Design Workshop · 40 min · Small Groups

Policy Design Workshop: Misinformation Rules

In small groups, students review real social media posts flagged for misinformation and draft a 5-point policy with enforcement steps. Groups pitch to the class, which votes and refines the best ideas into a class charter.

Design a just policy to combat the spread of digital misinformation on social platforms.

Facilitation Tip: In the policy workshop, provide a ‘constraints bank’ on cards (e.g., ‘must preserve free speech’, ‘must reduce misinformation’) to guide realistic design choices.

What to look for: On an index card, have students write one specific example of how an algorithm might influence a user's political opinion. Then ask them to suggest one action a user could take to counteract this influence.

Understand · Analyse · Create · Self-Awareness · Self-Management

Activity 04

Feed Analysis · 30 min · Individual

Feed Analysis: Personal Audit

Individually, students screenshot their social media feed, categorise content by viewpoint, and calculate a diversity score. They then share anonymised results in a whole-class discussion to reveal patterns.
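The audit does not specify how the diversity score is calculated, so one possible scoring rule, which a teacher could compute for students or turn into a spreadsheet formula, is normalised Shannon entropy over the viewpoint categories. The category labels and tallies below are invented examples; any consistent categorisation scheme would work the same way.

```python
from collections import Counter
import math

# Hypothetical tallies from one student's audit: one entry per feed post,
# labelled with an illustrative viewpoint category.
feed_viewpoints = ["left", "left", "left", "centre", "left", "right", "left"]

def diversity_score(viewpoints):
    """Normalised Shannon entropy of the viewpoint mix:
    0.0 = every post shares one viewpoint (a pure echo chamber),
    1.0 = posts are spread evenly across all categories seen."""
    counts = Counter(viewpoints)
    total = len(viewpoints)
    if len(counts) < 2:
        return 0.0  # a single category carries no diversity
    entropy = -sum((n / total) * math.log2(n / total) for n in counts.values())
    return entropy / math.log2(len(counts))

print(round(diversity_score(feed_viewpoints), 2))
```

A feed split evenly between two viewpoints scores 1.0, while the lopsided sample above scores well below that, which gives students a single comparable number for the whole-class discussion.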

Analyse the government's role in regulating the algorithms of private tech giants.

Facilitation Tip: For the feed analysis, give students a two-column template with prompts like ‘What did the algorithm predict I’d like?’ and ‘What did I actually engage with?’ to structure their audit.

What to look for: Pose the question: 'Imagine you are a policymaker. What are the top three challenges in creating fair regulations for social media algorithms that protect free speech while combating misinformation?' Allow students to discuss in small groups before sharing key points with the class.

Understand · Analyse · Create · Self-Awareness · Self-Management

A few notes on teaching this unit

Start with concrete, personal data—students’ own feeds—so they see algorithms as real forces, not abstract concepts. Use low-stakes simulations to make bias visible before asking them to critique or regulate it. Research shows that when students design systems themselves, they grasp limitations more deeply and avoid simplistic ‘good vs bad’ judgements.

Students will demonstrate they can identify algorithmic bias, explain its societal effects, and propose practical solutions. Success shows in precise language during debates, thoughtful policy designs, and clear audit reflections that link personal experiences to broader democratic concerns.


Watch Out for These Misconceptions

  • During Build Your Own Algorithm, students may believe their model treats all content fairly if they include a ‘balance’ slider.

    During Build Your Own Algorithm, circulate and ask groups to explain why their ‘balance’ slider actually favours familiar content, using the engagement scores taped to each post as evidence.

  • During Feed Analysis, students may claim their personal feed shows no bias because they chose to follow particular accounts.

    During Feed Analysis, have students compare their completed templates in pairs and circle any moments where the algorithm predicted their interests before they engaged, prompting reflection on passive acceptance of bias.

  • During Regulation vs Freedom, students may argue social media always improves democracy by giving everyone a voice.

    During Regulation vs Freedom, pause the carousel after the first round and ask students to note examples of polarising or false content on their T-charts, then require each new speaker to address one of those examples before adding new points.


Methods used in this brief