Civics & Government · 9th Grade

Active learning ideas

The Influence of Social Media in Campaigns

Active learning works for this topic because the abstract mechanics of micro-targeting and algorithmic sorting become concrete when students experience them directly. Simulations and audits force learners to confront the gap between what they assume happens online and how platforms actually operate, building critical digital literacy.

Standards (C3 Framework): D2.Civ.10.9-12 · D2.Civ.7.9-12
40–50 min · Pairs → Whole Class · 4 activities

Activity 01

Simulation · 45 min · Small Groups

Micro-Targeting Simulation

Groups receive four different voter profile cards and design a tailored version of the same campaign message for each audience. Debrief centers on what changed across versions, what remained constant, and what the democratic implications are when different voter groups never see each other's messages. Students identify whether any version makes claims that would be contradicted by another version.
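For teachers who want a concrete way to show what the simulation models, the segmentation logic can be sketched in a few lines of Python. Everything here is invented for illustration (the profiles, the policy, and the message wording are hypothetical); the point is only that one underlying policy yields different framings per audience segment.

```python
# Toy illustration of micro-targeting: the same transit-funding policy is
# framed differently for each voter segment, so no two segments see the
# same wording. Profiles and messages are invented for this sketch.

def tailor_message(profile: dict) -> str:
    """Return a segment-specific framing of one campaign message."""
    if profile["age"] < 30 and profile["residence"] == "urban":
        return "More late-night bus routes for your commute."
    if profile["residence"] == "rural":
        return "Road repairs funded without raising local taxes."
    if profile["homeowner"]:
        return "Transit investment that protects property values."
    return "A transportation plan that works for everyone."

voters = [
    {"age": 24, "residence": "urban", "homeowner": False},
    {"age": 67, "residence": "rural", "homeowner": True},
]
for voter in voters:
    print(tailor_message(voter))
```

Students can be asked the same question the debrief poses: do any two of these framings make promises that could contradict each other?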

Analyze how algorithmic sorting affects the information voters receive.

Facilitation Tip: In the Micro-Targeting Simulation, assign roles to students so they must create tailored messages for specific voter profiles based on real demographic data.

What to look for: Pose the question 'Should social media companies be held responsible for fact-checking political ads they host?' Facilitate a debate where students must cite specific examples of micro-targeting or algorithmic sorting to support their arguments for or against platform responsibility.

Understand · Apply · Analyze · Create · Relationship Skills · Social Awareness

Activity 02

Gallery Walk · 40 min · Small Groups

Algorithm Audit: What Does Your Feed Show?

Students document what political content appears in their own social media feeds over three days, categorizing sources, topics, and emotional tone using a shared coding framework the class develops together. Groups compare their findings and analyze whether their feeds showed similar or sharply divergent political information environments. The comparison itself is the primary analytical tool.
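The shared coding framework can also be demonstrated as a simple tally, for teachers who want to show students what "coding" a feed means in practice. The categories and logged items below are invented placeholders; the class should substitute its own framework.

```python
# Sketch of the audit's coding framework as a tally: each logged feed item
# is coded by source type, topic, and emotional tone, and per-student
# tallies can then be compared side by side. All entries are hypothetical.
from collections import Counter

feed_log = [
    {"source": "news outlet", "topic": "election",  "tone": "neutral"},
    {"source": "friend",      "topic": "election",  "tone": "angry"},
    {"source": "advertiser",  "topic": "candidate", "tone": "hopeful"},
]

tone_counts = Counter(item["tone"] for item in feed_log)
source_counts = Counter(item["source"] for item in feed_log)

print(dict(tone_counts))
print(dict(source_counts))
```

Comparing two students' tallies, rather than any single tally, is what surfaces divergent information environments.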

Analyze whether their own feeds present similar or sharply divergent political information environments.

Facilitation Tip: During the Algorithm Audit, have students screenshot or record their feeds before and after clearing browser history to isolate algorithmic influence.

What to look for: Present students with two hypothetical voter profiles (e.g., a young urban renter, an older rural homeowner). Ask them to write one sentence describing a political message each profile might receive via micro-targeting and explain why that message is tailored to them.

Understand · Apply · Analyze · Create · Relationship Skills · Social Awareness

Activity 03

Formal Debate · 45 min · Whole Class

Formal Debate: Platform Responsibility for Political Ads

Half the class argues that social media platforms should be required to fact-check political ads; the other half argues this creates dangerous censorship risks. Both sides must engage with Section 230 of the Communications Decency Act and First Amendment considerations. Each side must also address the strongest counterargument before offering their rebuttal.

Justify whether social media companies should be responsible for fact-checking political ads.

Facilitation Tip: In the Formal Debate, provide each side with a specific platform policy document to ground arguments in evidence.

What to look for: Ask students to write a short paragraph explaining how the concept of a 'permanent campaign' might lead a politician to prioritize social media engagement over traditional town hall meetings. They should connect this to the idea of micro-targeting.

Analyze · Evaluate · Create · Self-Management · Decision-Making

Activity 04

Case Study Analysis · 50 min · Small Groups

Case Study Analysis: Cambridge Analytica

Small groups analyze the Cambridge Analytica data scandal (2016-2018), tracing how user data was harvested from Facebook, what it was used for in political targeting, what the legal outcome was, and what, if anything, changed in platform data practices afterward. Groups present what they consider the most important unanswered question from the case.

Analyze how harvested voter data enables micro-targeted political messaging.

Facilitation Tip: For the Cambridge Analytica case study, assign small groups to map out the data supply chain from user behavior to targeted messaging.

What to look for: Look for groups that accurately trace the data supply chain from user behavior to targeted messaging, and for presentations that pose a specific, substantive unanswered question rather than a general complaint about privacy.

Analyze · Evaluate · Create · Decision-Making · Self-Management


A few notes on teaching this unit

Teachers should approach this topic by treating students as active participants in media literacy rather than passive recipients of information. Start with the personal (ask students to audit their own feeds), then move to structural analysis of platform policies. Avoid framing social media as purely negative; acknowledge its utility while interrogating its design flaws. Students tend to retain more when they confront their own assumptions directly, so use guided reflection after simulations to connect personal experience to broader democratic implications.

Successful learning looks like students accurately explaining how micro-targeting functions, identifying algorithmic bias in their own feeds, and articulating the political implications of platform design. They should move from general awareness to specific, evidence-based analysis of how social media shapes campaigns.


Watch Out for These Misconceptions

  • During the Micro-Targeting Simulation, some may assume the activity is about generic social media use rather than recognizing it models how platforms segment audiences for political messaging.

Explicitly frame the role-play as a model of how campaigns use voter data to tailor messages for different segments, then have students compare their drafts to identify how content changes by audience.

  • During the Algorithm Audit, students might believe their feeds show a random sample of content rather than recognizing the influence of engagement-driven algorithms.

Ask students to document how their feeds change after clearing cookies and explain why the differences reveal algorithmic prioritization of certain content types.

  • During the Formal Debate, students may assume platforms uniformly fact-check political content and that this is a straightforward technical fix.

Provide students with platform policy timelines (e.g., Meta’s shifting rules on political ads) to show that fact-checking exemptions are policy choices, not technical gaps.
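The second misconception, that feeds are a random or chronological sample, can also be demonstrated directly. The sketch below is a toy model with invented post names and engagement scores; real ranking systems weigh many more signals, but the core contrast between recency and predicted engagement holds.

```python
# Toy model of engagement-driven ranking: the feed sorts by predicted
# engagement instead of recency, so a high-engagement political post
# surfaces first even when it is older. All scores are invented.

posts = [
    {"id": "calm-explainer", "age_hours": 1, "predicted_engagement": 0.2},
    {"id": "outrage-clip",   "age_hours": 9, "predicted_engagement": 0.9},
    {"id": "local-notice",   "age_hours": 3, "predicted_engagement": 0.4},
]

# A chronological feed shows the newest post first.
chronological = sorted(posts, key=lambda p: p["age_hours"])
# An engagement-ranked feed shows the highest-scoring post first.
algorithmic = sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)

print([p["id"] for p in chronological])
print([p["id"] for p in algorithmic])
```

Asking students why "outrage-clip" jumps to the top under the second ordering connects the demo back to the audit activity.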


Methods used in this brief