
Propaganda in the Digital Age: Activities & Teaching Strategies

Active learning helps 9th graders grasp propaganda’s digital mechanics by having them analyze real examples rather than just hear lectures. When students break down viral posts themselves, they see firsthand how emotions and algorithms shape what they read and share.

9th Grade · English Language Arts · 3 activities · 20–50 min

Learning Objectives

  1. Analyze how specific digital platform features, such as algorithms and infinite scroll, amplify persuasive messaging.
  2. Compare the reach and speed of modern digital propaganda to historical examples using case studies.
  3. Evaluate the credibility of digital persuasive content by identifying common propaganda techniques and logical fallacies.
  4. Predict the potential impact of AI-generated content on the future landscape of persuasive messaging.

Want a complete lesson plan with these objectives? Generate a Mission

50 min · Small Groups

Inquiry Circle: Anatomy of a Viral Post

Provide small groups with three real or carefully constructed examples of viral social media posts: one factually accurate, one misleading, and one outright false. Groups apply the SIFT method (Stop, Investigate, Find better coverage, Trace claims) to each, then present their verification process and findings to the class. The debrief focuses on what made each post initially convincing.

Prepare & details

How has the internet changed the speed and reach of persuasive messaging?

Facilitation Tip: During Inquiry Circle: Anatomy of a Viral Post, assign each group a different post type (meme, news headline, TikTok) so they compare how format influences persuasion.

Setup: Groups at tables with access to source materials

Materials: Source material collection, Inquiry cycle worksheet, Question generation protocol, Findings presentation template

Analyze · Evaluate · Create · Self-Management · Self-Awareness

20 min · Pairs

Think-Pair-Share: Before You Share

Students individually recall a time they almost shared something online that turned out to be misleading, or observed someone else share false information. Pairs discuss what made the content seem credible and what stopped them (or didn't stop the other person). The class builds a shared list of red flags that should slow down a share decision.

Prepare & details

Critique the effectiveness of modern digital propaganda compared to historical examples.

Facilitation Tip: For Think-Pair-Share: Before You Share, have students record their initial reactions privately before discussing to reduce social pressure that might override critical thinking.

Setup: Standard classroom seating; students turn to a neighbor

Materials: Discussion prompt (projected or printed), Optional: recording sheet for pairs

Understand · Apply · Analyze · Self-Awareness · Relationship Skills

30 min · Whole Class

Whole Class Discussion: Platform Design and Amplification

Present data on how emotionally charged or outrage-inducing content tends to spread faster on social platforms. Facilitate a discussion about where responsibility lies: with the individual poster, the platform, the algorithm designers, or the audience. Students must cite specific evidence and reasoning rather than stating unsupported opinions, modeling the argumentative skills developed throughout the unit.

Prepare & details

Predict the future impact of AI on the creation and dissemination of persuasive content.

Facilitation Tip: In Whole Class Discussion: Platform Design and Amplification, ask students to bring examples of platform features that encourage sharing to ground the conversation in observable evidence.

Setup: Whole-class seating arranged for discussion; data projected or printed

Materials: Case study packet (3-5 pages), Analysis framework worksheet, Presentation template

Analyze · Evaluate · Create · Decision-Making · Self-Management

Teaching This Topic

Teach this topic by modeling skepticism with your own sharing habits: pause before reposting, fact-check aloud, and admit when you're unsure. This normalizes critical evaluation as a daily practice rather than a classroom exercise. Avoid presenting AI deepfakes as the only problem; focus on how ordinary content is weaponized through context stripping and emotional triggers.

What to Expect

Students will identify propaganda techniques in digital content, explain how platform design amplifies messages, and apply critical evaluation strategies before sharing. They should move from recognizing propaganda to actively resisting its spread.

These activities are a starting point. A full mission is the experience.

  • Complete facilitation script with teacher dialogue
  • Printable student materials, ready for class
  • Differentiation strategies for every learner
Generate a Mission

Watch Out for These Misconceptions

Common Misconception: During Inquiry Circle: Anatomy of a Viral Post, some students will assume that misinformation is always created by bad actors trying to deliberately deceive.

What to Teach Instead

In this activity, direct students to look for evidence of good-faith sharing by examining the comments section for users who clearly believe the claim without malice, then discuss how platform incentives encourage such sharing even without deception.

Common Misconception: During Think-Pair-Share: Before You Share, students may argue that if something has been shared by millions, it must be credible.

What to Teach Instead

Use this activity to have students test the claim by checking engagement metrics on the original post versus fact-checking posts; ask them to calculate the ratio of shares to corrections to reveal virality’s disconnect from accuracy.

Common Misconception: During Whole Class Discussion: Platform Design and Amplification, students often overestimate AI deepfakes as the primary misinformation threat.

What to Teach Instead

Bring real examples of out-of-context images or sensational headlines from their own social feeds to show that most propaganda uses ordinary media manipulated through cropping, captions, or selective framing rather than synthetic media.

Assessment Ideas

Discussion Prompt

After Whole Class Discussion: Platform Design and Amplification, ask students to compare TikTok’s ‘For You’ page algorithm to television commercials using specific examples from their research, then facilitate a discussion where they cite design features that amplify persuasive messages.

Quick Check

During Inquiry Circle: Anatomy of a Viral Post, provide two contrasting digital texts and ask students to identify at least two propaganda techniques in the persuasive example and explain why the digital format (e.g., shareability, comment sections) increases its impact compared to print.

Exit Ticket

After Think-Pair-Share: Before You Share, ask students to write one way AI could create more effective propaganda in the future and one strategy a digital user could use to critically evaluate such content, then collect responses to identify patterns in their reasoning.

Extensions & Scaffolding

  • Challenge students to create a counter-post that debunks one of the viral examples they analyzed, using the same emotional appeal but factual corrections.
  • Scaffolding: Provide a checklist of specific techniques (e.g., loaded language, false dichotomy) for students to highlight in their posts during Inquiry Circle: Anatomy of a Viral Post.
  • Deeper exploration: Invite a school librarian or local journalist to share how they verify sources, then have students compare their classroom strategies to professional methods.

Key Vocabulary

Algorithmic amplification: The process by which platform algorithms promote content, often based on engagement metrics, which can increase the visibility of persuasive messages or propaganda.
Disinformation: False information deliberately created and spread to deceive or mislead an audience, often for political or financial gain.
Virality: The tendency of content to be rapidly spread and widely shared across the internet, often through social media platforms.
Filter bubble: A state of intellectual isolation that can result from personalized searches and algorithmic filtering, where a user is only exposed to information that confirms their existing beliefs.
Deepfake: A type of synthetic media in which a person in an existing image or video is replaced with someone else's likeness, often created using AI.
