English Language Arts · 9th Grade · The Art of Persuasion and Rhetoric · Weeks 1-9

Propaganda in the Digital Age

Analyzing how the internet and social media have changed the speed, reach, and forms of persuasive messaging and propaganda.

Common Core State Standards: CCSS.ELA-LITERACY.RI.9-10.7, CCSS.ELA-LITERACY.SL.9-10.2

About This Topic

The internet and social media did not invent propaganda, but they have radically changed how it spreads. Algorithms amplify emotionally charged content; producing persuasive media requires no professional training; and the blurring of personal and institutional sources makes origins difficult to verify. For 9th grade ELA students, this topic extends historical propaganda analysis to the contemporary information environment, building the media literacy skills called for in CCSS.ELA-LITERACY.RI.9-10.7 and SL.9-10.2.

A central concept is the role of platform design in propaganda's reach. The features that make social media engaging (infinite scroll, emotional reaction buttons, and algorithmic recommendation) also amplify content that triggers strong emotional responses, which is exactly what effective propaganda is designed to do. Students who understand this structural dynamic can evaluate content more carefully and make more intentional choices about what they share and amplify.

This topic works especially well with collaborative research and structured discussion because the examples are current and varied. Student experience is a genuine analytical resource here: many 9th graders have encountered viral misinformation firsthand, and structuring that experience into analytical frameworks is more productive in a group setting than in isolation.

Key Questions

  1. How has the internet changed the speed and reach of persuasive messaging?
  2. How effective is modern digital propaganda compared to historical examples?
  3. How might AI change the creation and dissemination of persuasive content in the future?

Learning Objectives

  • Analyze how specific digital platform features, such as algorithms and infinite scroll, amplify persuasive messaging.
  • Compare the reach and speed of modern digital propaganda to historical examples using case studies.
  • Evaluate the credibility of digital persuasive content by identifying common propaganda techniques and logical fallacies.
  • Predict the potential impact of AI-generated content on the future landscape of persuasive messaging.

Before You Start

Identifying Persuasive Techniques

Why: Students need foundational knowledge of rhetorical appeals (ethos, pathos, logos) and common persuasive strategies to analyze their digital manifestations.

Evaluating Source Credibility

Why: Understanding how to assess the reliability and bias of information sources is crucial before analyzing the complexities of digital propaganda.

Key Vocabulary

Algorithmic amplification: The process by which platform algorithms promote content, often based on engagement metrics, which can increase the visibility of persuasive messages or propaganda.
Disinformation: False information deliberately created and spread to deceive or mislead an audience, often for political or financial gain.
Virality: The tendency of content to be rapidly spread and widely shared across the internet, often through social media platforms.
Filter bubble: A state of intellectual isolation that can result from personalized searches and algorithmic filtering, where a user is only exposed to information that confirms their existing beliefs.
Deepfake: A type of synthetic media in which a person in an existing image or video is replaced with someone else's likeness, often created using AI.

Watch Out for These Misconceptions

Common Misconception: Misinformation is always created by bad actors trying to deliberately deceive.

What to Teach Instead

Research shows that a significant portion of online misinformation spreads because real people share content they genuinely believe is true without checking. The structural incentives of social platforms (speed, emotional reward, and social approval) encourage sharing before verifying. Framing misinformation as a systemic problem rather than purely a moral failure helps students address their own habits without becoming defensive.

Common Misconception: If something has been shared by millions of people, it must be credible or at least partially true.

What to Teach Instead

Virality measures engagement, not accuracy. Emotionally provocative content spreads faster regardless of its factual content, and corrective information typically receives far less engagement than the original false claim. Students who conflate reach with reliability are especially vulnerable to bandwagon propaganda techniques in digital environments.

Common Misconception: AI deepfakes and synthetic media are the primary misinformation threat students need to worry about.

What to Teach Instead

While AI-generated synthetic media is a growing concern, research consistently shows that most effective misinformation involves real images and videos taken out of context, or plausible-sounding but unverified text. The more ordinary and prevalent threat is easier to address with basic source-checking habits, which are also more immediately teachable in a 9th grade classroom.

Real-World Connections

  • Political campaigns use targeted social media advertising, leveraging platform data to deliver persuasive messages directly to specific demographics and influence voter behavior during elections such as US presidential races.
  • Public health organizations, such as the CDC, must combat health misinformation that spreads rapidly online, using social media to disseminate accurate information about vaccines or disease outbreaks to counter false narratives.
  • Journalists and fact-checkers at organizations like the Associated Press or Reuters work to verify information circulating on platforms like X (formerly Twitter) and TikTok, identifying and debunking propaganda to maintain an informed public.

Assessment Ideas

Discussion Prompt

Pose the question: "How do the design features of TikTok, like its 'For You' page algorithm, contribute to the spread of persuasive messages or propaganda compared to older media like television commercials?" Facilitate a class discussion where students cite specific examples.

Quick Check

Provide students with two short, contrasting digital texts or images, one clearly persuasive and one neutral. Ask them to identify at least two specific techniques used in the persuasive example and explain why the digital format (e.g., shareability, comment section) might make it more impactful than a print equivalent.

Exit Ticket

Ask students to write down one way AI could be used to create more effective propaganda in the future and one strategy a digital user could employ to critically evaluate such content.

Frequently Asked Questions

How has social media changed the spread of propaganda and misinformation?
Social media reduces the cost of producing and distributing persuasive content to near zero, removes gatekeepers like editors and fact-checkers, and provides micro-targeting tools that allow messages to be tailored to specific psychological profiles. It also creates feedback loops where users are served more of what they already agree with, making propaganda more effective at reinforcing existing beliefs than at persuading people who hold different views.
What is a filter bubble and why does it matter for understanding propaganda?
A filter bubble is the algorithmically curated information environment that forms when platforms show users content they are most likely to engage with. Over time, users primarily encounter information confirming their existing views, which makes them more receptive to propaganda aligned with those views and more skeptical of information that challenges them, regardless of which is more accurate. Recognizing this structural feature is part of being a critical consumer of digital media.
What are signs that something I see online might be propaganda or misinformation?
Ask: Who made this, and do they have a clear agenda? Does this content trigger a strong emotional reaction that makes me want to share immediately? Does it present one side without acknowledging any complexity? Can I find the same claim verified by a credible independent source? If a piece of content resists answering these questions clearly, that is a reliable signal to slow down and verify before sharing.
How does active learning help students analyze digital propaganda?
Students encounter digital misinformation in their actual lives, which means their personal experience is a genuine analytical resource. Structured collaborative activities like SIFT exercises and before-you-share discussions turn lived experience into transferable verification skills. When students compare their verification processes in a group, they develop lateral reading habits and source-checking instincts that individual instruction rarely produces as reliably or as durably.
