Computer Science · 12th Grade · Social Impacts and Professional Ethics · Weeks 37-45

Impact of Social Media and Digital Citizenship

Students analyze the societal impact of social media, focusing on issues like misinformation, online harassment, and digital citizenship.

Standards: CSTA 3B-IC-26 · CSTA 3B-IC-27

About This Topic

Social media has reshaped how information spreads, how communities form, and how political discourse unfolds across the United States. For 12th-grade computer science students, this topic connects technical systems thinking with real-world consequences: algorithms that prioritize engagement over accuracy, platform policies that shape what speech is amplified or suppressed, and the personal responsibilities users carry in networked environments.

Under the CSTA 3B-IC-26 and 3B-IC-27 standards, students are expected to evaluate the impacts of computing on society and advocate for responsible use. That means examining documented cases of misinformation spread during elections, public health crises, and civil unrest, then analyzing how platforms have responded with content moderation, fact-checking partnerships, and algorithmic changes.

Active learning works especially well here because students bring strong personal experience with social media, which creates genuine tension between lived intuition and critical analysis. Structured discussions, case study breakdowns, and collaborative policy drafting push students to move past surface opinions toward evidence-based arguments about digital citizenship at scale.

Key Questions

  1. How does social media influence political discourse and public opinion?
  2. What role should social media platforms play in combating misinformation and hate speech, and how well are they fulfilling it?
  3. What would a workable set of guidelines for responsible digital citizenship in an interconnected world look like?

Learning Objectives

  • Analyze the algorithmic amplification of specific content types on major social media platforms.
  • Evaluate the effectiveness of content moderation policies in mitigating online harassment and misinformation.
  • Critique the ethical responsibilities of social media companies regarding user data and platform integrity.
  • Design a digital citizenship pledge outlining best practices for online communication and information sharing.
  • Synthesize research findings on the psychological impacts of social media use on adolescent populations.

Before You Start

Introduction to Algorithms and Data Structures

Why: Understanding how algorithms function is foundational to analyzing their impact on content amplification and user experience on social media.
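A short in-class sketch can make "algorithms prioritize engagement over accuracy" concrete before the unit begins. The posts, fields, and weights below are invented for classroom illustration; they are not any real platform's ranking formula:

```python
# Toy feed ranking: sort posts purely by predicted engagement.
# All data and weights are hypothetical, chosen to show how
# sensational content can outrank accurate but quieter posts.

posts = [
    {"title": "Local library extends hours", "likes": 40, "shares": 5, "comments": 3},
    {"title": "SHOCKING claim about vaccines!!", "likes": 90, "shares": 70, "comments": 120},
    {"title": "City council meeting recap", "likes": 25, "shares": 2, "comments": 8},
]

def engagement_score(post):
    # Shares and comments are weighted above likes here, on the assumption
    # that reactions which spread or prolong a post drive more engagement.
    return post["likes"] + 3 * post["shares"] + 2 * post["comments"]

ranked = sorted(posts, key=engagement_score, reverse=True)
for post in ranked:
    print(engagement_score(post), post["title"])
```

Students can then ask what changes when the score function rewards accuracy signals instead, which sets up the algorithmic amplification objectives above.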

Ethical Considerations in Computing

Why: Students need a baseline understanding of ethical frameworks to evaluate the responsibilities of technology creators and users.

Key Vocabulary

Algorithmic Bias: Systematic and repeatable errors in a computer system that create unfair outcomes, such as prioritizing sensational content for engagement.
Misinformation: False or inaccurate information that is spread, regardless of intent to deceive. This can include unintentional errors or deliberate falsehoods.
Disinformation: False information deliberately and strategically disseminated to manipulate public opinion, cause harm, or achieve political goals.
Content Moderation: The process by which social media platforms review and manage user-generated content to ensure it complies with community guidelines and legal requirements.
Digital Footprint: The trail of data a user leaves behind when interacting with digital services, including websites visited, emails sent, and social media posts.

Watch Out for These Misconceptions

Common Misconception: Social media platforms are neutral tools; the harm comes entirely from bad users, not the platform itself.

What to Teach Instead

Platforms are not passive pipes. Algorithmic design decisions (what to amplify, what to suppress, how outrage-driven content gets surfaced) are engineering choices with measurable social effects. Active case analysis helps students see the technical layer beneath the social harm.

Common Misconception: Content moderation is a solved problem if you just remove anything harmful.

What to Teach Instead

Defining 'harmful' is contested across legal systems, cultures, and political contexts. Over-moderation silences legitimate speech; under-moderation enables coordinated abuse. Structured debate in class makes this trade-off concrete rather than abstract.

Common Misconception: Digital citizenship is mostly about being nice online.

What to Teach Instead

Digital citizenship at the 12th-grade level includes understanding how attention economies work, how to evaluate source credibility at scale, how to report and escalate harm through platform mechanisms, and how local actions contribute to systemic outcomes.

Active Learning Ideas


Jigsaw: Platform Policy Comparison

Divide students into expert groups, each researching one platform's approach to misinformation and hate speech moderation (e.g., X/Twitter, Meta, YouTube, TikTok). Groups then reassemble in mixed teams to compare policies, identify gaps, and surface trade-offs between free expression and harm reduction.

55 min·Small Groups

Socratic Seminar: Algorithmic Amplification

Students read a short briefing on engagement-optimized recommendation algorithms and then participate in a structured discussion: Does a platform bear moral responsibility for content its algorithm surfaces? The teacher facilitates but does not intervene; students must build on each other's reasoning and cite evidence.

40 min·Whole Class

Think-Pair-Share: Digital Citizenship Scenario Analysis

Present pairs with a specific scenario: a peer sharing unverified health information, a group chat spreading a doctored image, or an anonymous account harassing a classmate. Each pair identifies the technical mechanisms involved, the ethical failure, and a concrete response aligned with digital citizenship principles, then shares with the class.

25 min·Pairs

Design Workshop: Community Digital Citizenship Guidelines

Small groups draft a practical digital citizenship guide for a specific community (a school, a gaming community, a local news comment section). Groups must address misinformation, harassment, privacy, and attribution. Guides are posted and peer-reviewed using a rubric that checks specificity, enforceability, and ethical grounding.

60 min·Small Groups

Real-World Connections

  • Journalists at major news organizations like The New York Times and the Associated Press now employ social media analysts to track the spread of breaking news and identify potential misinformation during critical events such as elections or public health emergencies.
  • Cybersecurity firms, such as Mandiant or CrowdStrike, investigate coordinated disinformation campaigns launched by state-sponsored actors or malicious groups, often tracing their origins back to specific social media platforms and tactics.
  • Public health officials at the Centers for Disease Control and Prevention (CDC) actively monitor social media for emerging health trends and to counter health-related misinformation during outbreaks, like the COVID-19 pandemic.

Assessment Ideas

Discussion Prompt

Pose the following question to students: 'Consider a recent viral news story. How might the algorithms of platforms like TikTok or X (formerly Twitter) have influenced its spread and the public's perception of it? What specific content moderation challenges might have arisen?'

Quick Check

Provide students with a short, anonymized case study of an online harassment incident. Ask them to identify: 1. The specific digital citizenship principles violated. 2. At least two potential actions the platform could have taken. 3. One action the user could have taken to mitigate the situation.

Peer Assessment

Students draft a personal digital citizenship pledge. They then exchange pledges with a partner. Each partner evaluates the pledge based on clarity, comprehensiveness (covering at least three key areas like privacy, respectful communication, and information verification), and feasibility. Partners provide one specific suggestion for improvement.

Frequently Asked Questions

How does social media influence political discourse and public opinion?
Social media platforms shape political discourse through algorithmic curation that prioritizes engagement, which tends to amplify emotionally charged content. Filter bubbles expose users primarily to views they already hold, reducing exposure to counter-arguments. Research has linked heavy social media use with increased political polarization, though the relationship between exposure and belief change is more complex.
What is digital citizenship and why does it matter in high school CS?
Digital citizenship covers the rights and responsibilities of people participating in online environments: evaluating information credibility, protecting personal data, engaging respectfully across difference, and understanding how platforms shape behavior. In 12th-grade CS, it connects abstract ethics standards to systems students use daily, making the stakes concrete.
How do social media platforms try to reduce misinformation?
Platforms use a mix of approaches: automated detection models trained on flagged content, third-party fact-checking partnerships (like PolitiFact or AFP), reduced algorithmic distribution of disputed content, user reporting systems, and account-level restrictions for repeat violators. Each approach has documented limitations and trade-offs between accuracy and free expression.
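One of those approaches, reduced algorithmic distribution, can be sketched for students in a few lines. The penalty multiplier and post data below are hypothetical teaching values, not any platform's actual policy:

```python
# Toy illustration of "reduced distribution": a post flagged as disputed
# by fact-checkers keeps circulating but with its reach scaled down,
# rather than being removed outright. Numbers are invented for discussion.

def distribution_score(base_engagement, disputed, penalty=0.2):
    # A disputed flag shrinks reach instead of deleting the post,
    # one concrete trade-off between moderation and free expression.
    return base_engagement * (penalty if disputed else 1.0)

feed = [
    ("Verified storm warning", 300, False),
    ("Miracle cure goes viral", 900, True),
]
for title, engagement, disputed in feed:
    print(title, distribution_score(engagement, disputed))
```

Note the classroom talking point: the disputed post starts with three times the engagement but ends up ranked below the verified one, without any content being removed.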
How can active learning help students think critically about social media?
Active learning (structured debates, case study analysis, policy drafting) forces students to construct arguments rather than absorb talking points. When students must defend a position, cite evidence, and respond to peer challenges, they develop the critical evaluation habits needed to assess social media claims in real time, outside the classroom.