Impacts of Computing and Emerging Tech · Semester 2

Social Media and Information Integrity

Analyzing the impact of algorithms on public discourse, filter bubbles, and misinformation.


Key Questions

  1. How do recommendation engines shape our perception of reality?
  2. What role should tech companies play in moderating online content?
  3. How does the attention economy affect the mental health of users?

MOE Syllabus Outcomes

MOE: Impacts of Computing and Emerging Tech - JC1
Level: JC 1
Subject: Computing
Unit: Impacts of Computing and Emerging Tech
Period: Semester 2

About This Topic

Social Media and Information Integrity focuses on how algorithms shape online experiences, creating filter bubbles that limit diverse viewpoints and amplify misinformation. JC 1 students analyze recommendation engines, which prioritize content based on past interactions to maximize engagement in the attention economy. This leads to distorted perceptions of reality, polarization in public discourse, and risks to mental health from constant exposure to extreme content. Key questions probe how these systems influence beliefs, the moderation duties of tech companies, and broader societal impacts.

Within the MOE Computing curriculum's Impacts of Computing and Emerging Tech unit, this topic builds critical evaluation skills alongside ethical awareness. Students apply concepts to local contexts, such as Singapore's general elections or COVID-19 myths, connecting algorithmic choices to real-world consequences like trust erosion in institutions.

Active learning suits this topic well. Students gain deeper insight through simulations of recommendation systems, collaborative fact-checking of viral posts, and debates on platform responsibilities. These methods make invisible algorithms tangible, encourage peer challenge of biases, and foster habits of digital discernment.

Learning Objectives

  • Analyze how algorithmic personalization on social media platforms contributes to the formation of filter bubbles.
  • Evaluate the ethical responsibilities of technology companies in moderating online content and combating misinformation.
  • Critique the impact of the attention economy on user mental health and public discourse.
  • Compare the effectiveness of different digital literacy strategies in identifying and refuting online misinformation.
  • Synthesize findings on algorithmic bias to propose design improvements for more equitable information dissemination.

Before You Start

Introduction to Digital Citizenship

Why: Students need a foundational understanding of responsible online behavior and critical evaluation of digital information before analyzing complex issues like algorithmic impact.

Basic Concepts of Data and Information

Why: Understanding how data is collected and processed is essential for grasping how algorithms function and personalize content.

Key Vocabulary

Algorithm: A set of rules or instructions followed by a computer to solve a problem or perform a task, often used by social media to curate content.
Filter Bubble: A state of intellectual isolation that can result from personalized searches and content feeds, where users are primarily exposed to information that confirms their existing beliefs.
Misinformation: False or inaccurate information, spread with or without intent to deceive; deliberately deceptive falsehoods are often termed disinformation.
Attention Economy: A business model that treats human attention as a scarce commodity, optimizing platforms to capture and retain user engagement.
Echo Chamber: A metaphorical description of a situation where information, ideas, or beliefs are amplified or reinforced by communication and repetition inside a defined system, often leading to a closed-off perspective.


Real-World Connections

Journalists at The Straits Times use fact-checking tools and media literacy frameworks to verify viral news stories circulating on platforms like Facebook and WhatsApp, especially during election periods.

Public health officials in Singapore analyze social media trends to identify and counter health-related misinformation, such as myths about vaccine efficacy spread during the COVID-19 pandemic.

UX designers at tech companies like Google and Meta grapple with designing recommendation systems that balance user engagement with the potential for creating filter bubbles and spreading harmful content.

Watch Out for These Misconceptions

Common Misconception: Algorithms deliver neutral, objective content based on facts.

What to Teach Instead

Algorithms rank by predicted engagement, favoring sensationalism over accuracy. Hands-on simulations let students tweak rules and watch biases emerge, clarifying how 'relevance' scores distort feeds. Peer reviews reinforce this shift from assumption to evidence.
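The simulation described above can be sketched in a few lines of Python. The post titles, scores, and the scoring rule are all invented for illustration, not any real platform's formula; the point is that a ranker optimizing predicted engagement surfaces sensational, low-accuracy posts first even though it never "sees" accuracy at all.

```python
# Toy feed ranker for a classroom simulation (all data and weights are
# hypothetical). Each post carries an accuracy score and a sensationalism
# score; the ranker only predicts engagement, which here is assumed to rise
# mainly with sensationalism.

posts = [
    {"title": "Ministry publishes annual transport statistics", "accuracy": 0.95, "sensationalism": 0.10},
    {"title": "You won't BELIEVE what this app does to your brain", "accuracy": 0.40, "sensationalism": 0.90},
    {"title": "Fact-check: viral vaccine claim is false", "accuracy": 0.98, "sensationalism": 0.30},
    {"title": "SHOCKING secret the platforms are hiding!", "accuracy": 0.15, "sensationalism": 0.95},
]

def predicted_engagement(post):
    """Toy model: engagement driven mostly by sensationalism, not accuracy."""
    return 0.8 * post["sensationalism"] + 0.2 * post["accuracy"]

# Rank the feed purely by predicted engagement, as an engagement-maximizing
# recommender would.
feed = sorted(posts, key=predicted_engagement, reverse=True)

for post in feed:
    print(f"{predicted_engagement(post):.2f}  {post['title']}")
```

Students can change the 0.8/0.2 weighting and watch accurate reporting rise or sink in the feed, turning the "relevance is neutral" assumption into something they can test directly.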

Common Misconception: Filter bubbles only trap unaware users; informed people escape them.

What to Teach Instead

Everyone experiences personalized curation that narrows worldviews. Personal audits reveal hidden patterns, while group shares highlight collective blind spots. This builds empathy and motivates proactive feed diversification.

Common Misconception: Misinformation spreads solely due to malicious posters.

What to Teach Instead

Algorithms amplify it through virality mechanics. Tracing real cascades in class challenges shows neutral systems boosting falsehoods. Collaborative analysis helps students prioritize platform design over individual blame.

Assessment Ideas

Discussion Prompt

Pose the question: 'If a social media platform removes a piece of content that some users find valuable but others find harmful, what are the ethical trade-offs?' Facilitate a class debate, asking students to cite specific examples of content moderation decisions and their consequences.

Quick Check

Present students with a hypothetical social media feed designed to create a filter bubble. Ask them to identify three specific types of content that are likely missing from the feed and explain why their absence might be problematic.

Exit Ticket

Students write down one strategy they can personally use to break out of a filter bubble and one question they would ask a social media platform's CEO about content moderation policies.


Frequently Asked Questions

How do recommendation algorithms create filter bubbles?
Recommendation algorithms analyze user likes, views, and shares to suggest similar content, gradually narrowing exposure to agreeing viewpoints. Over time this forms echo chambers where opposing ideas rarely appear, reinforcing existing biases. In Singapore, these effects intensified during election periods, when targeted ads narrowed the space for nuanced policy discussion.
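This narrowing feedback loop can be shown with a minimal simulation, assuming a toy recommender that keeps one weight per topic and reinforces whatever the user clicks. The topics, click probabilities, and reinforcement rule are all invented for illustration:

```python
import random

random.seed(42)  # fixed seed so the classroom run is reproducible

TOPICS = ["politics", "sports", "science", "arts", "health"]

# Hypothetical user preference: this user clicks "politics" posts far more often.
CLICK_PROB = {"politics": 0.9, "sports": 0.2, "science": 0.2, "arts": 0.1, "health": 0.2}

# The recommender keeps one weight per topic and samples each post
# proportionally to those weights -- clicks raise a topic's weight, so past
# behaviour shapes future exposure.
weights = {t: 1.0 for t in TOPICS}

def serve_feed(n_posts=20):
    shown = []
    for _ in range(n_posts):
        topic = random.choices(TOPICS, weights=[weights[t] for t in TOPICS])[0]
        shown.append(topic)
        if random.random() < CLICK_PROB[topic]:
            weights[topic] += 1.0  # reinforce the clicked topic
    return shown

early = serve_feed()
for _ in range(10):  # let the feedback loop run for a while
    serve_feed()
late = serve_feed()

print("early batch:", len(set(early)), "distinct topics,", early.count("politics"), "politics posts")
print("late batch: ", len(set(late)), "distinct topics,", late.count("politics"), "politics posts")
```

Comparing the early and late batches makes the bubble visible: the recommender never "decides" to exclude anything, yet the user's own clicks steadily crowd other topics out of the feed.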
What is the attention economy and its risks?
The attention economy treats user time as a scarce commodity: platforms compete for views with addictive design features such as infinite scroll. This boosts revenue but can harm mental health through anxiety from outrage-driven content and reduced focus. Students can connect this to rising youth screen-addiction statistics reported in local surveys.
What role should tech companies play in content moderation?
Tech companies must balance free expression with harm prevention by using AI flags, human reviewers, and transparency reports. In Singapore, platforms comply with POFMA directives for fake news. Debates help students weigh over-moderation risks against unchecked misinformation's societal costs.
How can active learning improve understanding of social media integrity?
Active learning engages students through simulations where they build mock algorithms, revealing engagement biases firsthand. Fact-checking relays and debates on moderation build verification skills and ethical judgment. These approaches make abstract concepts personal, boost retention via collaboration, and equip students to navigate platforms critically in daily life.