English Language · JC 1 · AI Governance and Algorithmic Accountability · Semester 1

Surveillance Capitalism and the Ethics of Data Commodification

Learning about digital citizenship, including online safety, privacy, and respectful communication in digital spaces.

MOE Syllabus Outcomes: Media Literacy (Middle School)

About This Topic

Surveillance capitalism names the economic system in which tech giants extract and commodify personal data to predict and shape human behavior for profit. In JC1 English Language, students dissect this through Shoshana Zuboff's framework, evaluating how platforms like Google and Meta operate beyond traditional markets. Key questions probe whether this logic demands new regulations, whether informed consent truly safeguards users, and whether data should be treated as a commodity, a civil right, or a public good. These align with MOE media literacy standards, honing skills in critical analysis and ethical argumentation.

This topic integrates digital citizenship themes, such as online privacy and respectful digital communication, into persuasive writing and discourse. Students analyze asymmetries in informational power, real-world cases like Cambridge Analytica, and implications for Singapore's Personal Data Protection Act. It builds the ability to construct justified positions, a skill essential for General Paper and essay tasks.

Active learning suits this topic well. Role-plays of platform-user negotiations and structured debates on data framing make ethical complexities vivid. Students gain ownership of arguments through peer feedback, deepening empathy for privacy concerns and sharpening rhetorical precision in a relatable digital context.

Key Questions

  1. Evaluate the claim that surveillance capitalism constitutes a fundamentally new economic logic that operates outside existing frameworks of market accountability and therefore demands new regulatory categories rather than extensions of existing ones.
  2. Analyze the asymmetry of informational power between platforms and users and assess whether informed consent functions as a meaningful safeguard or as a legitimizing fiction for the extraction of behavioral data.
  3. Construct a position on whether personal data should be treated as a market commodity, a civil right, or a public good, and justify the legal and political implications that follow from each framing.

Learning Objectives

  • Critique the ethical implications of data commodification as presented in surveillance capitalism.
  • Analyze the power dynamics between technology platforms and individual users regarding data extraction.
  • Evaluate the effectiveness of informed consent as a safeguard in the context of behavioral data collection.
  • Synthesize arguments to construct a justified position on whether personal data should be treated as a commodity, a civil right, or a public good.
  • Compare and contrast existing regulatory frameworks with the proposed need for new categories to govern surveillance capitalism.

Before You Start

Introduction to Digital Citizenship

Why: Students need a foundational understanding of online safety, privacy, and respectful communication to engage with the ethical complexities of data commodification.

Principles of Persuasive Argumentation

Why: This topic requires students to analyze claims and construct their own justified positions, building upon skills in developing logical arguments and using evidence.

Key Vocabulary

Surveillance Capitalism: An economic system centered on the commodification of personal data, extracted through digital platforms to predict and influence user behavior for profit.
Data Commodification: The process of transforming personal information into a marketable product that can be bought, sold, or traded.
Informational Asymmetry: A situation where one party in a transaction or relationship possesses more or better information than the other, creating an imbalance of power.
Behavioral Data: Information collected about a person's actions, habits, and preferences, often gathered through online activity and device usage.
Algorithmic Accountability: The principle that algorithms and the systems that deploy them should be transparent, explainable, and subject to mechanisms for redress when they cause harm.

Watch Out for These Misconceptions

Common Misconception: Privacy settings give users full control over their data.

What to Teach Instead

Settings are often complex and default to maximum sharing, favoring platforms. Role-plays expose this asymmetry as users 'negotiate' under pressure. Peer discussions help students revise models toward realistic power dynamics.

Common Misconception: Surveillance capitalism is no different from TV advertising.

What to Teach Instead

It involves behavioral prediction and modification via vast data, not just exposure. Simulations of ad targeting versus behavior nudges clarify the shift. Collaborative mapping activities reveal the scale of data collection, correcting students' tendency to underestimate it.

Common Misconception: Informed consent always protects users effectively.

What to Teach Instead

Consent is often buried in fine print or presented as binary clicks, functioning as a legitimizing fiction. Debates unpack this, with students citing examples. Active sharing of personal consent experiences builds critical scrutiny.


Real-World Connections

  • The Cambridge Analytica scandal, where personal data from millions of Facebook users was harvested without consent to influence political campaigns, exemplifies the ethical concerns surrounding data commodification.
  • Singapore's Personal Data Protection Act (PDPA) outlines regulations for the collection, use, and disclosure of personal data, reflecting ongoing efforts to balance innovation with individual privacy rights.
  • The business models of companies like Google and Meta, which offer free services in exchange for user data, illustrate the core mechanics of surveillance capitalism and its impact on digital citizenship.

Assessment Ideas

Discussion Prompt

Pose the following to small groups: 'Imagine you are advising the Singaporean government. Based on Shoshana Zuboff's theories, should personal data be regulated as a commodity, a civil right, or a public good? Justify your choice with at least two specific implications for citizens and tech companies.'

Exit Ticket

Ask students to write on an index card: 'One argument for why informed consent is insufficient in surveillance capitalism, and one concrete example of how a tech platform might exploit informational asymmetry.'

Peer Assessment

Students draft a short persuasive paragraph arguing for one of the three data framings (commodity, civil right, public good). They then exchange paragraphs with a partner, providing feedback on the clarity of the argument and the strength of the justification using the rubric provided.

Frequently Asked Questions

What key texts teach surveillance capitalism in JC1 English?
Start with Shoshana Zuboff's 'The Age of Surveillance Capitalism' excerpts for core concepts. Supplement with articles from The Guardian on Cambridge Analytica and Singapore's PDPC guidelines. These provide argumentative fodder, real cases, and local relevance to build balanced positions on ethics and regulation.
How does surveillance capitalism link to MOE media literacy?
It advances critical evaluation of digital media claims, aligning with standards on online safety and privacy. Students assess platform rhetoric, data asymmetries, and ethical framings, building skills vital for digital citizenship and respectful communication in JC English tasks.
How can active learning help students grasp data ethics?
Activities like role-plays and debates turn abstract power imbalances into lived experiences, fostering empathy and nuanced views. Jigsaw case studies promote expertise sharing, while data audits personalize risks. These build ownership, improve argumentation, and make ethics memorable beyond lectures.
What regulatory implications arise from treating data as a public good?
Framing data as a public good implies state oversight like public utilities, with mandates for transparency and user ownership. In Singapore, this could extend the PDPA toward data trusts. Students explore this via debates, weighing the risk of stifling innovation against privacy gains to build balanced policy positions.