
AI and Privacy Concerns: Activities & Teaching Strategies

Active learning works well for AI and privacy because abstract concepts like data aggregation and legal gaps become concrete when students audit their own devices or draft policies. Hands-on tasks help students confront the reality that privacy trade-offs are personal, not theoretical.

11th Grade · Computer Science · 4 activities · 25–40 min

Learning Objectives

  1. Analyze the methods AI systems use to collect, aggregate, and process personal data.
  2. Explain the business model of surveillance capitalism and its reliance on AI for behavioral profiling.
  3. Critique the effectiveness of current US privacy regulations (e.g., COPPA, CCPA) in the context of AI data collection.
  4. Design a set of privacy safeguards for a hypothetical AI-driven application.


30 min·Pairs

Personal Data Audit: Your App Permissions

Students examine the permissions requested by five apps on their own devices (or a provided list). They categorize permissions by data type, research what each permission enables the app to collect, and present findings on whether each permission seems necessary for the app's stated function.

Prepare & details

Analyze how AI technologies can impact individual privacy and data security.

Facilitation Tip: During the Personal Data Audit, ask students to screen-record their permission toggles so they see exactly what ‘location always’ or ‘contacts access’ means in practice.

Setup: Pairs seated together with a device or printed permission list

Materials: List of five apps and their requested permissions (for students without devices), permission categorization worksheet

Analyze · Evaluate · Create · Social Awareness · Relationship Skills
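For classes that want a computational angle on the audit, the permission tallies can be scripted in a few lines. This is a toy sketch: the app names and permission sets below are invented for illustration, not real audit results.

```python
from collections import Counter

# Hypothetical audit data: five made-up apps and the permissions they request.
audit = {
    "WeatherNow":   {"location", "network"},
    "ChatVibe":     {"contacts", "camera", "microphone", "network"},
    "FlashGame":    {"storage", "network", "location"},
    "StudyPlanner": {"storage", "network"},
    "PhotoFilter":  {"camera", "storage", "network", "contacts"},
}

# Count how many of the five apps request each permission.
permission_counts = Counter(p for perms in audit.values() for p in perms)

# Flag permissions requested by a majority of apps -- good candidates for the
# "is this really necessary for the app's stated function?" discussion.
common = {p for p, n in permission_counts.items() if n >= 3}
print(sorted(common))  # ['network', 'storage']
```

Students can extend the same structure with their own audit data and ask why a weather app and a game both want location access.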

Structured Academic Controversy: Facial Recognition in Schools

Pairs argue for implementing facial recognition for school security, then switch and argue against it on privacy grounds. After both rounds, partners propose a policy framework that addresses both the safety rationale and the privacy risks.

Prepare & details

Critique current regulations and propose new safeguards to protect privacy in an AI-driven world.

Facilitation Tip: When running the Structured Academic Controversy, assign roles as data subject, school administrator, vendor representative, and policy analyst to force perspective-taking.

Setup: Pairs of desks facing each other

Materials: Position briefs (both sides), Note-taking template, Consensus statement template

Analyze · Evaluate · Create · Social Awareness · Relationship Skills
25 min·Pairs

Think-Pair-Share: Is Surveillance Capitalism Inevitable?

Present a brief reading on surveillance capitalism. Students write an individual response to whether regulation or alternative business models can change this dynamic, discuss with a partner, then share the most substantive points of disagreement with the class.

Prepare & details

Explain the concept of 'surveillance capitalism' in the context of AI.

Facilitation Tip: Use the Think-Pair-Share to first isolate students’ gut reactions about surveillance capitalism before introducing any definitions or data.

Setup: Standard classroom seating; students turn to a neighbor

Materials: Discussion prompt (projected or printed), Optional: recording sheet for pairs

Understand · Apply · Analyze · Self-Awareness · Relationship Skills
40 min·Small Groups

Privacy Policy Drafting Workshop

Small groups are given a fictional app concept and asked to draft a one-page privacy policy that actually explains what data is collected and why. Groups then swap policies and evaluate each other's for clarity and completeness using a provided rubric.

Prepare & details

Analyze how AI technologies can impact individual privacy and data security.

Facilitation Tip: In the Privacy Policy Drafting Workshop, give teams sample student handbooks with redacted clauses so they practice identifying missing protections before writing new ones.

Setup: Small groups at clustered desks

Materials: Fictional app concept cards, one-page policy drafting template, peer-evaluation rubric (clarity and completeness)

Analyze · Evaluate · Create · Social Awareness · Relationship Skills

Teaching This Topic

Start with students’ lived experience: have them list every app on their phones before the unit begins. Research shows that concrete, self-relevant examples improve retention of abstract privacy concepts. Avoid lecturing on legal minutiae; instead, let students discover COPPA’s limitations by auditing an app popular with under-13 users. Emphasize that privacy is a continuum, not a binary, and that their own decisions (updating settings, reading policies) shape outcomes.

What to Expect

Students will move from vague worry to specific critique, able to identify what data apps collect, where laws fall short, and how surveillance capitalism shapes their digital lives. They will articulate arguments using examples and propose workable safeguards.


Watch Out for These Misconceptions

Common Misconception

During Personal Data Audit, watch for students who dismiss harmless-looking permissions like ‘camera access’ or ‘storage access’ as inconsequential.

What to Teach Instead

Guide them to trace how seemingly benign permissions combine with location and contact data to infer sensitive traits (e.g., frequent visits to a clinic). Ask them to map the data trail from their phone to AI training datasets.
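The clinic inference above can be made concrete in a few lines of Python. This is a toy sketch: the visit log is entirely invented, and a real profiler would operate on raw location pings rather than labeled places.

```python
from collections import Counter

# Invented location history: each record alone looks harmless.
visits = [
    ("2024-03-04", "coffee shop"),
    ("2024-03-06", "health clinic"),
    ("2024-03-13", "health clinic"),
    ("2024-03-20", "health clinic"),
    ("2024-03-21", "grocery store"),
]

place_counts = Counter(place for _, place in visits)

# A naive profiler flags any place visited 3+ times as a "routine".
# Aggregation surfaces the clinic -- a sensitive inference that no
# single location ping would have revealed on its own.
routines = [place for place, n in place_counts.items() if n >= 3]
print(routines)  # ['health clinic']
```

Walking through this with students makes the abstract claim "aggregation creates sensitivity" directly inspectable.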

Common Misconception

During Structured Academic Controversy on facial recognition in schools, listen for claims that COPPA automatically protects all students.

What to Teach Instead

Prompt them to consult the COPPA rule text and note that it covers only under-13 users and apps that knowingly target children, then cite specific gaps they find in their audit data.

Common Misconception

During Think-Pair-Share on surveillance capitalism, expect students to assume private browsing and VPNs guarantee anonymity.

What to Teach Instead

Use the activity to run a live demo: have students visit a site that shows their device fingerprinting score before and after enabling private browsing, then discuss why fingerprinting persists.
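To preview why the demo behaves that way, here is a simplified Python sketch of how a fingerprint can be derived from stable device attributes. The attribute names and values are invented, and real fingerprinting uses many more signals (canvas rendering, installed plugins, audio stack), but the core point holds: private browsing clears cookies, not these attributes.

```python
import hashlib

def fingerprint(attrs: dict) -> str:
    """Hash passively observable device attributes into a short ID."""
    canonical = "|".join(f"{k}={attrs[k]}" for k in sorted(attrs))
    return hashlib.sha256(canonical.encode()).hexdigest()[:12]

# Invented device attributes, visible to a site in any browsing mode.
device = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64)",
    "screen": "1920x1080",
    "timezone": "UTC-5",
    "fonts": "Arial,Helvetica,Times",
}

normal_session = fingerprint(device)
private_session = fingerprint(device)  # same attributes, same fingerprint
print(normal_session == private_session)  # True
```

Because the hash depends only on attributes the browser exposes regardless of mode, the "anonymous" private session produces the identical identifier.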

Assessment Ideas

Discussion Prompt

After Structured Academic Controversy on facial recognition in schools, facilitate a follow-up debate using the prompt: 'Should companies be allowed to collect and sell user data for AI training if they provide a free service?' Ask students to support their arguments with examples from their audits and policy drafts.

Quick Check

After Personal Data Audit, present students with a scenario describing a new AI-powered app (e.g., a personalized news aggregator). Ask them to identify three types of personal data the app might collect, one potential privacy risk, and one specific safeguard they would recommend based on their audit findings.

Exit Ticket

During Privacy Policy Drafting Workshop, have students define ‘surveillance capitalism’ in their own words and provide one example from their audit data. They should also list one current US privacy law and explain its limitation regarding AI data collection using their policy draft as evidence.

Extensions & Scaffolding

  • Challenge: Ask students who finish the Privacy Policy Drafting Workshop to revise their draft using CCPA’s opt-out language and compare the two versions side-by-side.
  • Scaffolding: Provide a partially completed data audit table with pre-filled examples of Facebook and TikTok permissions to guide students who struggle with technical jargon.
  • Deeper exploration: Invite a local privacy attorney or technologist for a 20-minute Q&A session where students present their audit findings and policy drafts for real-world feedback.

Key Vocabulary

Personal Data: Information that can be used to identify, locate, or contact an individual, including online identifiers, location data, and biometric information.
Surveillance Capitalism: An economic system centered on the commodification of personal data, where companies collect vast amounts of user information to predict and influence behavior for profit.
Behavioral Profiling: The process of creating detailed profiles of individuals based on their online activities, preferences, and behaviors, often used for targeted advertising and other purposes.
Data Aggregation: The process of collecting and combining data from various sources into a single, unified view, often used by AI systems to build comprehensive user profiles.
Algorithmic Bias: Systematic and repeatable errors in an AI system that create unfair outcomes, such as privileging one arbitrary group of users over others.

Ready to teach AI and Privacy Concerns?

Generate a full mission with everything you need.