Computer Science · 11th Grade

Active learning ideas

AI and Privacy Concerns

Active learning works well for AI and privacy because abstract concepts like data aggregation and legal gaps become concrete when students audit their own devices or draft policies. Hands-on tasks help students confront the reality that privacy trade-offs are personal, not theoretical.

Standards: CSTA 3B-IC-24 · CSTA 3B-IC-25
25–40 min · Pairs → Whole Class · 4 activities

Activity 01

Socratic Seminar · 30 min · Pairs

Personal Data Audit: Your App Permissions

Students examine the permissions requested by five apps on their own devices (or a provided list). They categorize permissions by data type, research what each permission enables the app to collect, and present findings on whether each permission seems necessary for the app's stated function.

Analyze how AI technologies can impact individual privacy and data security.

Facilitation Tip: During the Personal Data Audit, ask students to screen-record their permission toggles so they see exactly what 'location always' or 'contacts access' means in practice.

What to look for: Facilitate a class debate using the prompt: 'Should companies be allowed to collect and sell user data for AI training if they provide a free service?' Ask students to support their arguments with specific examples of AI applications and privacy implications.

Analyze · Evaluate · Create · Social Awareness · Relationship Skills

Activity 02

Structured Academic Controversy: Facial Recognition in Schools

Pairs argue for implementing facial recognition for school security, then switch and argue against it on privacy grounds. After both rounds, partners propose a policy framework that addresses both the safety rationale and the privacy risks.

Explain the concept of 'surveillance capitalism' in the context of AI.

Facilitation Tip: When running the Structured Academic Controversy, assign roles as data subject, school administrator, vendor representative, and policy analyst to force perspective-taking.

What to look for: Present students with a scenario describing a new AI-powered app (e.g., a personalized news aggregator). Ask them to identify 2-3 types of personal data the app might collect, 1 potential privacy risk associated with that data, and 1 specific privacy safeguard they would recommend.

Analyze · Evaluate · Create · Social Awareness · Relationship Skills

Activity 03

Think-Pair-Share · 25 min · Pairs

Think-Pair-Share: Is Surveillance Capitalism Inevitable?

Present a brief reading on surveillance capitalism. Students write an individual response to whether regulation or alternative business models can change this dynamic, discuss with a partner, then share the most substantive points of disagreement with the class.

Critique current regulations and propose new safeguards to protect privacy in an AI-driven world.

Facilitation Tip: Use the Think-Pair-Share to first isolate students’ gut reactions about surveillance capitalism before introducing any definitions or data.

What to look for: On an index card, have students define 'surveillance capitalism' in their own words and provide one example of how AI amplifies this business model. They should also list one current US privacy law and briefly explain its limitation regarding AI data collection.

Understand · Apply · Analyze · Self-Awareness · Relationship Skills

Activity 04

Socratic Seminar · 40 min · Small Groups

Privacy Policy Drafting Workshop

Small groups are given a fictional app concept and asked to draft a one-page privacy policy that actually explains what data is collected and why. Groups then swap policies and evaluate each other's for clarity and completeness using a provided rubric.

Analyze how AI technologies can impact individual privacy and data security.

Facilitation Tip: In the Privacy Policy Drafting Workshop, give teams sample student handbooks with redacted clauses so they practice identifying missing protections before writing new ones.

What to look for: Facilitate a class debate using the prompt: 'Should companies be allowed to collect and sell user data for AI training if they provide a free service?' Ask students to support their arguments with specific examples of AI applications and privacy implications.

Analyze · Evaluate · Create · Social Awareness · Relationship Skills

A few notes on teaching this unit

Start with students’ lived experience: have them list every app on their phones before the unit begins. Research shows that concrete, self-relevant examples improve retention of abstract privacy concepts. Avoid lecturing on legal minutiae; instead, let students discover COPPA’s limitations by auditing an app popular with under-13 users. Emphasize that privacy is a continuum, not a binary, and that their own decisions (updating settings, reading policies) shape outcomes.

Students will move from vague worry to specific critique, able to identify what data apps collect, where laws fall short, and how surveillance capitalism shapes their digital lives. They will articulate arguments using examples and propose workable safeguards.


Watch Out for These Misconceptions

  • During Personal Data Audit, watch for students who dismiss harmless-looking permissions like ‘camera access’ or ‘storage access’ as inconsequential.

    Guide them to trace how seemingly benign permissions combine with location and contact data to infer sensitive traits (e.g., frequent visits to a clinic). Ask them to map the data trail from their phone to AI training datasets.

  • During Structured Academic Controversy on facial recognition in schools, listen for claims that COPPA automatically protects all students.

Prompt them to consult the COPPA rule text and note that it applies only to children under 13, and only to services directed at children or operators with actual knowledge that they collect children's data. Then have students cite specific gaps they find in their audit data.

  • During Think-Pair-Share on surveillance capitalism, expect students to assume private browsing and VPNs guarantee anonymity.

    Use the activity to run a live demo: have students visit a site that shows their device fingerprinting score before and after enabling private browsing, then discuss why fingerprinting persists.


Methods used in this brief