Computing · Year 8

Active learning ideas

Ethical AI: Privacy and Surveillance

Active learning works well for this topic because students need to confront their own assumptions about technology while grappling with real-world consequences. When they debate ethical dilemmas or design guidelines, they move beyond abstract concerns to see how privacy and surveillance play out in daily life.

National Curriculum Attainment Targets
KS3: Computing - Societal and Ethical Impacts
KS3: Computing - Digital Literacy

30–50 min · Pairs → Whole Class · 4 activities

Activity 01

Philosophical Chairs · 50 min · Small Groups

Debate Carousel: Safety vs Privacy

Assign each small group a role as citizens, police, or a tech firm; groups have 10 minutes to prepare 3 arguments. Groups then rotate between stations to debate opposing roles, recording key points as they go. End with a whole-class reflection on the compromises reached.

Evaluate the balance between public safety and individual privacy in AI surveillance systems.

Facilitation Tip: For the Debate Carousel, assign half the groups to argue for privacy and half for safety so students must engage with opposing views directly.

What to look for: Pose the following to small groups: 'Imagine you are designing a new AI-powered security system for your school. What data would it collect, and why? What are the potential privacy risks for students and staff? How would you ensure ethical data handling?' Facilitate a brief class share-out of key concerns and proposed solutions.

Analyze · Evaluate · Self-Awareness · Social Awareness

Activity 02

Philosophical Chairs · 35 min · Pairs

Guideline Design Workshop

Pairs list 5 ethical rules for AI surveillance, drawing on examples from the unit. Each pair pitches its rules to the class for feedback, then revises the guidelines collaboratively. Display the final sets for ongoing reference.

Critique the ethical implications of AI systems collecting vast amounts of personal data.

Facilitation Tip: In the Guideline Design Workshop, provide a template with clear sections so students focus on ethical reasoning rather than design aesthetics.

What to look for: Provide students with a scenario: 'An AI system can predict if a person is likely to commit a crime based on their online activity and location data.' Ask them to write: 1) One potential benefit of this system. 2) One significant ethical concern. 3) One guideline they would add to its development.

Analyze · Evaluate · Self-Awareness · Social Awareness

Activity 03

Jigsaw · 45 min · Small Groups

Case Study Jigsaw

Divide the class into expert groups, each researching the implications of a case such as UK CCTV networks or social credit systems for 15 minutes. Re-form into mixed groups where each expert teaches their case, then discuss what balanced guidelines would look like.

Design a set of ethical guidelines for the development of AI technologies.

Facilitation Tip: During the Case Study Jigsaw, assign each group a different case study so they bring back varied perspectives to the whole class.

What to look for: Present students with three short statements about AI and privacy (e.g., 'AI surveillance always improves public safety,' 'Personal data collected by AI is always secure,' 'Algorithmic bias is easily fixed'). Ask students to label each statement as 'True' or 'False' and provide a one-sentence justification for one of their choices.

Understand · Analyze · Evaluate · Relationship Skills · Self-Management

Activity 04

Philosophical Chairs · 30 min · Pairs

Scenario Role-Play

In pairs, students act out dilemmas such as responding to a data breach or requesting consent for data collection. Partners switch roles after 5 minutes, then the class debriefs on the ethical decisions made.

Evaluate the balance between public safety and individual privacy in AI surveillance systems.

Facilitation Tip: In the Scenario Role-Play, give students specific roles with conflicting interests to force them to negotiate ethical stances.

What to look for: Pose the following to small groups: 'Imagine you are designing a new AI-powered security system for your school. What data would it collect, and why? What are the potential privacy risks for students and staff? How would you ensure ethical data handling?' Facilitate a brief class share-out of key concerns and proposed solutions.

Analyze · Evaluate · Self-Awareness · Social Awareness

A few notes on teaching this unit

Teachers should frame this topic as a series of trade-offs rather than a binary choice between safety and privacy. Research shows that students grasp ethical dilemmas better when they see the human impact behind the technology, so emphasize real cases like biased facial recognition or data breaches in social media. Avoid presenting AI as an all-powerful force; instead, highlight its limitations and the agency of those who design and use it.

Students should demonstrate the ability to articulate trade-offs between safety and privacy, identify bias in AI systems, and propose ethical solutions. Look for nuanced arguments, evidence-based reasoning, and respectful collaboration during discussions.


Watch Out for These Misconceptions

  • During the Scenario Role-Play, some may claim that AI surveillance guarantees perfect public safety.

    Use the flawed detection scenario cards in the role-play to prompt students to share examples where the system flagged the wrong person, then guide them to discuss false positives and their consequences.

  • During the Debate Carousel, students might argue that individual privacy matters less than collective security.

    Have debaters refer to the safety vs privacy data cards provided, which include statistics on false positives and misidentifications, to ground their arguments in evidence rather than assumptions.

  • During the Case Study Jigsaw, students may assume that all personal data collection is harmful.

    Provide case studies that include both harmful and beneficial uses of data, then ask groups to categorise them and explain their reasoning to challenge overgeneralisation.
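For teachers who want to ground the false-positive discussion in concrete numbers, a short base-rate calculation can help. The sketch below uses purely illustrative figures (not taken from the unit's data cards) to show why, when genuine threats are rare, even a highly accurate system flags far more innocent people than real threats:

```python
# Illustrative base-rate calculation for an AI surveillance system.
# All figures are hypothetical, chosen only to demonstrate why false
# positives dominate when the condition being detected is rare.

population = 1_000_000        # people scanned by the system
actual_threats = 100          # genuine threats in that population
sensitivity = 0.99            # chance the system flags a real threat
false_positive_rate = 0.01    # chance it wrongly flags an innocent person

true_positives = actual_threats * sensitivity
false_positives = (population - actual_threats) * false_positive_rate

# Of everyone the system flags, what fraction is actually a threat?
precision = true_positives / (true_positives + false_positives)

print(f"True positives:  {true_positives:.0f}")
print(f"False positives: {false_positives:.0f}")
print(f"Chance a flagged person is a real threat: {precision:.1%}")
```

With these assumed figures, roughly 99 genuine threats are flagged alongside nearly 10,000 innocent people, so only about 1% of flags are correct. That single number gives students a concrete counterexample to 'AI surveillance guarantees perfect public safety'.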


Methods used in this brief