Computer Science · 10th Grade

Active learning ideas

Ethical Considerations in Data Management

Active learning works for ethical data management because students need to apply abstract principles to concrete situations they encounter daily. When they debate policies, audit real documents, and examine case studies, they connect technical details to human impacts in ways that lectures alone cannot.

Standards: CSTA 3A-IC-24 · CSTA 3A-IC-25
30–50 min · Pairs → Whole Class · 4 activities

Activity 01

Formal Debate · 50 min · Small Groups

Formal Debate: Data Collection Policy

Present a scenario where a school wants to install AI attendance tracking using facial recognition. Groups are assigned stakeholder roles (students, parents, administrators, civil liberties advocates, technology vendors) and must argue their position in a structured town hall format, then negotiate a policy that addresses the core concerns of each group.

Evaluate the ethical responsibilities of organizations handling personal data.

Facilitation Tip: During the Formal Debate, assign clear roles (proposer, opponent, questioner) to ensure all students engage with the ethical trade-offs of data collection policies.

What to look for: Present students with a scenario: 'A social media company wants to use user posts to train a new AI model for content moderation. What ethical questions should they consider regarding user privacy and data ownership? What steps should they take to ensure informed consent?'

Analyze · Evaluate · Create · Self-Management · Decision-Making

Activity 02

Inquiry Circle · 40 min · Small Groups

Inquiry Circle: Privacy Policy Audit

Small groups select a popular app (social media, gaming, educational) and analyze its actual privacy policy against a provided checklist covering data collected, stated purposes, third-party sharing, retention periods, and user rights. Groups report findings to the class and vote on which policy is most and least protective of user interests.

Analyze how data collection practices can infringe on individual privacy.

Facilitation Tip: For the Privacy Policy Audit, provide a rubric with specific criteria like 'consent language clarity' and 'data retention limits' to guide students' close reading of real documents.

What to look for: Ask students to write down one specific example of data collection they encountered today (e.g., app permission, website cookie). Then, have them explain one potential ethical concern related to that collection and suggest one way to mitigate it.

Analyze · Evaluate · Create · Self-Management · Self-Awareness

Activity 03

Think-Pair-Share · 30 min · Pairs

Think-Pair-Share: The Bias Audit

Present students with a dataset showing demographic disparities in loan approval rates from an algorithmic system. Pairs discuss whether the disparity constitutes bias, what data might have caused it, and whether the company bears responsibility. The class then hears from each pair and builds a shared framework for evaluating algorithmic fairness.
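For teachers who want a concrete artifact to project, the disparity in the dataset can be quantified as a simple approval-rate gap. This is a minimal sketch in Python; the group names and counts are invented for illustration, not taken from any real dataset:

```python
# Hypothetical loan-approval counts by demographic group (illustrative numbers only).
approvals = {"Group A": 720, "Group B": 410}
applications = {"Group A": 1000, "Group B": 1000}

# Approval rate for each group.
rates = {group: approvals[group] / applications[group] for group in approvals}

# One common fairness measure: the gap between the highest and lowest approval rates
# (a demographic-parity difference). A gap of 0 would mean equal rates.
disparity = max(rates.values()) - min(rates.values())

print(rates)      # approval rate per group
print(disparity)  # ≈ 0.31 with these invented numbers
```

A gap this size is a starting point for discussion, not a verdict: students can debate whether the underlying applications differed in relevant ways, which connects directly to the "is this bias?" question in the pair discussion.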

Justify policies that protect user data while enabling beneficial data analysis.

Facilitation Tip: In the Think-Pair-Share Bias Audit, give students 2 minutes to individually list biases before pairing to compare notes, then 3 minutes to share with the class.

What to look for: Provide students with a short case study about a data breach. Ask them to identify: 1) What type of data was compromised? 2) What were the potential consequences for individuals? 3) What preventative measures could the organization have implemented?

Understand · Apply · Analyze · Self-Awareness · Relationship Skills

Activity 04

Gallery Walk · 45 min · Small Groups

Gallery Walk: Data Ethics Case Studies

Post six real-world data ethics case studies (Cambridge Analytica, health app data sales, predictive policing, credit scoring algorithms, Clearview AI, student data brokers). Student groups rotate and annotate each case with the harm caused, who was responsible, and what policy or technical change would have prevented it.

Evaluate the ethical responsibilities of organizations handling personal data.

Facilitation Tip: During the Gallery Walk Case Studies, place printed case studies at stations with a focus question like 'Who benefits and who is harmed?' to direct student attention.

What to look for: Present students with a scenario: 'A social media company wants to use user posts to train a new AI model for content moderation. What ethical questions should they consider regarding user privacy and data ownership? What steps should they take to ensure informed consent?'

Understand · Apply · Analyze · Create · Relationship Skills · Social Awareness

A few notes on teaching this unit

Teachers approach this topic by grounding abstract ethics in students' lived experiences with apps, social media, and school data systems. Avoid presenting data ethics as a purely technical issue; instead, frame it as a civic skill. Research suggests students learn best when they see how bias and surveillance affect people they know, so use local examples whenever possible.

Successful learning looks like students questioning assumptions, citing specific data ethics principles, and proposing actionable solutions rather than reciting definitions. They should be able to articulate trade-offs between utility and privacy, and recognize bias in both data and algorithms.


Watch Out for These Misconceptions

  • During the Formal Debate, watch for students claiming anonymized data is always safe. Redirect them by asking, 'What if someone combines this dataset with public voter records? Could identities still be revealed?'

    During the Privacy Policy Audit, provide students with a real anonymized dataset (e.g., NYC taxi trip data) and have them try to re-identify individuals using supplementary public data like news articles or social media posts.

  • During the Think-Pair-Share Bias Audit, watch for students assuming algorithms are neutral because they use math. Redirect them by asking, 'What goals did the designers prioritize when creating this algorithm? Who might have been left out?'

    During the Gallery Walk Case Studies, display examples of biased algorithmic outcomes (e.g., facial recognition errors, hiring tool discrimination) and have students trace the bias back to training data choices or design decisions in small groups.
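The re-identification risk behind the first misconception can be made concrete in a few lines of code. This is a minimal sketch of a linkage attack in Python; the ride and sighting records are invented stand-ins for the NYC taxi data and timestamped public photos mentioned above:

```python
# Hypothetical "anonymized" ride records: names are removed, but quasi-identifiers
# (pickup time and location) remain in the data.
rides = [
    {"pickup_time": "2013-09-07 23:15", "pickup_spot": "Greenwich & Charles", "fare": 14.50},
    {"pickup_time": "2013-09-08 01:40", "pickup_spot": "W 44th & 8th Ave", "fare": 9.00},
]

# Hypothetical public auxiliary data, e.g. tabloid photos with captions and timestamps.
sightings = [
    {"time": "2013-09-07 23:15", "place": "Greenwich & Charles", "person": "Celebrity X"},
]

# Linkage attack: join the two datasets on the shared quasi-identifiers.
# Any match attaches a name to a supposedly anonymous record.
reidentified = [
    {**ride, "person": s["person"]}
    for ride in rides
    for s in sightings
    if ride["pickup_time"] == s["time"] and ride["pickup_spot"] == s["place"]
]

print(reidentified)  # the matched ride, now with a name and fare attached
```

Even with only two rides and one sighting, the join recovers who paid what, which is exactly the point students should take away: removing names is not the same as anonymizing.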


Methods used in this brief