Computer Science · 9th Grade

Active learning ideas

Ethical Implications of Algorithmic Predictions

Active learning works for this topic because algorithmic bias is abstract until students see it in real data. Case studies and debates make the ethical stakes tangible, helping students move from vague concerns to concrete analysis of how human choices shape algorithmic decisions.

Standards — CSTA: 3A-DA-12 · CSTA: 3A-IC-24
25–40 min · Pairs → Whole Class · 3 activities

Activity 01

Case Study Analysis · 40 min · Small Groups

Case Study Analysis: The COMPAS Recidivism Tool

Small groups read a summary of the ProPublica analysis of the COMPAS risk-scoring algorithm used in US courts. Groups identify: what the algorithm was designed to do, what ProPublica found it actually did, and who was harmed. Groups present their analysis and the class votes on whether the system should continue to be used.

Analyze the dangers of over-relying on algorithmic predictions for social issues.

Facilitation Tip: During the COMPAS case study, have students annotate the ProPublica article with three types of human decisions: data selection, model design, and fairness definition.

What to look for: Pose the question: 'Imagine an algorithm is used to decide which students receive extra academic support. What kinds of data might be used, and how could that data lead to unfair outcomes for certain students?' Facilitate a class discussion, guiding students to identify potential biases.

Analyze · Evaluate · Create · Decision-Making · Self-Management

Activity 02

Formal Debate · 35 min · Whole Class

Formal Debate: Should Algorithms Make Hiring Decisions?

Half the class argues for algorithmic hiring screening; the other half argues against. Each side has five minutes to prepare, three minutes to present, and two minutes for rebuttal. After the debate, students individually write a one-paragraph position statement that acknowledges the strongest argument on the opposing side.

Critique the use of predictive algorithms in sensitive areas like criminal justice or hiring.

Facilitation Tip: In the hiring debate, assign roles explicitly so students speak from the perspective of job seekers, hiring managers, and data scientists to surface conflicting values.

What to look for: Provide students with a brief scenario about an algorithm used for college admissions. Ask them to write two sentences explaining one potential ethical concern and one sentence suggesting a way to mitigate that concern.

Analyze · Evaluate · Create · Self-Management · Decision-Making

Activity 03

Think-Pair-Share · 25 min · Pairs

Think-Pair-Share: Fairness Definitions Clash

Present two definitions of algorithmic fairness that cannot both be satisfied when groups have different base rates: equal error rates across groups vs. equal prediction accuracy for each group. Students individually decide which definition they find fairer and why. Partners share, then the class discusses why there is no universally correct answer.
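For teachers who want a concrete artifact to project, the clash between the two fairness definitions can be shown with a small sketch. The numbers below are invented for illustration (not drawn from COMPAS or any real dataset): two groups end up with identical accuracy while their false-positive rates diverge, because the groups' base rates differ.

```python
# Toy illustration with invented data: "equal accuracy" and "equal error
# rates" are different fairness definitions, and a classifier can satisfy
# one while violating the other when base rates differ across groups.

def rates(labels, preds):
    """Return (accuracy, false_positive_rate) for one group.

    labels: 1 = the outcome actually occurred, 0 = it did not.
    preds:  1 = the algorithm predicted the outcome, 0 = it did not.
    """
    tp = sum(1 for y, p in zip(labels, preds) if y == 1 and p == 1)
    tn = sum(1 for y, p in zip(labels, preds) if y == 0 and p == 0)
    fp = sum(1 for y, p in zip(labels, preds) if y == 0 and p == 1)
    accuracy = (tp + tn) / len(labels)
    negatives = sum(1 for y in labels if y == 0)
    fpr = fp / negatives  # share of true negatives wrongly flagged
    return accuracy, fpr

# Group A: base rate 0.5 (half the group truly has the outcome)
acc_a, fpr_a = rates([1, 1, 1, 1, 0, 0, 0, 0], [1, 1, 1, 1, 1, 0, 0, 0])
# Group B: base rate 0.125 (one in eight truly has the outcome)
acc_b, fpr_b = rates([1, 0, 0, 0, 0, 0, 0, 0], [1, 1, 0, 0, 0, 0, 0, 0])

print(f"Group A: accuracy={acc_a:.3f}, FPR={fpr_a:.3f}")  # accuracy=0.875, FPR=0.250
print(f"Group B: accuracy={acc_b:.3f}, FPR={fpr_b:.3f}")  # accuracy=0.875, FPR=0.143
```

Both groups see 87.5% accuracy, yet Group A's members are wrongly flagged at nearly twice Group B's rate. This is the trade-off students are asked to weigh: equalizing one metric generally unbalances the other.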

Predict the societal consequences of biased algorithmic predictions.

Facilitation Tip: For the fairness definitions activity, provide a chart of four fairness metrics and ask pairs to plot where their intuitions align or clash before sharing with the class.

What to look for: Present students with a short, anonymized case study of an algorithmic decision (e.g., loan denial). Ask them to identify the potential source of bias in 1–2 sentences and state whether the outcome seems fair. Collect responses to gauge understanding.

Understand · Apply · Analyze · Self-Awareness · Relationship Skills

A few notes on teaching this unit

Teach this topic by foregrounding the human choices embedded in every algorithmic system. Avoid presenting math as neutral; instead, show how error metrics, datasets, and thresholds embed values. Use jargon like ‘disparate impact’ only after students have grappled with concrete cases where fairness is contested.

Successful learning looks like students tracing human choices in algorithm design, identifying fairness trade-offs, and articulating why data alone does not guarantee objectivity. They should use fairness metrics to critique outcomes, not just technical accuracy.


Watch Out for These Misconceptions

  • During the COMPAS case study analysis, watch for the claim that algorithms are objective because they use math and data, not human opinions.

    During the COMPAS case study analysis, have students trace the human choices in the article: ask them to list what data ProPublica used, what outcome the algorithm predicted, and which fairness definition was implicitly assumed by the tool’s designers.

  • During the hiring debate, watch for the assumption that a highly accurate algorithm is a fair one.

    During the hiring debate, require students to present accuracy rates by demographic group and ask them to define fairness before evaluating the model’s trade-offs.
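The "accuracy by demographic group" requirement from the hiring-debate tip can be demonstrated with a short sketch. All numbers are hypothetical, invented for classroom discussion: the overall accuracy looks strong, but disaggregating by group shows the model errs far more often for one group.

```python
# Hypothetical hiring-screen records, invented for discussion:
# (group, true_label, prediction); 1 = qualified / advanced, 0 = not.
records = [
    ("X", 1, 1), ("X", 1, 1), ("X", 0, 0), ("X", 0, 0), ("X", 1, 1),
    ("Y", 1, 0), ("Y", 0, 1), ("Y", 0, 0), ("Y", 1, 1), ("Y", 0, 0),
]

def accuracy(rows):
    """Fraction of rows where the prediction matches the true label."""
    return sum(1 for _, y, p in rows if y == p) / len(rows)

overall = accuracy(records)
by_group = {g: accuracy([r for r in records if r[0] == g]) for g in ("X", "Y")}

print(f"Overall accuracy: {overall:.0%}")   # 80%
print(f"By group: {by_group}")              # X: 1.0, Y: 0.6
```

A single headline number (80%) hides that every error falls on Group Y, which is exactly the pattern students should look for before calling the model "fair because it's accurate."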


Methods used in this brief