
Sources of Algorithmic Bias: Activities & Teaching Strategies

Algorithmic bias is abstract until students see how human choices shape technology outcomes. Active learning works here because students need to trace bias through real systems, not just hear about it. Moving from analysis to mapping to role-play builds concrete understanding and empathy, which helps students recognize bias in systems they will use and build.

9th Grade · Computer Science · 4 activities · 20–40 min

Learning Objectives

  1. Analyze specific examples to identify how human biases are encoded into algorithmic systems.
  2. Compare and contrast data bias and design bias, providing examples of each.
  3. Explain the social impact of algorithmic bias in at least two real-world scenarios, such as hiring or loan applications.
  4. Critique an algorithm's potential for bias by examining its data sources and intended function.

Want a complete lesson plan with these objectives? Generate a Mission

40 min·Small Groups

Case Study Analysis: Real Algorithmic Bias

Groups each receive one documented case of algorithmic bias (COMPAS, Amazon hiring tool, facial recognition accuracy, predictive policing). Each group identifies the source of bias, the affected group, and the real-world harm. Groups present their case using a shared analysis template, then the class maps patterns across all cases.

Prepare & details

Analyze how human prejudices can be encoded into software and the resulting social impact.

Facilitation Tip: During Case Study Analysis, assign each group a different real-world case so the class collectively sees multiple entry points for bias.

Setup: Groups at tables with case materials

Materials: Case study packet (3-5 pages), Analysis framework worksheet, Presentation template

Analyze · Evaluate · Create · Decision-Making · Self-Management

20 min·Pairs

Think-Pair-Share: Where Did the Bias Come From?

Show students a short description of a biased AI outcome (e.g., a loan approval system that denies more applications from a particular zip code). Individually, students trace back through the development process to identify at least two points where bias could have entered. Pairs compare their traces, then share with the class.
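If it helps to make the trace concrete, a minimal, hypothetical sketch of such a loan rule (below) can be projected alongside the prompt. The weights, zip codes, and threshold are invented for illustration only, so students can point at specific human choices: what the historical data encodes, which features were selected, and where the cutoff was set.

```python
# Hypothetical loan-approval rule for classroom tracing only.
# Every constant below is a human choice students can question.

# Bias entry point 1: the training data. These "historical approval rates"
# are invented here, but in a real system they would reflect past human
# decisions, including any past discrimination by zip code.
HISTORICAL_APPROVAL_RATE_BY_ZIP = {
    "02139": 0.82,
    "60629": 0.41,   # a lower past approval rate gets baked into the score
}

# Bias entry point 2: feature selection. Zip code is included even though
# it can act as a proxy for race or income.
def score_applicant(income, credit_score, zip_code):
    zip_factor = HISTORICAL_APPROVAL_RATE_BY_ZIP.get(zip_code, 0.5)
    return 0.3 * (income / 100_000) + 0.3 * (credit_score / 850) + 0.4 * zip_factor

# Bias entry point 3: the threshold. Someone chose 0.6; a different
# cutoff would change who gets denied.
def approve(income, credit_score, zip_code, threshold=0.6):
    return score_applicant(income, credit_score, zip_code) >= threshold

if __name__ == "__main__":
    # Same income and credit score, different zip codes -> different outcomes.
    print(approve(55_000, 700, "02139"))  # True
    print(approve(55_000, 700, "60629"))  # False
```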

Prepare & details

Differentiate between various sources of algorithmic bias (e.g., data bias, design bias).

Facilitation Tip: In Think-Pair-Share, ask students to list one human choice in the model’s pipeline before they discuss causes with a partner.

Setup: Standard classroom seating; students turn to a neighbor

Materials: Discussion prompt (projected or printed), Optional: recording sheet for pairs

Understand · Apply · Analyze · Self-Awareness · Relationship Skills
30 min·Small Groups

Bias Source Mapping: From Data to Decision

Using a simplified flowchart of how a hiring algorithm works (data collection, feature selection, model training, threshold setting, deployment), groups annotate each stage with the types of bias that could enter there. The class builds a composite map on the board showing how bias accumulates through a pipeline.
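If groups want something more concrete than the flowchart to annotate, a heavily simplified, hypothetical code skeleton of the same five stages (below) can be printed beside it. The data, feature names, and scoring rule are invented purely so each stage is something students can attach a comment to; it is not a real hiring model.

```python
# Hypothetical hiring-pipeline skeleton for the mapping activity.
# Each stage is a stub; the comment names one way bias could enter there.

def collect_data():
    # Data collection: resumes from past hires reflect who was hired before,
    # so historical discrimination shows up as "ground truth."
    return [
        {"years_exp": 6, "school_rank": 1, "hired": 1},
        {"years_exp": 6, "school_rank": 3, "hired": 0},
    ]

def select_features(records):
    # Feature selection: choosing "school_rank" quietly favors applicants
    # from expensive schools; it can proxy for family income.
    X = [(r["years_exp"], r["school_rank"]) for r in records]
    y = [r["hired"] for r in records]
    return X, y

def train_model(X, y):
    # Model training: the objective only rewards matching past decisions,
    # so the model learns past preferences, fair or not.
    avg_rank_hired = sum(x[1] for x, label in zip(X, y) if label) / max(sum(y), 1)
    return lambda applicant: 1.0 if applicant[1] <= avg_rank_hired else 0.3

def set_threshold(score):
    # Threshold setting: 0.5 is a human choice; moving it changes
    # which group absorbs the errors.
    return score >= 0.5

def deploy(model, applicant):
    # Deployment: the tool is applied to people who may look nothing like
    # the historical data it was trained on.
    return set_threshold(model(applicant))

if __name__ == "__main__":
    X, y = select_features(collect_data())
    model = train_model(X, y)
    print(deploy(model, (6, 1)))  # True
    print(deploy(model, (6, 3)))  # False
```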

Prepare & details

Explain how algorithmic bias can perpetuate or exacerbate existing inequalities.

Facilitation Tip: For Bias Source Mapping, have students use different colors to mark each stage of the pipeline so links between choices and outcomes are visible.

Setup: Groups at tables with the flowchart handout; board space for the class composite map

Materials: Simplified hiring algorithm flowchart handout, Colored markers or pens, Board or chart paper for the composite map

Analyze · Evaluate · Social Awareness · Self-Awareness
30 min·Individual

Perspective Role-Play: Who Is Harmed?

Students take on roles of people affected by a biased algorithm (loan applicant, job candidate, parolee, medical patient). Each writes a one-paragraph account from their perspective describing the decision they received and why it may be unfair. Class discusses whose perspective is typically absent from algorithmic development teams.

Prepare & details

Analyze how human prejudices can be encoded into software and the resulting social impact.

Facilitation Tip: In Perspective Role-Play, give each student a stakeholder role card with specific concerns to voice, so harm is grounded in lived experience.

Setup: Inner circle of 4-6 chairs, outer circle surrounding them

Materials: Stakeholder role cards, Discussion prompt or essential question, Observation notes template

Analyze · Evaluate · Social Awareness · Self-Awareness

Teaching This Topic

Teach this topic by making the invisible visible. Start with cases where students can see the people behind the data and the decisions behind the code. Avoid lectures that separate bias from the systems that produce it; instead, connect each stage of the pipeline to human values and trade-offs. Research shows that students grasp bias better when they analyze systems they recognize, so use familiar tools like social media or school apps as examples. Keep the focus on intervention points—where students could change the system—not just on identifying harm.

What to Expect

Students will identify specific human decisions that introduce bias and explain how those decisions lead to unfair outcomes. They will trace bias from data collection through model design to real-world impact. By the end, they should critique systems with evidence, not just opinions.

These activities are a starting point; a full mission is the complete experience:

  • Complete facilitation script with teacher dialogue
  • Printable student materials, ready for class
  • Differentiation strategies for every learner
Generate a Mission

Watch Out for These Misconceptions

Common Misconception: During Case Study Analysis, watch for students who say algorithms are objective because they use math, not opinions.

What to Teach Instead

During Case Study Analysis, redirect students to the case materials to locate the specific human decisions—like what data was labeled or which metric was optimized—that shaped the algorithm’s behavior and produced biased outcomes.

Common Misconception: During Think-Pair-Share, watch for students who assume bias only comes from biased training data.

What to Teach Instead

During Think-Pair-Share, ask students to review the full algorithm pipeline shown in the activity and identify at least two additional stages where bias could enter, using the case examples as evidence.

Common Misconception: During Perspective Role-Play, watch for students who believe removing sensitive attributes like race or gender fixes bias.

What to Teach Instead

During Perspective Role-Play, have students examine proxy variables in the role-play scenarios and explain how removing one attribute might not remove bias if correlated features remain.
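If the point needs a concrete demonstration, a toy example like the one below can help. The data is fabricated and the "model" is a single rule, but it shows group membership being reconstructed through a correlated zip code even though the group attribute is never read by the decision function.

```python
# Toy demonstration (fabricated data): dropping the protected attribute
# does not remove bias when a correlated proxy remains.

applicants = [
    {"group": "A", "zip_code": "11111", "score": 720},
    {"group": "A", "zip_code": "11111", "score": 710},
    {"group": "B", "zip_code": "22222", "score": 715},
    {"group": "B", "zip_code": "22222", "score": 725},
]

# A "fair" rule that never looks at group -- but it penalizes zip 22222,
# which in this fabricated data is perfectly correlated with group B.
def approve(applicant):
    penalty = 100 if applicant["zip_code"] == "22222" else 0
    return (applicant["score"] - penalty) >= 700

for group in ("A", "B"):
    members = [a for a in applicants if a["group"] == group]
    rate = sum(approve(a) for a in members) / len(members)
    print(f"Approval rate for group {group}: {rate:.0%}")
# Prints 100% for group A and 0% for group B, even though "group" was never used.
```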

Assessment Ideas

Exit Ticket

After Case Study Analysis, ask students to write one sentence identifying a specific human decision in the case that introduced bias and one sentence explaining how that decision led to unfair outcomes.

Discussion Prompt

After Think-Pair-Share, facilitate a class discussion where students respond to the question: 'If an algorithm is trained on historical data, how can it ever be truly fair?' Encourage them to reference specific types of bias and their real-world consequences from the activity examples.

Quick Check

During Bias Source Mapping, present students with two short case descriptions and ask them to categorize the primary source of bias in each case (data bias or design bias) and briefly justify their choice using the mapping tools.

Extensions & Scaffolding

  • Challenge: Ask students to redesign one part of the algorithm’s pipeline to reduce bias and present their solution with evidence (one possible audit step is sketched after this list).
  • Scaffolding: Provide partially completed bias source maps for students to fill in or add missing stages.
  • Deeper exploration: Have students research a current algorithmic system (e.g., job ads, credit scoring) and trace its bias sources in a short report with citations.
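For the challenge above, one redesign students could propose and defend is adding an audit stage to the pipeline. The sketch below is a hypothetical illustration: it compares approval rates across groups and flags large gaps, with the 0.8 cutoff echoing the common "four-fifths" rule of thumb for disparate impact and fabricated decisions standing in for real model output.

```python
# Hypothetical audit stage: compare approval rates across groups and flag
# any group whose rate falls well below the best-served group's rate.

def approval_rates(decisions):
    """decisions: list of (group, approved) pairs -> approval rate per group."""
    totals, approved = {}, {}
    for group, ok in decisions:
        totals[group] = totals.get(group, 0) + 1
        approved[group] = approved.get(group, 0) + (1 if ok else 0)
    return {g: approved[g] / totals[g] for g in totals}

def disparate_impact_flags(rates, cutoff=0.8):
    """Flag groups whose rate is below cutoff * the highest group's rate."""
    highest = max(rates.values())
    return {g: (r / highest) < cutoff for g, r in rates.items()}

if __name__ == "__main__":
    # Fabricated decisions standing in for a model's output.
    decisions = [("A", True), ("A", True), ("A", False),
                 ("B", True), ("B", False), ("B", False)]
    rates = approval_rates(decisions)
    print({g: round(r, 2) for g, r in rates.items()})  # {'A': 0.67, 'B': 0.33}
    print(disparate_impact_flags(rates))                # {'A': False, 'B': True}
```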

Key Vocabulary

Algorithmic Bias: Systematic and repeatable errors in a computer system that create unfair outcomes, such as privileging one arbitrary group of users over others.
Data Bias: Bias that occurs when the data used to train an algorithm is not representative of the real world or contains historical prejudices.
Design Bias: Bias introduced by the choices made by developers when designing an algorithm, including feature selection, objective functions, and evaluation metrics.
Proxy Variable: A variable that is correlated with a protected attribute (like race or gender) and can inadvertently lead to discrimination even if the protected attribute itself is not used.

Ready to teach Sources of Algorithmic Bias?

Generate a full mission with everything you need

Generate a Mission