
Bias in Algorithms and Data: Activities & Teaching Strategies

Active learning helps students confront bias in algorithms by making abstract concepts tangible. When students role-play flawed hiring systems or audit biased datasets, they see firsthand how data choices shape outcomes.

Year 5 · Technologies · 4 activities · 30–45 min each

Learning Objectives

  1. Analyze examples of search engine results or social media feeds to identify how human biases might be reflected.
  2. Explain the concept of fairness in data collection, considering how diverse sources and inclusive testing impact outcomes.
  3. Critique a given algorithm or dataset for potential biases, proposing specific changes to promote equitable results.
  4. Compare the potential impact of biased versus unbiased algorithms on different user groups.

Want a complete lesson plan with these objectives? Generate a Mission

45 min·Small Groups

Role-Play: Biased Hiring Algorithm

Divide the class into teams representing job applicants from varied backgrounds. One team writes a simple 'algorithm' on paper as if-then rules that favor certain traits. The other teams test it, record unfair outcomes, and redesign the rules for fairness. Discuss the results as a class.
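For teachers who want to show what the paper rules look like as real code, here is a minimal sketch of a biased if-then screen. The trait names, postcodes, and applicants are invented for illustration; they are not part of the lesson materials.

```python
def biased_screen(applicant):
    """Toy if-then hiring rules that quietly favor certain traits."""
    if applicant["years_experience"] < 5:
        return "reject"  # excludes capable early-career applicants
    if applicant["postcode"] not in {"2000", "2010"}:
        return "reject"  # postcode acts as a hidden proxy for background
    return "interview"

# Hypothetical applicants to run through the rules
applicants = [
    {"name": "Ana", "years_experience": 7, "postcode": "2000"},
    {"name": "Ben", "years_experience": 3, "postcode": "2000"},
    {"name": "Chi", "years_experience": 9, "postcode": "3150"},
]

for a in applicants:
    print(a["name"], "->", biased_screen(a))
```

Students can see that the rules never mention any group directly, yet the postcode check still filters applicants unevenly, which mirrors how proxy variables introduce bias in real systems.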

Prepare & details

Analyze how human biases can be reflected in technology.

Facilitation Tip: During the Role-Play: Biased Hiring Algorithm activity, assign clear roles and restrict discussion to 10 minutes so students experience decision pressure without losing focus.

Setup: Groups at tables with case materials

Materials: Case study packet (3-5 pages), Analysis framework worksheet, Presentation template

Analyze · Evaluate · Create · Decision-Making · Self-Management
30 min·Pairs

Data Audit: Spot the Bias

Provide printed datasets on toy preferences by gender. In pairs, students tally imbalances, hypothesize causes, and suggest diverse data additions. Groups share audits on a class chart to visualize patterns.
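If the class wants to check its tallies digitally, a short script can count responses per group and flag imbalance. The rows below are made-up sample data, not the lesson's printed dataset.

```python
from collections import Counter

# Toy survey rows: (respondent group, toy chosen) - invented for illustration
rows = [
    ("girl", "doll"), ("girl", "doll"), ("girl", "blocks"),
    ("boy", "truck"), ("boy", "truck"), ("boy", "truck"), ("boy", "blocks"),
]

group_counts = Counter(group for group, _ in rows)
total = len(rows)

# Show each group's share of the dataset to reveal sampling imbalance
for group, n in group_counts.items():
    print(f"{group}: {n}/{total} responses ({n / total:.0%})")
```

Even this tiny dataset shows one group over-represented, which sets up the discussion about what extra data would balance it.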

Prepare & details

Explain the concept of fairness in data collection and algorithm design.

Facilitation Tip: For the Data Audit: Spot the Bias activity, provide a dataset with obvious imbalances so students can practice spotting patterns before tackling subtler cases.

Setup: Groups at tables with case materials

Materials: Case study packet (3-5 pages), Analysis framework worksheet, Presentation template

Analyze · Evaluate · Create · Decision-Making · Self-Management
35 min·Small Groups

Critique Challenge: Facial Recognition Test

Show short videos of biased facial recognition demos. Individually note failures, then in small groups propose fixes like better training data. Present one improvement per group to the class.
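To make the demo failures concrete, teachers could show how per-group accuracy is compared. The numbers below are hypothetical, invented to mirror the gaps reported in real facial-recognition audits.

```python
# Hypothetical per-group results for a face-matching demo
results = {
    "group_a": {"correct": 95, "total": 100},
    "group_b": {"correct": 70, "total": 100},
}

# Accuracy per group, and the gap between best and worst
accuracy = {g: r["correct"] / r["total"] for g, r in results.items()}
gap = max(accuracy.values()) - min(accuracy.values())

for group, acc in accuracy.items():
    print(f"{group}: {acc:.0%} accurate")
print(f"accuracy gap between groups: {gap:.0%}")
```

The single "gap" number gives groups a concrete target: a proposed fix, such as more diverse training data, should shrink it.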

Prepare & details

Critique examples of biased technology and propose improvements.

Facilitation Tip: In the Critique Challenge: Facial Recognition Test activity, remind students to record both technical limits and social impacts, not just accuracy numbers.

Setup: Groups at tables with case materials

Materials: Case study packet (3-5 pages), Analysis framework worksheet, Presentation template

Analyze · Evaluate · Create · Decision-Making · Self-Management
40 min·Whole Class

Fair Survey Design: Whole Class Poll

As a class, brainstorm survey questions on school lunch preferences. Vote on potentially biased ones, revise for inclusivity, then collect and analyze data. Graph results to check for fairness.
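A quick text bar chart is enough to graph the poll results without any charting software. The lunch options and vote counts below are invented placeholders for whatever the class collects.

```python
# Hypothetical class poll results (option -> number of votes)
votes = {"pizza": 12, "salad": 5, "pasta": 8}

# Print a text bar chart, largest count first
for option, n in sorted(votes.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{option:<6} {'#' * n} ({n})")
```

Seeing the bars side by side makes it easy to ask whether a lopsided result reflects genuine preference or a biased question.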

Prepare & details

Analyze how human biases can be reflected in technology.

Setup: Groups at tables with case materials

Materials: Case study packet (3-5 pages), Analysis framework worksheet, Presentation template

Analyze · Evaluate · Create · Decision-Making · Self-Management

Teaching This Topic

Teach this topic by balancing direct instruction with inquiry. Start with a relatable scenario like search results, then guide students through structured critiques. Avoid overloading with jargon; focus on observable biases in familiar tools. Research shows that when students analyze real datasets and revise their own designs, they grasp bias as a design flaw rather than a technical error.

What to Expect

Students will demonstrate understanding by identifying bias sources, proposing fair solutions, and explaining why neutral design requires intentional effort. Evidence of learning includes debate points, audit notes, and redesign proposals.

These activities are a starting point. A full mission is the experience.

  • Complete facilitation script with teacher dialogue
  • Printable student materials, ready for class
  • Differentiation strategies for every learner

Watch Out for These Misconceptions

Common Misconception: During the Role-Play: Biased Hiring Algorithm activity, watch for students assuming the hiring algorithm is fair because it uses data.

What to Teach Instead

After the role-play, pause to compare the role players' results with the actual data inputs they used, highlighting how human choices shaped both the data and the outcome.

Common Misconception: During the Data Audit: Spot the Bias activity, watch for students assuming that larger datasets are always unbiased.

What to Teach Instead

During the audit, ask students to note dataset size alongside missing groups or time periods, then revisit the claim by comparing a large but biased dataset with a smaller, balanced one.

Common Misconception: During the Critique Challenge: Facial Recognition Test activity, watch for students focusing only on technical accuracy.

What to Teach Instead

After the challenge, have students examine the test images for demographic gaps and discuss how those gaps could skew results, using the activity sheet to record their observations.

Assessment Ideas

Exit Ticket

After the Role-Play: Biased Hiring Algorithm activity, ask students to write one sentence explaining how the data they used might have excluded certain groups and one idea for a fairer question to ask in the next round.

Discussion Prompt

During the Data Audit: Spot the Bias activity, prompt students to share one bias they found in the dataset and ask the class to propose a fix before moving to the next task.

Quick Check

After the Critique Challenge: Facial Recognition Test activity, show students two result sets for the same query and ask them to circle the fairer set, then write a sentence explaining their choice using terms from the activity.

Extensions & Scaffolding

  • Challenge: Ask students who finish early to design a survey that intentionally avoids bias and then test it with a small group.
  • Scaffolding: For students struggling with the Data Audit, provide a checklist of common bias types (e.g., missing groups, skewed time periods) to guide their review.
  • Deeper: Invite students to research a real-world algorithm with bias, prepare a short presentation, and suggest data fixes.

Key Vocabulary

Algorithm: A set of rules or instructions followed by a computer to solve a problem or complete a task. Algorithms are used in many technologies, from search engines to video games.
Bias: A prejudice or inclination for or against a person, group, or thing, which can unfairly influence the outcome of a decision or process. In technology, bias can come from the data used or how the algorithm is designed.
Fairness in Data: Ensuring that the data used to train algorithms represents a wide range of people and situations, avoiding over-representation or under-representation of any group.
Equitable Outcomes: Results that are just and impartial, meaning that technology or systems do not unfairly disadvantage or favor any particular group of people.

Ready to teach Bias in Algorithms and Data?

Generate a full mission with everything you need
