
Bias in Data and Algorithms: Activities & Teaching Strategies

Active learning works because bias in data and algorithms is often invisible until students examine it directly. Students need to see, touch, and question the data and decisions behind biased systems to grasp how human choices shape technology outcomes.

Year 8 · Technologies · 4 activities · 30–50 min

Learning Objectives

  1. Analyze case studies to identify specific instances of algorithmic bias and their discriminatory effects.
  2. Explain how human assumptions and incomplete data can embed bias into AI systems.
  3. Design a simple data collection plan that incorporates strategies to mitigate potential bias.
  4. Evaluate the ethical implications of using biased algorithms in real-world applications.
  5. Compare different methods for detecting and addressing bias in datasets.

Want a complete lesson plan with these objectives? Generate a Mission

45 min·Small Groups

Case Study Analysis: Real-World Bias

Provide articles on biased algorithms, such as the COMPAS recidivism tool. In small groups, students identify bias sources, map consequences, and propose fixes. Groups then present their findings to the class for feedback.

Prepare & details

Critique examples of biased algorithms and their real-world consequences.

Facilitation Tip: During Case Study Analysis, ask each group to present one bias they found and one way it could affect people, so each group is accountable for its findings.

Setup: Groups at tables with case materials

Materials: Case study packet (3-5 pages), Analysis framework worksheet, Presentation template

Analyze · Evaluate · Create · Decision-Making · Self-Management
30 min·Pairs

Data Audit Simulation: Spot the Bias

Give pairs mock job-applicant datasets with skewed gender or ethnicity data. Pairs analyze the data for imbalances, calculate representation percentages, and suggest balanced alternatives, then share their audits with the class.
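If your class has some Python experience, the representation-percentage calculation can be demonstrated in a few lines. The applicant records and field names below are invented for illustration, not taken from the activity's packet:

```python
from collections import Counter

# Hypothetical mock applicant dataset -- names and fields are invented,
# not from the actual activity materials.
applicants = [
    {"name": "A", "gender": "female"},
    {"name": "B", "gender": "male"},
    {"name": "C", "gender": "male"},
    {"name": "D", "gender": "male"},
    {"name": "E", "gender": "male"},
]

def representation(records, field):
    """Return each group's share of the dataset as a percentage."""
    counts = Counter(r[field] for r in records)
    return {group: round(100 * n / len(records), 1)
            for group, n in counts.items()}

print(representation(applicants, "gender"))  # 80% male vs 20% female
```

Students can run the same check on any categorical column and compare the result against what a balanced pool would look like.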

Prepare & details

Explain how unconscious human biases can be embedded into data and AI systems.

Facilitation Tip: In Data Audit Simulation, circulate with a checklist to ensure students justify their bias labels with evidence from the dataset, not just hunches.

Setup: Groups at tables with case materials

Materials: Case study packet (3-5 pages), Analysis framework worksheet, Presentation template

Analyze · Evaluate · Create · Decision-Making · Self-Management
50 min·Small Groups

Algorithm Flowchart Redesign: Fair Choices

Students receive a biased hiring flowchart. In small groups, they revise it with bias checks at each step, test with sample data, and compare original versus new outcomes.
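For a class comfortable with code, the flowchart comparison can also be run as a small script. The hiring rule, the bias it encodes, and the sample applicants below are all invented for illustration; the flowchart actually handed to students may use different criteria:

```python
# Sketch of a biased hiring rule and a revised version with a bias check.
# The rules and sample data are illustrative assumptions, not the
# activity's real flowchart.

def biased_rule(applicant):
    # Original flowchart: rejects anyone with a career gap, which
    # disproportionately screens out carers and parents.
    return applicant["years_experience"] >= 3 and not applicant["career_gap"]

def revised_rule(applicant):
    # Bias check applied: a career gap alone no longer disqualifies;
    # only the skill-relevant criterion remains.
    return applicant["years_experience"] >= 3

sample = [
    {"name": "P1", "years_experience": 5, "career_gap": True},
    {"name": "P2", "years_experience": 4, "career_gap": False},
    {"name": "P3", "years_experience": 2, "career_gap": False},
]

for rule in (biased_rule, revised_rule):
    hired = [a["name"] for a in sample if rule(a)]
    print(rule.__name__, "->", hired)
```

Running both rules over the same sample makes the original-versus-revised comparison concrete: the revised rule admits an applicant the original excluded for the career gap alone.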

Prepare & details

Design strategies to mitigate bias in data collection and algorithmic development.

Facilitation Tip: For Algorithm Flowchart Redesign, provide a blank flowchart template and colored markers so students visibly trace and revise decision paths together.

Setup: Groups at tables with case materials

Materials: Case study packet (3-5 pages), Analysis framework worksheet, Presentation template

Analyze · Evaluate · Create · Decision-Making · Self-Management
40 min·Whole Class

Bias Debate: Pro vs Con Mitigation

Divide the class into teams to debate the costs and benefits of bias audits in AI. Each side researches one example and presents its arguments, then the class votes on the strongest case.

Prepare & details

Critique examples of biased algorithms and their real-world consequences.

Facilitation Tip: During the Bias Debate, assign roles explicitly and set a strict three-minute speaking limit per side to keep the discussion focused and equitable.

Setup: Groups at tables with case materials

Materials: Case study packet (3-5 pages), Analysis framework worksheet, Presentation template

Analyze · Evaluate · Create · Decision-Making · Self-Management

Teaching This Topic

Teachers should start with students’ lived experiences by asking where they’ve seen bias in technology, then connect those observations to concrete examples. Avoid abstract lectures; instead, use hands-on activities to build evidence-based skepticism. Research shows that students retain ethical reasoning better when they apply it to real cases, so prioritize analysis over theory.

What to Expect

Successful learning looks like students identifying specific biases, explaining how they entered the system, and proposing clear, actionable fixes. They should move from noticing unfair results to understanding causes and taking steps to reduce harm.

These activities are a starting point; a full mission provides the complete experience.

  • Complete facilitation script with teacher dialogue
  • Printable student materials, ready for class
  • Differentiation strategies for every learner

Watch Out for These Misconceptions

Common Misconception: During Case Study Analysis, watch for students who dismiss bias as a technical error rather than a human-made flaw. Redirect them by asking, 'Who made the choices about which data to include? What assumptions did they make?'

What to Teach Instead

During Algorithm Flowchart Redesign, have students annotate each decision point with the developer’s likely assumptions, making the connection between design choices and bias explicit.

Common Misconception: During Data Audit Simulation, watch for students who blame the data itself for bias instead of the collection process. Redirect by asking, 'What was missing from this dataset, and why might that reflect human priorities?'

What to Teach Instead

During Case Study Analysis, ask students to compare two datasets for the same task and identify which one excludes certain groups, showing how data gaps create bias.

Common Misconception: During Algorithm Flowchart Redesign, watch for students who think bias is permanent once coded. Redirect by asking, 'What would happen if we changed this input or added another step?'

What to Teach Instead

During Data Audit Simulation, have students test a small change in the dataset and observe how outputs shift, proving that fixes are possible.
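The "bias is fixable" point can also be shown in code if your students have Python. This toy majority-label "model" and its labels are invented for illustration; it is not a real training procedure, only a minimal sketch of how rebalancing the data shifts the output:

```python
from collections import Counter

def majority_model(training_labels):
    """Toy 'model' that always predicts the most common training label."""
    return Counter(training_labels).most_common(1)[0][0]

# Hypothetical skewed collection: rejections were over-sampled.
skewed = ["reject"] * 8 + ["hire"] * 2
print(majority_model(skewed))        # the skew dominates the prediction

# Add the under-represented group's records and the output shifts.
rebalanced = skewed + ["hire"] * 7
print(majority_model(rebalanced))
```

One small change to the dataset flips the prediction, which is exactly the observation the activity asks students to make.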

Assessment Ideas

Discussion Prompt

After Case Study Analysis, ask groups to present their findings and respond to peers’ questions about how bias spreads from data to outcomes, assessing their ability to trace causes and effects.

Quick Check

During Data Audit Simulation, collect students’ marked-up datasets and one-sentence explanations of each identified bias, checking for evidence-based reasoning in their work.

Exit Ticket

After the Bias Debate, ask students to write one sentence describing a bias they changed their mind about and one sentence explaining why, capturing shifts in their understanding.

Extensions & Scaffolding

  • Challenge: Ask students to find a biased AI example in the news and present a 2-minute critique using the Data Audit Simulation’s checklist.
  • Scaffolding: Provide sentence starters like, "This bias likely entered when..." to support students who struggle to articulate their findings.
  • Deeper exploration: Invite a local tech professional or ethicist for a Q&A about how biases are addressed in industry practice.

Key Vocabulary

Algorithmic Bias: Systematic and repeatable errors in a computer system that create unfair outcomes, such as privileging one arbitrary group of users over others.
Dataset: A collection of data, often used to train or test algorithms. Biases can be present in the data itself or in how it is collected.
Discrimination: The unjust or prejudicial treatment of different categories of people or things, especially on the grounds of race, age, or sex, which can be amplified by biased algorithms.
Fairness in AI: The principle that artificial intelligence systems should not create or perpetuate unfair outcomes or discrimination against individuals or groups.
Mitigation Strategy: A plan or action taken to reduce the negative impact or severity of a problem, such as bias in algorithms.

Ready to teach Bias in Data and Algorithms?

Generate a full mission with everything you need
