Technologies · Year 8

Active learning ideas

Bias in Data and Algorithms

Active learning works because bias in data and algorithms is often invisible until students examine it directly. Students need to see, touch, and question the data and decisions behind biased systems to grasp how human choices shape technology outcomes.

ACARA Content Descriptions: AC9TDI8K04
30–50 min · Pairs → Whole Class · 4 activities

Activity 01

Case Study Analysis · 45 min · Small Groups

Case Study Analysis: Real-World Bias

Provide articles on biased algorithms, such as the COMPAS recidivism tool. In small groups, students identify sources of bias, map their consequences, and propose fixes, then present their findings to the class for feedback.

Critique examples of biased algorithms and their real-world consequences.

Facilitation Tip: During Case Study Analysis, ask each group to present one bias they found and one way it could affect people, so every group is accountable for its findings.

What to look for: Present students with a hypothetical scenario: 'An AI is designed to recommend news articles. It consistently shows more articles about crime in certain neighbourhoods than others.' Ask: 'What types of bias might be at play here? How could this lead to unfair outcomes for residents of those neighbourhoods?'

Analyze · Evaluate · Create · Decision-Making · Self-Management

Activity 02

Data Audit Simulation · 30 min · Pairs

Data Audit Simulation: Spot the Bias

Give pairs mock datasets of job applicants with skewed gender or ethnic representation. Pairs analyse the data for imbalances, calculate representation percentages, and suggest balanced alternatives, then share their audits with the whole class.

Explain how unconscious human biases can be embedded into data and AI systems.

Facilitation Tip: In the Data Audit Simulation, circulate with a checklist to ensure students justify their bias labels with evidence from the dataset, not just hunches.

What to look for: Provide students with a short description of a dataset (e.g., 'A dataset of past job applications for software engineers, collected over 20 years, with 90% of successful applicants being male'). Ask them to write one sentence identifying a potential bias and one sentence explaining why it is a problem.
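For the audit step, the representation check can be sketched in a few lines of Python. This is illustrative only: the list below is an invented mock dataset that mirrors the 90%-male example, not real application data.

```python
from collections import Counter

# Invented mock data: genders of the 10 successful applicants in the dataset
successful = ["male"] * 9 + ["female"] * 1

counts = Counter(successful)
total = len(successful)

# Report representation percentages, as pairs would in their audit
for gender, count in counts.items():
    print(f"{gender}: {count}/{total} ({100 * count / total:.0f}%)")
```

Pairs could then compare these percentages with the wider applicant pool when proposing a balanced alternative.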

Analyze · Evaluate · Create · Decision-Making · Self-Management

Activity 03

Algorithm Flowchart Redesign · 50 min · Small Groups

Algorithm Flowchart Redesign: Fair Choices

Students receive a biased hiring flowchart. In small groups, they revise it by adding bias checks at each step, test it with sample data, and compare the original and revised outcomes.
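The test-with-sample-data step can be made concrete by expressing each flowchart as a simple rule and comparing who gets through. Both rules and the sample applicants below are invented for illustration; groups would substitute the flowchart they are actually given.

```python
# Invented sample applicants for testing the two flowcharts
sample = [
    {"name": "P1", "gender": "female", "score": 85},
    {"name": "P2", "gender": "male", "score": 70},
    {"name": "P3", "gender": "female", "score": 90},
    {"name": "P4", "gender": "male", "score": 88},
]

def original_rule(applicant):
    # Biased flowchart: uses gender as a shortcut alongside the score
    return applicant["score"] >= 80 and applicant["gender"] == "male"

def revised_rule(applicant):
    # Bias check applied: the decision depends only on the relevant attribute
    return applicant["score"] >= 80

original = [a["name"] for a in sample if original_rule(a)]
revised = [a["name"] for a in sample if revised_rule(a)]
print("Original flowchart accepts:", original)
print("Revised flowchart accepts:", revised)
```

Comparing the two accepted lists shows directly how removing an irrelevant check changes outcomes for the same candidates.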

Design strategies to mitigate bias in data collection and algorithmic development.

Facilitation Tip: For Algorithm Flowchart Redesign, provide a blank flowchart template and coloured markers so students can visibly trace and revise decision paths together.

What to look for: Ask students to list one strategy they could use to make data collection more inclusive and one question they would ask a developer about an AI system to check for bias.

Analyze · Evaluate · Create · Decision-Making · Self-Management

Activity 04

Bias Debate · 40 min · Whole Class

Bias Debate: Pro vs Con Mitigation

Divide the class into teams to debate the costs and benefits of bias audits in AI. Each side researches one example and presents its arguments, then the class votes on the strongest case.

Critique examples of biased algorithms and their real-world consequences.

Facilitation Tip: During the Bias Debate, assign roles explicitly and set a strict three-minute speaking limit per side to keep the discussion focused and equitable.

What to look for: Present students with a hypothetical scenario: 'An AI is designed to recommend news articles. It consistently shows more articles about crime in certain neighbourhoods than others.' Ask: 'What types of bias might be at play here? How could this lead to unfair outcomes for residents of those neighbourhoods?'

Analyze · Evaluate · Create · Decision-Making · Self-Management

A few notes on teaching this unit

Teachers should start with students’ lived experiences by asking where they’ve seen bias in technology, then connect those observations to concrete examples. Avoid abstract lectures; instead, use hands-on activities to build evidence-based scepticism. Students retain ethical reasoning better when they apply it to real cases, so prioritise analysis over theory.

Successful learning looks like students identifying specific biases, explaining how they entered the system, and proposing clear, actionable fixes. They should move from noticing unfair results to understanding causes and taking steps to reduce harm.


Watch Out for These Misconceptions

  • During Case Study Analysis, watch for students who dismiss bias as a technical error rather than a human-made flaw. Redirect them by asking, 'Who made the choices about which data to include? What assumptions did they make?'

    During Algorithm Flowchart Redesign, have students annotate each decision point with the developer’s likely assumptions, making the connection between design choices and bias explicit.

  • During Data Audit Simulation, watch for students who blame the data itself for bias instead of the collection process. Redirect by asking, 'What was missing from this dataset, and why might that reflect human priorities?'

    During Case Study Analysis, ask students to compare two datasets for the same task and identify which one excludes certain groups, showing how data gaps create bias.

  • During Algorithm Flowchart Redesign, watch for students who think bias is permanent once coded. Redirect by asking, 'What would happen if we changed this input or added another step?'

    During Data Audit Simulation, have students test a small change in the dataset and observe how outputs shift, proving that fixes are possible.


Methods used in this brief