Technologies · Year 5

Active learning ideas

Bias in Algorithms and Data

Active learning helps students confront bias in algorithms by making abstract concepts tangible. When students role-play flawed hiring systems or audit biased datasets, they see firsthand how data choices shape outcomes.

ACARA Content Descriptions: AC9TDI6K01
30–45 min · Pairs → Whole Class · 4 activities

Activity 01

Case Study Analysis · 45 min · Small Groups

Role-Play: Biased Hiring Algorithm

Divide class into teams representing job applicants with varied backgrounds. One team codes a simple 'algorithm' using if-then rules on paper that favors certain traits. Groups test it, record unfair outcomes, and redesign for fairness. Discuss results as a class.
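For teachers who want to show what the paper rules would look like as real code, here is a minimal sketch of such an if-then 'algorithm'. The traits, suburbs, and point values are hypothetical examples chosen to make the bias easy to spot, not part of the activity itself:

```python
# A toy 'hiring algorithm' built from if-then rules, like the paper
# version students write. The favoured traits below are hypothetical
# and deliberately unrelated to job performance.
def score_applicant(applicant):
    score = 0
    if applicant.get("played_team_sport"):      # favours one background
        score += 2
    if applicant.get("suburb") == "Northside":  # postcode bias
        score += 3
    if applicant.get("years_experience", 0) >= 2:
        score += 1
    return score

applicants = [
    {"name": "Ari", "played_team_sport": True,
     "suburb": "Northside", "years_experience": 1},
    {"name": "Bo", "played_team_sport": False,
     "suburb": "Westside", "years_experience": 4},
]

# Bo has more relevant experience but ranks lower, because the
# rules reward traits that have nothing to do with the job.
for a in sorted(applicants, key=score_applicant, reverse=True):
    print(a["name"], score_applicant(a))
```

Students can test the redesign step the same way: delete or reweight a rule and rerun to see whether the ranking becomes fairer.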

Analyze how human biases can be reflected in technology.

Facilitation Tip: During the Role-Play: Biased Hiring Algorithm activity, assign clear roles and restrict discussion to 10 minutes so students experience decision pressure without losing focus.

What to look for: Provide students with a scenario: 'A school wants to use an app to recommend extracurricular activities. What are two things they should consider about the app's data to make sure it's fair for all students?'

Analyze · Evaluate · Create · Decision-Making · Self-Management

Activity 02

Case Study Analysis · 30 min · Pairs

Data Audit: Spot the Bias

Provide printed datasets on toy preferences by gender. In pairs, students tally imbalances, hypothesize causes, and suggest diverse data additions. Groups share audits on a class chart to visualize patterns.
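The hand tally students do on paper can also be demonstrated digitally. A short sketch, using a hypothetical stand-in for the printed 'toy preferences by gender' handout:

```python
# Tallying group imbalances in a toy dataset, as students do by hand
# in the audit. The responses below are hypothetical example data.
from collections import Counter

responses = [
    ("girl", "dolls"), ("girl", "dolls"), ("girl", "blocks"),
    ("boy", "trucks"), ("boy", "trucks"), ("boy", "trucks"),
    ("boy", "blocks"), ("boy", "dolls"),
]

# Who was surveyed? An imbalance here skews every conclusion drawn:
# in this sample, more boys than girls were asked.
group_counts = Counter(group for group, _ in responses)
print(group_counts)

# Which preferences dominate within each group?
pref_counts = Counter(responses)
print(pref_counts)
```

Pairs can then 'fix' the dataset by adding rows for the under-represented group and re-running the tally, mirroring the 'suggest diverse data additions' step.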

Explain the concept of fairness in data collection and algorithm design.

Facilitation Tip: For the Data Audit: Spot the Bias activity, provide a dataset with obvious imbalances so students can practice spotting patterns before tackling subtler cases.

What to look for: Pose the question: 'Imagine you are designing a game that suggests challenges. How could you ensure the suggestions are fair and interesting for players with different skill levels?' Facilitate a brief class discussion, prompting students to share ideas about data and design.

Analyze · Evaluate · Create · Decision-Making · Self-Management

Activity 03

Case Study Analysis · 35 min · Small Groups

Critique Challenge: Facial Recognition Test

Show short videos of biased facial recognition demos. Individually note failures, then in small groups propose fixes like better training data. Present one improvement per group to the class.

Critique examples of biased technology and propose improvements.

Facilitation Tip: In the Critique Challenge: Facial Recognition Test activity, remind students to record both technical limits and social impacts, not just accuracy numbers.

What to look for: Show students two sets of search results for the same query, one clearly biased and one more neutral. Ask them to write down one sentence explaining why one set might be considered more 'fair' than the other, referencing the data or algorithm.

Analyze · Evaluate · Create · Decision-Making · Self-Management

Activity 04

Case Study Analysis · 40 min · Whole Class

Fair Survey Design: Whole Class Poll

As a class, brainstorm survey questions on school lunch preferences. Vote on potentially biased ones, revise for inclusivity, then collect and analyze data. Graph results to check for fairness.
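If the class collects responses digitally, the 'graph results to check for fairness' step can be sketched in a few lines. The vote data below is a hypothetical example:

```python
# Tallying a class lunch-preference poll and checking whether each
# group got roughly equal say. The votes below are hypothetical.
from collections import Counter

# (class_group, preferred_lunch) pairs
votes = [
    ("5A", "pasta"), ("5A", "pasta"), ("5A", "sushi"),
    ("5B", "tacos"), ("5B", "pasta"),
]

totals = Counter(lunch for _, lunch in votes)
by_class = Counter(cls for cls, _ in votes)

# A simple text 'graph' of the results
for lunch, n in totals.most_common():
    print(f"{lunch:6} {'#' * n}")

# Fairness check: did both classes contribute a similar
# number of responses, or does one dominate the result?
print(by_class)
```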

Analyze how human biases can be reflected in technology.

What to look for: Provide students with a scenario: 'A school wants to use an app to recommend extracurricular activities. What are two things they should consider about the app's data to make sure it's fair for all students?'

Analyze · Evaluate · Create · Decision-Making · Self-Management

A few notes on teaching this unit

Teach this topic by balancing direct instruction with inquiry. Start with a relatable scenario like search results, then guide students through structured critiques. Avoid overloading with jargon; focus on observable biases in familiar tools. Research shows that when students analyze real datasets and revise their own designs, they grasp bias as a design flaw rather than a technical error.

Students will demonstrate understanding by identifying bias sources, proposing fair solutions, and explaining why neutral design requires intentional effort. Evidence of learning includes debate points, audit notes, and redesign proposals.


Watch Out for These Misconceptions

  • During the Role-Play: Biased Hiring Algorithm activity, watch for students assuming the hiring algorithm is fair because it uses data.

    After the role-play, pause to compare the role players' results with the actual data inputs they used, highlighting how human choices shaped both the data and the outcome.

  • During the Data Audit: Spot the Bias activity, watch for students assuming that larger datasets are always unbiased.

    During the audit, ask students to note dataset size alongside missing groups or time periods, then revisit the claim by comparing a large but biased dataset with a smaller, balanced one.

  • During the Critique Challenge: Facial Recognition Test activity, watch for students focusing only on technical accuracy.

    After the challenge, have students examine the test images for demographic gaps and discuss how those gaps could skew results, using the activity sheet to record their observations.


Methods used in this brief