Computing · Year 6

Active learning ideas

Bias and Fairness in AI

Active learning works for Bias and Fairness in AI because students need to experience bias firsthand to understand it. When children sort datasets or simulate AI rules, they see how human choices shape technology, making abstract ideas concrete and memorable.

National Curriculum Attainment Targets: KS2 Computing – Digital Literacy · KS2 Computing – Online Safety
25–45 min · Pairs → Whole Class · 4 activities

Activity 01

Formal Debate · 35 min · Small Groups

Group Sort: Spotting Dataset Bias

Give small groups printed cards with images or profiles representing a hiring dataset. Students sort into 'hire' or 'not hire' piles, then discuss imbalances like gender or ethnicity skews. Groups redesign the dataset for fairness and predict improved AI outcomes.

Analyze how a computer program can inherit biases from its creators or training data.

Facilitation Tip: During the Group Sort, circulate with a checklist to note which students quickly spot underrepresented groups or labels, and give early encouragement to those who hesitate.

What to look for: Present students with a scenario: 'An AI is designed to recommend books. It was trained only on books written by male authors. What kind of bias might this AI show? How could we make it fairer?' Facilitate a class discussion, guiding them to identify the source of bias and brainstorm solutions.

Analyze · Evaluate · Create · Self-Management · Decision-Making

Activity 02

Formal Debate · 40 min · Pairs

Pairs Simulation: Rule-Based AI

Pairs create a simple paper-based 'AI' using rules from a biased dataset, like favoring sports hobbies for job candidates. They test it on diverse profiles, note unfair results, and revise rules with balanced criteria. Share findings in a class gallery walk.
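For teachers who want a digital extension, the paper 'AI' can be sketched in a few lines of Python. The profiles, rules, and scoring field below are invented for illustration; swap in whatever criteria your class writes on paper.

```python
# A tiny rule-based 'AI', mirroring the paper activity.
# The biased rule favours sports hobbies; the revised rule
# uses a job-relevant criterion instead.

def biased_hire(profile):
    # Rule inherited from a skewed dataset: sporty candidates get hired.
    return "sports" in profile["hobbies"]

def fairer_hire(profile):
    # Revised rule: judge on a skill the job actually needs.
    return profile["teamwork_score"] >= 3

profiles = [
    {"name": "Amy",  "hobbies": ["chess"],  "teamwork_score": 5},
    {"name": "Ben",  "hobbies": ["sports"], "teamwork_score": 2},
    {"name": "Cara", "hobbies": ["music"],  "teamwork_score": 4},
]

for p in profiles:
    print(p["name"], "biased:", biased_hire(p), "fairer:", fairer_hire(p))
```

Running both rules over the same diverse profiles makes the unfair results visible at a glance, just as testing the paper rules does.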

Evaluate the importance of using 'fair' datasets when training AI models.

Facilitation Tip: In the Pairs Simulation, assign one student to record the rules and the other to test them, ensuring both roles engage actively with the process.

What to look for: Ask students to write down one example of AI bias they learned about and explain in one sentence why fairness is important when creating AI. Collect these to gauge understanding of key concepts.

Analyze · Evaluate · Create · Self-Management · Decision-Making

Activity 03

Formal Debate · 45 min · Whole Class

Whole Class Debate: Fair Data Matters

Divide the class into teams to debate the use of biased versus fair datasets in an AI hiring tool. Provide evidence cards; teams prepare 2-minute arguments. Vote, then reflect on why fairness affects society.

Justify why it is crucial to consider fairness when designing AI systems.

Facilitation Tip: For the Whole Class Debate, assign roles like 'AI developer' or 'advocate for fairness' to keep all students accountable for reasoned arguments.

What to look for: Show students two sets of images: one set with balanced representation of different people and one set with clear underrepresentation of a group. Ask: 'Which set of images would be better for training an AI to recognize people? Why?'

Analyze · Evaluate · Create · Self-Management · Decision-Making

Activity 04

Formal Debate · 25 min · Individual

Individual Test: Online AI Audit

Students access safe, free AI tools like image labelers. They input diverse photos from school resources and log accuracy differences. Compile results on a shared chart to identify patterns.
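The shared class chart can also be tallied with a short script. The log entries below are hypothetical, invented to show the shape of the tally; in class, each pair would record one `(group, correct?)` entry per photo tested.

```python
# Tally how often an image labeler was correct, grouped so the
# class can spot accuracy gaps between groups of photos.
from collections import defaultdict

# (group, was_the_label_correct?) pairs logged by students
logs = [
    ("group A", True), ("group A", True), ("group A", False),
    ("group B", True), ("group B", False), ("group B", False),
]

correct = defaultdict(int)
total = defaultdict(int)
for group, ok in logs:
    total[group] += 1
    correct[group] += ok  # True counts as 1, False as 0

for group in sorted(total):
    rate = correct[group] / total[group]
    print(f"{group}: {rate:.0%} correct")
```

A large gap between the groups' accuracy rates is the pattern students are looking for: it suggests the tool was trained on data that represented one group better than another.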

Analyze how a computer program can inherit biases from its creators or training data.

Facilitation Tip: In the Individual Test, provide headphones so students can focus on the AI audit without distractions from peers.

What to look for: Present students with a scenario: 'An AI is designed to recommend books. It was trained only on books written by male authors. What kind of bias might this AI show? How could we make it fairer?' Facilitate a class discussion, guiding them to identify the source of bias and brainstorm solutions.

Analyze · Evaluate · Create · Self-Management · Decision-Making

A few notes on teaching this unit

Teachers should approach this topic by grounding discussions in students’ lived experiences of technology, using relatable examples like school apps or games. Avoid abstract definitions; instead, let students discover bias through hands-on tasks. Research shows that when children create and test their own ‘rules’ for AI, they better grasp how bias spreads and how to address it.

Successful learning looks like students confidently identifying bias sources in datasets and justifying inclusive design choices in discussions. They should articulate why fairness matters and propose specific fixes, using clear examples from their activities.


Watch Out for These Misconceptions

  • During Group Sort, watch for students who say, 'AI is always fair because computers don’t have opinions.' Redirect them by asking, 'Look at the labels on these dataset cards. Who chose these? Could their choices reflect human opinions? How?'

    During Group Sort, students will notice that datasets often lack labels for certain groups. Ask them to explain why those groups might be missing and who decided what to include or exclude in the data they are sorting.

  • During Pairs Simulation, watch for students who claim, 'Bias only affects advanced AI used by big companies.' Pause the activity and ask, 'What rules did you just write for your simple AI? Could those rules favor one group without you realizing it?'

    During Pairs Simulation, remind students that the rules they create mimic real-world biases. Have them swap rules with another pair and test if the new rules produce different outcomes, highlighting how bias can start small.

  • During the Whole Class Debate, watch for students who insist, 'Once trained, AI bias cannot be changed.' Counter with, 'If your AI keeps favoring one name, what could you do to fix it? Could you change its rules, or give it better data?'

    During Whole Class Debate, use the example from Group Sort to show how retraining with balanced data reduces bias. Ask students to describe the steps they would take to ‘retrain’ their simulated AI to be fairer.


Methods used in this brief