Computing · Secondary 3

Active learning ideas

Bias in AI and Algorithmic Fairness

Active learning works for this topic because bias in AI is not a theoretical issue. It shows up in real datasets and tools students use daily. When students examine concrete examples and redesign scenarios, they see how data choices affect people’s lives in ways that passive lectures cannot match.

MOE Syllabus Outcomes: Ethics and Social Issues - S3
25–50 min · Pairs → Whole Class · 4 activities

Activity 01

Think-Pair-Share · 45 min · Small Groups

Small Groups: Real-World Case Audit

Provide groups with a case study on biased AI, such as facial recognition failures. Students identify bias sources in the data, evaluate impacts, and propose three fairness fixes. Groups share their audits in a class gallery walk.

Analyze how biases in training data can lead to discriminatory AI outcomes.

Facilitation Tip: During the Real-World Case Audit, assign each small group a different dataset to analyze so the class covers multiple types of bias in one lesson.

What to look for: Present students with a scenario: An AI system is developed to help Singaporean banks approve loan applications. Ask them: 'What kinds of biases might be present in the training data? How could these biases lead to unfair loan rejections for certain communities in Singapore? What steps should the bank take to ensure fairness?'
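For teachers who want to make the loan scenario concrete, the pattern a model learns from skewed history can be shown in a few lines. This is an illustrative sketch with invented toy data (the groups, records, and numbers are all hypothetical, not from any real bank):

```python
from collections import defaultdict

# Toy historical loan records, invented for classroom discussion.
# Each record: the applicant's community group, and whether the
# human loan officer approved the application.
records = [
    ("A", True), ("A", True), ("A", True), ("A", False),
    ("B", True), ("B", False), ("B", False), ("B", False),
]

def approval_rates(records):
    """Approval rate per group. A model trained on this history
    tends to reproduce these gaps as its own decisions."""
    totals, approvals = defaultdict(int), defaultdict(int)
    for group, approved in records:
        totals[group] += 1
        approvals[group] += approved
    return {g: approvals[g] / totals[g] for g in totals}

print(approval_rates(records))  # {'A': 0.75, 'B': 0.25}
```

Students can change one or two records and rerun the calculation to see how quickly the gap moves, which grounds the question of where the original gap came from.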

Understand · Apply · Analyze · Self-Awareness · Relationship Skills

Activity 02

Think-Pair-Share · 35 min · Pairs

Pairs: Bias Scenario Redesign

Pairs design a hypothetical AI system with embedded bias leading to social injustice, then redesign it for fairness using strategies like balanced datasets. They sketch system flows and present changes.
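The "balanced datasets" strategy students will reach for can be sketched for teacher reference. This is a hypothetical oversampling helper (the function name and data shape are assumptions, and duplicating records is only one of several rebalancing approaches), not a production method:

```python
import random

def oversample_to_balance(records, seed=0):
    """Naive rebalancing sketch: duplicate records from
    under-represented groups until every group has as many
    records as the largest one. Each record is (group, label)."""
    random.seed(seed)
    by_group = {}
    for rec in records:
        by_group.setdefault(rec[0], []).append(rec)
    target = max(len(recs) for recs in by_group.values())
    balanced = []
    for group, recs in by_group.items():
        balanced.extend(recs)
        # Pad smaller groups by resampling their own records.
        balanced.extend(random.choices(recs, k=target - len(recs)))
    return balanced
```

Note for discussion: duplicating records adds no new information about the smaller group, which is a useful entry point to the trade-off conversation this unit returns to later.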

Design a hypothetical scenario where AI bias could lead to significant social injustice.

Facilitation Tip: For the Bias Scenario Redesign, provide sentence starters to help pairs articulate which features of the scenario reveal bias and how their redesign addresses it.

What to look for: Provide students with a short description of an AI application (e.g., an AI tutor, a content recommendation engine). Ask them to identify one potential source of bias in its training data and one specific strategy they would use to make the AI fairer. Collect these as students leave the class.

Understand · Apply · Analyze · Self-Awareness · Relationship Skills

Activity 03

Think-Pair-Share · 50 min · Whole Class

Whole Class: Fairness Debate

Divide the class into teams to debate mandatory AI audits: one side argues the benefits for equity, the other the potential barriers to innovation. Use structured turns and vote on the strongest points.

Justify the importance of auditing AI systems for fairness and transparency.

Facilitation Tip: In the Fairness Debate, assign roles clearly and give students two minutes to prepare opening statements using facts from their previous activities.

What to look for: Display a list of AI fairness strategies (e.g., data diversification, bias detection tools, human oversight). Ask students to match each strategy to a brief description of how it helps mitigate AI bias. This can be done as a short quiz or a drag-and-drop activity.
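For teachers who want a concrete instance of a "bias detection tool" from that list, one of the simplest checks, the demographic-parity gap, fits in a few lines. A minimal sketch (the function name and data layout are assumptions; a small gap does not prove fairness, but a large one flags something to investigate):

```python
def demographic_parity_gap(outcomes):
    """outcomes maps each group to a list of 0/1 decisions
    (1 = positive outcome, e.g. loan approved). Returns the
    largest difference in positive-decision rates between
    any two groups: the demographic-parity gap."""
    rates = {g: sum(d) / len(d) for g, d in outcomes.items()}
    return max(rates.values()) - min(rates.values())

gap = demographic_parity_gap({"A": [1, 1, 1, 0], "B": [1, 0, 0, 0]})
print(gap)  # 0.5
```

This pairs naturally with the debate: an audit mandate would amount to requiring checks like this one, run regularly and acted upon.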

Understand · Apply · Analyze · Self-Awareness · Relationship Skills

Activity 04

Think-Pair-Share · 25 min · Individual

Individual: Personal Bias Checklist

Students create a checklist for auditing AI fairness, drawing on what they learned in class, then test it on a provided algorithm example and reflect on one improvement.

Analyze how biases in training data can lead to discriminatory AI outcomes.

Facilitation Tip: During the Personal Bias Checklist, model how to reflect on one item from the checklist before asking students to complete the rest independently.

What to look for: Present students with a scenario: An AI system is developed to help Singaporean banks approve loan applications. Ask them: 'What kinds of biases might be present in the training data? How could these biases lead to unfair loan rejections for certain communities in Singapore? What steps should the bank take to ensure fairness?'

Understand · Apply · Analyze · Self-Awareness · Relationship Skills

A few notes on teaching this unit

Experienced teachers approach this topic by grounding abstract concepts in local examples students recognize. They avoid overwhelming students with technical jargon, instead focusing on how data choices impact real people. Research shows that structured discussions after hands-on tasks build deeper understanding than lectures alone.

Successful learning looks like students explaining how biased data leads to unfair outcomes, proposing concrete fairness fixes, and justifying their choices with evidence from the case studies. They should move from noticing problems to suggesting actionable solutions.


Watch Out for These Misconceptions

  • During the Real-World Case Audit, watch for students assuming AI is unbiased because it uses math.

    Use the dataset handouts to guide students to identify human sources of bias in the data, such as underrepresentation or mislabeling, and discuss why these affect algorithmic outcomes.

  • During the Bias Scenario Redesign, watch for students believing better coding can remove all bias.

    Have pairs list trade-offs of their redesigns, such as reduced accuracy for certain groups, to highlight that fairness often requires ongoing adjustments rather than perfect solutions.

  • During the Fairness Debate, watch for students dismissing AI bias as irrelevant to Singapore.

    Use Singapore-based case studies in the debate prompts and ask students to find local examples of biased AI, ensuring connections to their own context.


Methods used in this brief