Activity 01
Small Groups: Real-World Case Audit
Provide each group with a case study on biased AI, such as facial recognition failures. Students identify sources of bias in the training data, evaluate the real-world impacts, and propose three fairness fixes. Groups then share their audits in a class gallery walk.
Analyze how biases in training data can lead to discriminatory AI outcomes.
Facilitation Tip: During the Real-World Case Audit, assign each small group a different dataset to analyze so the class covers multiple types of bias in one lesson.
What to look for: Present students with this scenario: An AI system is developed to help Singaporean banks approve loan applications. Ask them: 'What kinds of biases might be present in the training data? How could these biases lead to unfair loan rejections for certain communities in Singapore? What steps should the bank take to ensure fairness?'
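If the facilitator wants a concrete artifact for the audit discussion, one simple check students can run is comparing approval rates across groups. The sketch below is a minimal, hypothetical example: the `approval_rates` helper and the toy records are invented for illustration, not drawn from any real bank's data. A large gap between groups is one signal of the kind of bias the scenario asks students to find.

```python
from collections import defaultdict

def approval_rates(records):
    """Return the fraction of approved applications per group."""
    counts = defaultdict(lambda: [0, 0])  # group -> [approved, total]
    for r in records:
        counts[r["group"]][1] += 1
        if r["approved"]:
            counts[r["group"]][0] += 1
    return {g: approved / total for g, (approved, total) in counts.items()}

# Toy loan records: group A has 8/10 approvals, group B only 3/10.
# All figures are made up for demonstration purposes.
records = (
    [{"group": "A", "approved": i < 8} for i in range(10)]
    + [{"group": "B", "approved": i < 3} for i in range(10)]
)

rates = approval_rates(records)
gap = rates["A"] - rates["B"]
print(rates)  # {'A': 0.8, 'B': 0.3}
print(gap)    # 0.5 -- a large approval-rate gap worth investigating
```

Students can then debate why such a gap might exist (historical lending patterns, proxy variables like postal codes) and what a fair fix would look like, which maps directly onto the scenario questions above.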