Activity 01
Format Name: Bias in Hiring AI Simulation
Students are given a dataset and a simplified AI model designed to screen job applications. They analyze the dataset for potential biases (e.g., gender, ethnicity) and then run the AI, observing how these biases affect the outcomes. Discussion follows on how to mitigate these issues.
Learning Objective: Analyze how bias in data can lead to unfair decisions by AI.
Facilitation Tip: In Case Study Circles, assign roles (data collector, bias spotter, impact assessor) so that every student contributes, especially shy participants.
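The screening simulation above can be sketched in a few lines of code. This is a minimal illustration, not a real hiring model: the applicant data, the scoring rule, and the pass threshold are all hypothetical, chosen so students can see a bias in the data (a score bonus for one group) produce unequal outcomes.

```python
from collections import Counter

# Toy applicant dataset: (applicant_id, gender, years_experience).
# Values are invented for the exercise.
applicants = [
    ("A", "F", 5), ("B", "M", 5), ("C", "F", 3), ("D", "M", 3),
    ("E", "F", 7), ("F", "M", 7), ("G", "F", 4), ("H", "M", 4),
]

def biased_screen(gender, years):
    # Hypothetical rule "learned" from skewed historical data:
    # it silently adds a bonus for group "M", so equally qualified
    # applicants receive different scores.
    score = years + (2 if gender == "M" else 0)
    return score >= 7  # pass threshold

# Run the screen and compare pass rates by group.
results = [(g, biased_screen(g, y)) for _, g, y in applicants]
passed = Counter(g for g, ok in results if ok)
total = Counter(g for g, _ in results)
for g in sorted(total):
    print(f"{g}: {passed[g]}/{total[g]} pass")
# → F: 1/4 pass
# → M: 2/4 pass
```

In discussion, students can remove the bonus term or adjust the threshold and rerun the simulation to see how the group pass rates change, which leads naturally into the mitigation conversation.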