Activity 01
Case Study Analysis: Real Algorithmic Bias
Each group receives one documented case of algorithmic bias (COMPAS recidivism scores, Amazon's hiring tool, facial-recognition accuracy disparities, predictive policing). Groups identify the source of bias, the affected group, and the real-world harm, then present their case using a shared analysis template before the class maps patterns across all cases.
Analyze how human prejudices can be encoded into software and the resulting social impact.
Facilitation Tip: During Case Study Analysis, assign each group a different real-world case so the class collectively sees multiple entry points for bias.
What to look for: Provide students with a brief description of a hypothetical AI system (e.g., a loan application screener). Ask them to write one sentence identifying a potential source of bias (data or design) and one sentence explaining how it could lead to unfair outcomes.
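If a concrete demonstration helps, the hypothetical loan screener mentioned above can be sketched in a few lines of Python. This is an illustrative assumption, not part of the lesson materials: all data, group names, and functions are invented. The point it shows is that a system which "learns" from historically biased decisions reproduces that bias, even with no prejudiced rule written into the code.

```python
# Hypothetical historical loan decisions: (group, income, approved).
# Group "B" applicants were held to a harsher standard by past human reviewers.
historical = [
    ("A", 40, True), ("A", 30, True), ("A", 25, True),
    ("B", 40, False), ("B", 55, True), ("B", 30, False),
]

def learn_threshold(records, group):
    """'Learn' the lowest income ever approved for this group."""
    approved = [income for g, income, ok in records if g == group and ok]
    return min(approved)

# The screener derives a per-group income threshold from the biased data.
thresholds = {g: learn_threshold(historical, g) for g in ("A", "B")}

# Two identical applicants (income 40) now receive different decisions,
# because the bias in the data became a rule in the software.
applicant_income = 40
decisions = {g: applicant_income >= t for g, t in thresholds.items()}
```

Students can trace how `decisions` approves the group-A applicant and rejects the identical group-B applicant, which maps directly onto the "data as a source of bias" sentence they are asked to write.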