Activity 01
Case Study Analysis: COMPAS and Hiring Algorithms
Assign small groups one of two documented bias cases (COMPAS criminal risk scoring or Amazon's hiring algorithm). Groups read a summary, identify where bias entered the system, and present findings to the class using a structured claim-evidence-reasoning format.
Analyze how human biases can be inadvertently encoded into AI algorithms.
Facilitation Tip: In Case Study Analysis, assign roles so each student traces a different entry point for bias in the COMPAS system.
What to look for: Present students with a case study, such as a biased AI system used in college admissions. Ask: "Identify at least two ways bias could have entered this system. Discuss the potential consequences for applicants from underrepresented groups. What is one specific step an engineer could take to address this bias?"