Activity 01
Case Study Analysis: Real-World AI Bias
Provide articles on cases such as the COMPAS recidivism tool or Amazon's scrapped hiring AI. In small groups, students identify biased data sources, predict likely outcomes, and propose fixes. Groups then present their findings to the class for peer feedback.
Who is responsible when an autonomous system makes an unethical decision?
Facilitation Tip: During Case Study Analysis, assign roles like 'data scientist' or 'ethicist' to push students beyond surface observations.
What to look for: Present students with a hypothetical scenario: An AI system designed to recommend job candidates shows a strong preference for applicants from specific universities. Ask: 'Who is responsible if this system perpetuates inequality? What steps could the developers take to identify and address this bias before deployment?'
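For instructors who want to ground the scenario in something concrete, a minimal sketch of one step the developers could take is below: comparing the model's recommendation rates across universities and flagging groups that fall below 80% of the top group's rate (the "four-fifths rule" heuristic used in employment-discrimination auditing). All candidate data here is invented for illustration.

```python
# Hypothetical audit sketch: compare recommendation rates across
# universities in a hiring model's output (all data invented).
from collections import defaultdict

candidates = [
    # (university, recommended_by_model)
    ("Univ A", True), ("Univ A", True), ("Univ A", True), ("Univ A", False),
    ("Univ B", True), ("Univ B", False), ("Univ B", False), ("Univ B", False),
]

totals = defaultdict(int)
recommended = defaultdict(int)
for school, rec in candidates:
    totals[school] += 1
    if rec:
        recommended[school] += 1

# Per-group selection rate, and the highest rate as the baseline.
rates = {s: recommended[s] / totals[s] for s in totals}
best = max(rates.values())

# Four-fifths rule of thumb: flag any group whose selection rate
# is below 80% of the highest group's rate.
for school, rate in rates.items():
    status = "FLAG" if rate < 0.8 * best else "ok"
    print(f"{school}: rate={rate:.2f} ({status})")
```

Students can extend the sketch by debating what a flag should trigger: retraining on more representative data, removing the university feature, or human review of flagged decisions.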