Activity 01
Group Sort: Spotting Dataset Bias
Give small groups printed cards with images or profiles representing a hiring dataset. Students sort the cards into 'hire' and 'not hire' piles, then discuss imbalances such as gender or ethnicity skews. Groups then redesign the dataset for fairness and predict how AI outcomes would improve.
Analyze how a computer program can inherit biases from its creators or training data.
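For facilitators who want to show the same check done mechanically, the sketch below computes the 'hire' rate per group in a toy dataset. The records and group names are invented for illustration, not taken from the activity cards; a large gap between groups signals the kind of skew students are sorting for.

```python
from collections import Counter

# Hypothetical hiring dataset: each record is (group, label).
# Values are illustrative only.
dataset = [
    ("group_a", "hire"), ("group_a", "hire"), ("group_a", "not hire"),
    ("group_a", "hire"), ("group_b", "not hire"), ("group_b", "not hire"),
]

def selection_rates(records):
    """Return the fraction of 'hire' labels per group: a simple skew check."""
    hires = Counter(group for group, label in records if label == "hire")
    totals = Counter(group for group, _ in records)
    return {group: hires[group] / totals[group] for group in totals}

print(selection_rates(dataset))  # group_a is hired far more often than group_b
```

After students redesign the dataset, the same function can be rerun to see whether the per-group rates have moved closer together.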
Facilitation Tip
During the Group Sort, circulate with a checklist to note which students quickly spot underrepresented groups or labels, and offer early support to those who hesitate.
What to Look For
Present students with a scenario: 'An AI is designed to recommend books. It was trained only on books written by male authors. What kind of bias might this AI show? How could we make it fairer?' Facilitate a class discussion, guiding students to identify the source of the bias and brainstorm solutions.