Computer Science · 9th Grade

Active learning ideas

Identifying Bias in AI Outputs

Active learning works well for bias detection because students need to experience how bias hides in plain sight. When they manually test AI outputs with varied inputs, they see firsthand how statistical gaps and wording choices create unfair results. This hands-on work makes abstract concepts tangible.

Standards alignment: CSTA 3A-IC-25
25–45 min · Pairs → Whole Class · 4 activities

Activity 01

Case Study Analysis · 45 min · Pairs

Bias Audit: Image Captioning Tool

Give students access to a free image captioning or labeling tool (several are available online). Students systematically test it with a set of images they design: varying gender presentation, skin tone, age, and context. They record outputs in a table, identify patterns, and write a two-paragraph audit finding with supporting evidence.

Identify examples of biased outputs from AI systems.

Facilitation Tip: During the Bias Audit, have students compare their findings in small groups before presenting to the class to normalize the discomfort of identifying bias in tools they use daily.

What to look for: Provide students with a hypothetical AI output (e.g., a job recommendation, a news summary). Ask them to write one sentence identifying a potential bias and one sentence suggesting a source for that bias.

Analyze · Evaluate · Create · Decision-Making · Self-Management

Activity 02

Case Study Analysis · 30 min · Small Groups

Error Rate Disaggregation: Simulated Dataset

Provide a pre-built table of simulated AI decisions (loan approvals, image classifications, or content flags) with demographic information included. Groups calculate error rates for each demographic group and compare. Groups then identify which metric (overall accuracy, false positive rate, or false negative rate) reveals the bias most clearly.
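The calculation students perform can be sketched as follows. The two groups below are simulated with illustrative numbers chosen so that overall accuracy is identical while the error types differ, which is exactly the pattern the activity asks groups to uncover.

```python
# Sketch of per-group metrics on a simulated dataset (values are
# illustrative, not drawn from any real AI system).
def rates(decisions):
    """decisions: list of (predicted, actual) booleans for one group."""
    tp = sum(p and a for p, a in decisions)          # true positives
    tn = sum(not p and not a for p, a in decisions)  # true negatives
    fp = sum(p and not a for p, a in decisions)      # false positives
    fn = sum(not p and a for p, a in decisions)      # false negatives
    accuracy = (tp + tn) / len(decisions)
    fpr = fp / (fp + tn) if fp + tn else 0.0  # false positive rate
    fnr = fn / (fn + tp) if fn + tp else 0.0  # false negative rate
    return accuracy, fpr, fnr

# Two groups with the SAME overall accuracy but different error types:
group_a = [(True, True)] * 8 + [(True, False)] * 2   # errors are false positives
group_b = [(True, True)] * 8 + [(False, True)] * 2   # errors are false negatives

print(rates(group_a))  # (0.8, 1.0, 0.0)
print(rates(group_b))  # (0.8, 0.0, 0.2)
```

Both groups score 80% accuracy, yet one group absorbs all the false positives and the other all the false negatives, which is why disaggregating beyond accuracy matters.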

Analyze the potential sources of bias that lead to unfair AI outcomes.

Facilitation Tip: For Error Rate Disaggregation, provide a pre-filled spreadsheet so students focus on analysis rather than data entry, but require them to explain each calculation in their own words.

What to look for: Present students with two sets of AI-generated image descriptions for the same prompt, one set potentially biased. Ask: "Which set of descriptions seems more fair? Why? What specific words or phrases suggest bias?"

Analyze · Evaluate · Create · Decision-Making · Self-Management

Activity 03

Think-Pair-Share · 25 min · Pairs

Think-Pair-Share: What Would Fair Look Like?

Present two definitions of fairness for a loan approval AI: (1) equal approval rates across groups, (2) equal error rates across groups. Students individually argue which definition is more appropriate for this context. Pairs share, then the class discusses whether both definitions can be satisfied simultaneously (when the groups' underlying repayment rates differ, they generally cannot).
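A tiny worked example can make the tension concrete for students. The numbers below are invented for illustration: if two applicant groups have different repayment rates, even a classifier with equal (here, zero) error rates in both groups necessarily produces unequal approval rates.

```python
# Illustrative numbers only: why the two fairness definitions clash
# when groups have different underlying repayment rates.
# Group X: 80 of 100 applicants would repay; Group Y: 40 of 100.
repay_x, total_x = 80, 100
repay_y, total_y = 40, 100

# Suppose the AI is perfectly accurate in BOTH groups (equal error
# rates: zero false positives, zero false negatives everywhere).
# Then it approves exactly the applicants who would repay:
approval_rate_x = repay_x / total_x  # 0.8
approval_rate_y = repay_y / total_y  # 0.4

# Equal error rates hold, but equal approval rates do not:
print(approval_rate_x, approval_rate_y)  # 0.8 0.4
```

Forcing the approval rates to match would require introducing errors in at least one group, so the class discussion has a genuine mathematical trade-off behind it.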

Propose simple strategies to mitigate bias in AI systems.

Facilitation Tip: In the Think-Pair-Share, give students a strict two-minute timer for the "think" phase to prevent overanalysis and keep the conversation moving.

What to look for: Pose the question: "Imagine you are designing an AI to help students choose extracurricular activities. What steps would you take during data collection and model design to prevent bias related to socioeconomic status or access to resources?" Facilitate a brief class discussion.

Understand · Apply · Analyze · Self-Awareness · Relationship Skills

Activity 04

Case Study Analysis · 35 min · Small Groups

Mitigation Strategy Design: Fix One Source

Groups receive a biased AI scenario with a clearly identified bias source (underrepresented group in training data, biased labeling, proxy variable). Each group proposes one concrete mitigation strategy, describes what it would require, and identifies its limitations. Groups evaluate each other's proposals for feasibility and side effects.
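For the "underrepresented group in training data" scenario, one concrete mitigation students might propose is oversampling that group before training. The sketch below uses a hypothetical dataset and group labels; it also makes the strategy's limitation visible, since duplicating examples adds no genuinely new information.

```python
import random

random.seed(0)  # for reproducibility of the sampling

# Hypothetical training set: group_b is badly underrepresented.
training_data = ([("group_a", i) for i in range(90)]
                 + [("group_b", i) for i in range(10)])

def oversample(data, group, target_count):
    """Duplicate examples from `group` (sampling with replacement)
    until that group reaches `target_count` examples."""
    minority = [ex for ex in data if ex[0] == group]
    extra = [random.choice(minority)
             for _ in range(target_count - len(minority))]
    return data + extra

balanced = oversample(training_data, "group_b", 90)
counts = {g: sum(1 for grp, _ in balanced if grp == g)
          for g in ("group_a", "group_b")}
print(counts)  # {'group_a': 90, 'group_b': 90}
```

A good peer evaluation would flag the side effect: the duplicated examples can cause the model to memorize the few minority samples it has, so collecting more real data remains the stronger fix.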

Propose simple strategies to mitigate bias in AI systems.

Facilitation Tip: When designing mitigation strategies, limit the fix to one source of bias to avoid overwhelming students with complexity.

What to look for: Provide students with a hypothetical AI output (e.g., a job recommendation, a news summary). Ask them to write one sentence identifying a potential bias and one sentence proposing a concrete mitigation for that bias.

Analyze · Evaluate · Create · Decision-Making · Self-Management

A few notes on teaching this unit

Teachers should frame bias detection as a detective skill rather than a technical one. Start with low-stakes examples where students can easily spot issues, then gradually introduce subtler cases. Avoid framing this as a coding exercise unless students have advanced skills. Research shows that structured questioning and systematic comparison work better than abstract lectures for developing critical evaluation skills.

Students will move from spotting obvious biases to analyzing nuanced patterns in AI outputs. They will articulate where bias comes from and propose concrete steps to reduce it. By the end, they should confidently question AI results instead of accepting them at face value.


Watch Out for These Misconceptions

  • During the Bias Audit: Image Captioning Tool, students may assume bias is always visible in the first glance at the output.

    During the Bias Audit, remind students that many biases are hidden in aggregated data. Have them disaggregate their results by demographic groups and compare error rates to reveal subtle patterns that individual examples might mask.

  • During Error Rate Disaggregation: Simulated Dataset, students might think equal overall accuracy means fairness.

    During Error Rate Disaggregation, ask students to compare false positive and false negative rates across groups. Provide a scenario, like a hiring tool, where different error types have unequal real-world costs to push them beyond simple accuracy metrics.

  • During Mitigation Strategy Design: Fix One Source, students may assume they need advanced programming to address bias.

    During Mitigation Strategy Design, emphasize that many fixes require only changes to prompts, data collection, or evaluation criteria. Have students draft a revised prompt or data-gathering question as their mitigation plan, demonstrating that bias reduction can start without coding.


Methods used in this brief