Computer Science · 9th Grade

Active learning ideas

AI and Societal Inequality

Active learning turns abstract concerns about AI and inequality into tangible questions students can investigate themselves. When students analyze real cases, debate trade-offs, and redesign policies, they move beyond passive listening to see how technical choices shape human lives.

Standards: Common Core State Standards · CSTA 3A-IC-24 · CSTA 3A-IC-25
20–50 min · Pairs → Whole Class · 4 activities

Activity 01

Case Study Analysis · 45 min · Small Groups

Case Study Analysis: AI in High-Stakes Decisions

Provide three short case studies (e.g., COMPAS recidivism scoring, Amazon's hiring algorithm, facial recognition accuracy disparities). Groups analyze each using a structured template: What does the AI do? Who benefits? Who is harmed? What data was it trained on? Groups present findings and the class maps patterns across all three cases.

Analyze how AI exacerbates existing inequalities in society.

Facilitation Tip: During the Case Study Analysis, assign each student a role (data scientist, community member, ethicist) so they must defend a perspective grounded in their role's concerns.

What to look for: Present students with a hypothetical scenario: An AI system is proposed to help allocate scholarships. Ask them: "What potential biases could be embedded in this system? How might this AI affect students from different socioeconomic backgrounds differently? What questions should we ask before deploying it?"

Analyze · Evaluate · Create · Decision-Making · Self-Management

Activity 02

Structured Academic Controversy: Is AI Making Inequality Worse?

Pairs are assigned a position (AI is widening inequality / AI is reducing inequality) and prepare arguments with provided readings. They argue their assigned side for 5 minutes each, then switch positions, then work together to write a nuanced joint statement. The goal is to hold complexity rather than win the debate.

Predict the impact of AI on different socioeconomic groups.

Facilitation Tip: In the Structured Academic Controversy, require students to open by summarizing their opponents' strongest point before stating their own.

What to look for: Provide students with a short news clipping about an AI application (e.g., AI in loan approvals). Ask them to identify one way the AI might exacerbate existing inequalities and one way it might benefit society. They should write their answers in 2–3 sentences each.

Analyze · Evaluate · Create · Social Awareness · Relationship Skills

Activity 03

Think-Pair-Share · 20 min · Pairs

Think-Pair-Share: Who Designed This?

Show a series of AI product screenshots or feature descriptions. Students independently note who they imagine built it, who the intended user is, and whose needs might not have been considered. Partner discussion and a whole-class debrief surface assumptions students had not noticed they were making about the default user.

Analyze how AI exacerbates existing inequalities in society.

Facilitation Tip: For the Think-Pair-Share, ask students to first write their initial answers alone, then compare with a partner, and finally share out to reduce social pressure to conform.

What to look for: Ask students to write down one specific example of AI being used in a way that could create or worsen inequality. Then, ask them to propose one concrete step a developer or policymaker could take to address this specific issue.

Understand · Apply · Analyze · Self-Awareness · Relationship Skills

Activity 04

World Café · 50 min · Small Groups

Policy Design Workshop: Mitigating AI Inequality

Small groups are each assigned a sector (healthcare, education, hiring, criminal justice). They identify one specific inequality AI creates or worsens in that sector, then draft a policy recommendation with a rationale and at least one counterargument. Groups share proposals and receive structured feedback from peers.

Design policy recommendations to mitigate AI's negative impact on inequality.

Facilitation Tip: In the Policy Design Workshop, give teams a limited number of sticky notes to force prioritization and trade-off thinking.

What to look for: Present students with a hypothetical scenario: An AI system is proposed to help allocate scholarships. Ask them: "What potential biases could be embedded in this system? How might this AI affect students from different socioeconomic backgrounds differently? What questions should we ask before deploying it?"

Understand · Apply · Analyze · Social Awareness · Relationship Skills

A few notes on teaching this unit

Teach this topic by keeping the focus on mechanisms, not moralizing. Guide students to trace how historical data choices, metric design, and deployment contexts create disparate impacts, rather than asking them to debate whether AI is 'good' or 'bad.' Use structured controversy to normalize disagreement while insisting on evidence. In practice, students who articulate trade-offs early tend to design more inclusive solutions later on.
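If you want a concrete artifact for the "mechanisms, not moralizing" point, the sketch below simulates the data-choice mechanism with entirely made-up numbers: two groups have identical qualification scores, but historical decisions held group B to a higher bar, and a model that simply imitates past decisions reproduces the gap. All names and thresholds are hypothetical, chosen only to make the effect visible.

```python
# Toy sketch (hypothetical data): how biased historical labels
# propagate when a model is trained to imitate past decisions.
import random

random.seed(0)

def applicant(group):
    """Qualification score is identically distributed for both groups."""
    return {"group": group, "score": random.gauss(70, 10)}

pool = [applicant(g) for g in "AB" for _ in range(5000)]

# Historical decisions: same rule on paper, but group B applicants
# were effectively held to a higher bar (a stand-in for past bias).
for a in pool:
    bar = 75 if a["group"] == "A" else 82
    a["hired"] = a["score"] >= bar

# "Train" an imitation model per group: predict hired iff the score
# clears the lowest score that was historically accepted in that group.
cutoff = {g: min(a["score"] for a in pool if a["group"] == g and a["hired"])
          for g in "AB"}

# The learned cutoffs reproduce the historical gap, even though the
# underlying qualification distributions are identical.
rate = {g: sum(a["score"] >= cutoff[g] for a in pool if a["group"] == g)
           / sum(a["group"] == g for a in pool)
        for g in "AB"}
print(rate)  # group A's selection rate is noticeably higher than B's
```

Students can vary the historical bars or the score distributions and watch the learned cutoffs track the bias, which makes "the data was biased" a traceable claim rather than a slogan.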

By the end of these activities, students will identify how data, algorithms, and deployment contexts produce unequal outcomes, and they will articulate specific fairness concerns in technical language. They will also propose design or policy changes that address those concerns.


Watch Out for These Misconceptions

  • During Case Study Analysis, watch for students asserting that AI is objective because it uses data and math.

    Redirect them to the case study’s dataset description. Ask them to list the source, time period, and demographic breakdown of the data. Then guide them to connect those attributes to documented historical biases in that domain.

  • During Structured Academic Controversy, watch for students equating ‘treating everyone the same’ with fairness.

    Have teams define their own fairness metric for the scenario, then compare how each metric affects different groups. Require them to present at least one equitable alternative to equal treatment.

  • During Think-Pair-Share, watch for students assuming only direct users are affected by AI systems.

    Provide the prompt: ‘Describe one group that benefits from an AI loan approval system, and one group that may be harmed, even if they never interact with the interface.’ Use their responses to introduce the concept of indirect harms.
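For the 'treating everyone the same' misconception, a short simulation can make the conflict between fairness definitions concrete. The sketch below uses invented numbers: both groups are equally qualified, but group B's scores are shifted down by a biased measurement proxy, so one shared cutoff (equal treatment) still produces unequal acceptance rates for qualified people (unequal opportunity).

```python
# Hypothetical sketch: 'treating everyone the same' (one shared cutoff)
# can still yield unequal outcomes when a biased proxy shifts one
# group's measured scores, even though both groups are equally qualified.
import random

random.seed(1)

def simulate(mean):
    """Measured scores of qualified applicants in one group (toy numbers)."""
    return [random.gauss(mean, 8) for _ in range(4000)]

# Group B's scores are shifted down by a measurement proxy
# (say, a test that systematically under-measures that group).
scores = {"A": simulate(80), "B": simulate(74)}

CUTOFF = 78  # identical rule applied to everyone: "equal treatment"

# "Equal opportunity" metric: fraction of qualified people accepted.
accept_rate = {g: sum(s >= CUTOFF for s in xs) / len(xs)
               for g, xs in scores.items()}
print(accept_rate)  # group A's acceptance rate exceeds group B's
```

Asking students to adjust the cutoff (or set per-group cutoffs) and re-check both metrics shows that no single rule satisfies every definition at once, which is exactly the trade-off the activity wants them to articulate.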


Methods used in this brief