Computer Science · 9th Grade

Active learning ideas

Ethical Considerations in Problem Solving

Ethical computing is abstract until students see its real-world effects. Active learning works here because bias in algorithms is not theoretical — it shows up in data, design choices, and outcomes. When students analyze real cases, role-play stakeholders, and debate trade-offs, they move from passive observers to accountable designers.

Standards: CSTA 3A-IC-24
20–35 min · Pairs → Whole Class · 4 activities

Activity 01

Fishbowl Discussion · 35 min · Whole Class

Fishbowl Discussion: Algorithmic Bias Case Study

The inner circle of four or five students debates a real case of algorithmic bias, such as Amazon's resume-screening tool that downgraded resumes mentioning 'women's.' The outer circle observes and takes notes on the reasoning used. Groups then swap roles and continue the discussion.

Analyze how biases can be introduced during the problem decomposition phase.

Facilitation Tip: During the Fishbowl Discussion, assign students specific roles (data scientist, community member, policy maker) to ensure multiple perspectives are heard without repetition.

What to look for: Present students with a scenario: 'A city wants to use an AI system to predict where to allocate resources for after-school programs. What are two potential biases that could be introduced during problem decomposition, and who might be negatively impacted?' Facilitate a class discussion on their responses.

Analyze · Evaluate · Social Awareness · Self-Awareness

Activity 02

Role-Play · 30 min · Small Groups

Role-Play: Stakeholder Mapping

Groups of four each receive a different stakeholder card (student, parent, teacher, school administrator, student with a disability) for the same AI-graded homework system. Each person advocates for their stakeholder's perspective, then the group identifies where the design most needs ethical scrutiny.

Justify the importance of considering ethical implications early in the design process.

Facilitation Tip: In the Role-Play activity, provide a short preparatory reading so students can internalize their stakeholder’s values before the mapping begins.

What to look for: Provide students with a brief description of a hypothetical app designed to help people find local volunteer opportunities. Ask them to identify one ethical consideration and one potential bias, and write one sentence explaining why it matters for the app's users.

Analyze · Evaluate · Create · Social Awareness · Relationship Skills

Activity 03

Think-Pair-Share · 20 min · Pairs

Think-Pair-Share: Bias Entry Points

Students individually identify three points where bias could enter the design of a recommendation algorithm. They pair up to compare their lists, then contribute to a class-wide map on the whiteboard organized by design phase.

Predict the societal impact of a solution that overlooks ethical considerations.

Facilitation Tip: For the Think-Pair-Share on bias entry points, limit the ‘pair’ step to two minutes to maintain energy and prevent repetition.

What to look for: Students write down one question they would ask a software developer to ensure their product is ethically designed. They should also explain in one sentence why asking this question is important for preventing bias.

Understand · Apply · Analyze · Self-Awareness · Relationship Skills
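For teachers who want a concrete artifact for the class-wide map, one commonly identified entry point — the popularity feedback loop — can be demonstrated in a few lines of Python. This is an illustrative toy, not any real system; the item names and starting counts are hypothetical:

```python
# Toy demo of one bias entry point: a popularity-based recommender
# creates a feedback loop in which early clicks, however arbitrary,
# decide what every later user sees.

clicks = {"video_a": 3, "video_b": 2, "video_c": 2}  # hypothetical starting counts

def recommend(click_counts):
    """Recommend the single most-clicked item."""
    return max(click_counts, key=click_counts.get)

# Ten users in a row each click whatever is recommended to them.
for _ in range(10):
    clicks[recommend(clicks)] += 1

print(clicks)  # video_a's one-click head start becomes a 13-to-2 landslide
```

Students can point to the starting counts, the ranking rule, and the simulated user behavior as three distinct design phases where a choice lets bias in — a natural fit for the whiteboard map organized by phase.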

Activity 04

Gallery Walk · 25 min · Whole Class

Gallery Walk: Ethical Frameworks

Post four posters representing different ethical frameworks: utilitarian, rights-based, fairness and equity, and care ethics. Students rotate and write one computing example for each framework, then the class discusses which framework is most commonly applied in industry and which is most often ignored.

Analyze how biases can be introduced during the problem decomposition phase.

Facilitation Tip: During the Gallery Walk of ethical frameworks, assign each group a different framework and have them post their summary on the wall before rotating.

What to look for: Present students with a scenario: 'A city wants to use an AI system to predict where to allocate resources for after-school programs. What are two potential biases that could be introduced during problem decomposition, and who might be negatively impacted?' Facilitate a class discussion on their responses.

Understand · Apply · Analyze · Create · Relationship Skills · Social Awareness

A few notes on teaching this unit

Teachers should treat ethics as a design constraint, not a separate lesson. Use real cases with measurable harm so students feel the weight of early choices. Avoid moralizing — instead, ask students to trace how data selection or problem framing leads to biased outcomes. Research shows that when students analyze documented failures, they internalize ethical responsibility more deeply than with hypotheticals.
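One way to let students trace that chain themselves is a minimal sketch of a screener scored only against past hires. Everything here is hypothetical — the keywords, the numbers, and the scoring rule are invented for the demo:

```python
# Illustrative classroom sketch, not a real hiring system: a screener
# scored only against past hires inherits whatever bias shaped those
# hires, even though no protected attribute appears in the code.

# Hypothetical history: 9 of 10 past hires listed "chess club",
# an irrelevant proxy that merely correlated with who got hired.
past_hires = ["chess club"] * 9 + ["debate team"]

def keyword_score(keyword, history):
    """Score a resume keyword by its frequency among past hires."""
    return history.count(keyword) / len(history)

# Two equally qualified candidates differ only in that keyword.
print(keyword_score("chess club", past_hires))   # 0.9
print(keyword_score("debate team", past_hires))  # 0.1
```

The bias enters at data selection — training only on who was hired before — which is exactly the early design choice this unit asks students to question.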

Students will move from recognizing bias as a distant idea to identifying specific ways bias can enter a system during design. They will articulate who is affected and why, and connect their own design decisions to broader social impact. Evidence of learning includes clear references to data choices, outcome definitions, or edge cases they would question.


Watch Out for These Misconceptions

  • During Fishbowl Discussion: Watch for statements like ‘Ethics only matters after launch.’ Redirect by asking the group to consider the cost of fixing bias post-deployment, using the Amazon resume-screening case as evidence.

    Use the Fishbowl Discussion to reframe ethics as an early design constraint by asking students to identify design choices that could prevent bias before any code is written.

  • During Role-Play: Watch for comments like ‘Algorithms are neutral because they’re based on math.’ Redirect by having students examine the data sets their stakeholders would realistically use and who collected them.

    During the Role-Play, have students list data sources and decision criteria their stakeholders would choose, then analyze how those choices reflect human values and potential biases.


Methods used in this brief