
Ethical Considerations in Problem Solving: Activities & Teaching Strategies

Ethical computing is abstract until students see its real-world effects. Active learning works here because bias in algorithms is not theoretical — it shows up in data, design choices, and outcomes. When students analyze real cases, role-play stakeholders, and debate trade-offs, they move from passive observers to accountable designers.

9th Grade · Computer Science · 4 activities · 20–35 min

Learning Objectives

  1. Analyze how implicit assumptions in problem decomposition can lead to biased algorithmic outcomes.
  2. Evaluate the ethical trade-offs of a proposed technological solution by considering potential harms to specific user groups.
  3. Design a mitigation strategy to address identified biases in a computational problem-solving process.
  4. Justify the inclusion of diverse perspectives during the problem-solving lifecycle to prevent unintended negative societal impacts.

Want a complete lesson plan with these objectives? Generate a Mission

35 min · Whole Class

Fishbowl Discussion: Algorithmic Bias Case Study

The inner circle of four to six students debates a real case of algorithmic bias, such as Amazon's resume-screening tool that downgraded resumes containing the word "women's." The outer circle observes and takes notes on the reasoning used; groups then swap roles and continue the discussion.
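Before the discussion, it can help to show students a toy sketch of how a biased pattern becomes a rule. The word list and scores below are invented for illustration; this is not how the actual Amazon system worked, only a minimal stand-in:

```python
# Toy illustration: a screener "trained" on biased historical hiring data
# can end up penalizing words that correlate with under-represented groups.
PENALIZED_WORDS = {"women's"}  # invented pattern, stands in for learned weights

def score_resume(text: str) -> int:
    """Score a resume out of 10; subtract points for penalized words."""
    score = 10
    for word in text.lower().split():
        if word.strip(".,") in PENALIZED_WORDS:
            score -= 5
    return score

print(score_resume("Captain of the chess club"))          # 10
print(score_resume("Captain of the women's chess club"))  # 5
```

Two resumes describing the same activity get different scores; the only difference is a word that signals group membership. That gap is the discussion starter.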

Prepare & details

Analyze how biases can be introduced during the problem decomposition phase.

Facilitation Tip: During the Fishbowl Discussion, assign students specific roles (data scientist, community member, policy maker) to ensure multiple perspectives are heard without repetition.

Setup: Inner circle of 4-6 chairs, outer circle surrounding them

Materials: Discussion prompt or essential question, Observation notes template

Analyze · Evaluate · Social Awareness · Self-Awareness
30 min · Small Groups

Role-Play: Stakeholder Mapping

In groups of four, each student receives a different stakeholder card (student, parent, teacher, school administrator, or student with a disability) for the same AI-graded homework system. Each person advocates for their stakeholder's perspective; the group then identifies where the design most needs ethical scrutiny.

Prepare & details

Justify the importance of considering ethical implications early in the design process.

Facilitation Tip: In the Role-Play activity, provide a short preparatory reading so students can internalize their stakeholder’s values before the mapping begins.

Setup: Clusters of four chairs for small-group discussion

Materials: Stakeholder cards, Short preparatory readings for each stakeholder

Analyze · Evaluate · Create · Social Awareness · Relationship Skills
20 min · Pairs

Think-Pair-Share: Bias Entry Points

Students individually identify three points where bias could enter the design of a recommendation algorithm. They pair up to compare their lists, then contribute to a class-wide map on the whiteboard organized by design phase.
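A short, hypothetical sketch can seed the class-wide map by labeling one bias entry point per design phase. All names and numbers below are invented for illustration:

```python
# Illustrative sketch: three places bias can enter a recommendation
# algorithm, one per design phase. Data are made up.

# 1. Data collection: logs over-represent the heaviest users.
click_logs = {"sports": 120, "coding": 80, "arts": 5}  # arts fans rarely logged

# 2. Problem framing: "relevance" is defined as past clicks only.
def relevance(topic: str) -> float:
    return click_logs.get(topic, 0) / sum(click_logs.values())

# 3. Outcome definition: recommend only the single top topic,
#    so minority interests never surface.
def recommend() -> str:
    return max(click_logs, key=relevance)

print(recommend())  # "sports" always wins; "arts" users are never served
```

Students can annotate each numbered comment with who is left out by that choice, mirroring the whiteboard map organized by design phase.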

Prepare & details

Predict the societal impact of a solution that overlooks ethical considerations.

Facilitation Tip: For the Think-Pair-Share on bias entry points, limit the ‘pair’ step to two minutes to maintain energy and prevent repetition.

Setup: Standard classroom seating; students turn to a neighbor

Materials: Discussion prompt (projected or printed), Optional: recording sheet for pairs

Understand · Apply · Analyze · Self-Awareness · Relationship Skills
25 min · Whole Class

Gallery Walk: Ethical Frameworks

Post four posters representing different ethical frameworks: utilitarian, rights-based, fairness and equity, and care ethics. Students rotate and write one computing example for each framework, then the class discusses which framework is most commonly applied in industry and which is most often ignored.

Prepare & details

Analyze how biases can be introduced during the problem decomposition phase.

Facilitation Tip: During the Gallery Walk of ethical frameworks, assign each group a different framework and have them post their summary on the wall before rotating.

Setup: Wall space or tables arranged around room perimeter

Materials: Large paper/poster boards, Markers, Sticky notes for feedback

Understand · Apply · Analyze · Create · Relationship Skills · Social Awareness

Teaching This Topic

Teachers should treat ethics as a design constraint, not a separate lesson. Use real cases with measurable harm so students feel the weight of early choices. Avoid moralizing; instead, ask students to trace how data selection or problem framing leads to biased outcomes. Students who analyze documented failures tend to internalize ethical responsibility more deeply than those who work only with hypotheticals.

What to Expect

Students will move from recognizing bias as a distant idea to identifying specific ways bias can enter a system during design. They will articulate who is affected and why, and connect their own design decisions to broader social impact. Evidence of learning includes clear references to data choices, outcome definitions, or edge cases they would question.

These activities are a starting point. A full mission is the experience.

  • Complete facilitation script with teacher dialogue
  • Printable student materials, ready for class
  • Differentiation strategies for every learner

Watch Out for These Misconceptions

Common Misconception (Fishbowl Discussion): Watch for statements like 'Ethics only matters after launch.' Redirect by asking the group to consider the cost of fixing bias post-deployment, using the resume-screening case as evidence.

What to Teach Instead

Use the Fishbowl Discussion to reframe ethics as an early design constraint by asking students to identify design choices that could prevent bias before any code is written.

Common Misconception (Role-Play): Watch for comments like 'Algorithms are neutral because they're based on math.' Redirect by having students examine the data sets their stakeholders would realistically use and who collected them.

What to Teach Instead

During the Role-Play, have students list data sources and decision criteria their stakeholders would choose, then analyze how those choices reflect human values and potential biases.

Assessment Ideas

Discussion Prompt

After the Fishbowl Discussion, present students with a scenario: ‘A city wants to use an AI system to predict where to allocate resources for after-school programs. What are two potential biases that could be introduced during problem decomposition, and who might be negatively impacted?’ Facilitate a class discussion on their responses.

Quick Check

During Think-Pair-Share, provide students with a brief description of a hypothetical app designed to help people find local volunteer opportunities. Ask them to identify one ethical consideration and one potential bias, and write one sentence for each explaining why it matters for the app's users.

Exit Ticket

During the Gallery Walk, students write down one question they would ask a software developer to ensure their product is ethically designed. They should also explain in one sentence why asking this question is important for preventing bias.

Extensions & Scaffolding

  • Challenge: Have students research a real algorithmic system (e.g., hiring tools, social media feeds) and present a 2-minute analysis of one bias entry point and one redesign to mitigate it.
  • Scaffolding: Provide sentence stems like ‘This bias could happen when we…’ or ‘The group most affected would be…’ to support students in articulating their ideas.
  • Deeper Exploration: Connect the lesson to civil rights history by asking students to compare algorithmic bias to historical redlining patterns in their community.

Key Vocabulary

Algorithmic Bias: Systematic and repeatable errors in a computer system that create unfair outcomes, such as privileging one arbitrary group of users over others.

Problem Decomposition: The process of breaking down a complex problem into smaller, more manageable sub-problems. This stage can introduce bias through the choices made about what data is considered relevant or irrelevant.

Societal Impact: The effect of a technology or solution on the structure, behavior, and values of a society, including both intended and unintended consequences.

Fairness in AI: The principle that artificial intelligence systems should treat all individuals and groups equitably, avoiding discrimination or prejudice in their decision-making processes.
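One concrete way to ground the Fairness in AI definition for students is a selection-rate comparison across groups (sometimes called demographic parity). The groups and decisions below are invented for illustration:

```python
# Hedged sketch with made-up data: compare how often an automated
# decision says "yes" for each group. A large gap flags possible bias.
decisions = [
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

def selection_rate(group: str) -> float:
    """Fraction of decisions for this group that were approvals."""
    outcomes = [approved for g, approved in decisions if g == group]
    return sum(outcomes) / len(outcomes)

rate_a = selection_rate("group_a")  # 0.75
rate_b = selection_rate("group_b")  # 0.25
print(f"gap: {rate_a - rate_b:.2f}")
```

A gap this size does not prove discrimination on its own, but it tells students exactly where to start asking questions about the data and the outcome definition.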
