
Ethical Considerations in Problem Solving

Students will discuss the ethical implications of designing solutions and the potential for bias.

Standards: CSTA 3A-IC-24

About This Topic

Ethics in computing is not a separate module -- it is woven into every design decision. For 9th graders, this topic introduces the idea that the way a problem is framed, decomposed, and solved can either reflect or amplify existing societal biases. In the United States context, students can examine real cases: facial recognition systems that perform worse on darker-skinned faces, or predictive policing tools that disproportionately flag historically over-policed neighborhoods. These are not hypothetical -- they are documented failures with documented consequences.

Bias can enter a solution at multiple stages. During problem decomposition, the choice of what to measure and what to ignore encodes assumptions about who the system is for. During data collection, if the training data reflects historical inequalities, the solution will reproduce those inequalities regardless of how neutral the math appears. Students learn to ask 'who benefits from this design?' and 'who might be harmed?' at every stage, not just at the end.
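The following minimal Python sketch makes the second point concrete. All neighborhood names and approval records are hypothetical; the point is that a rule "learned" from historically biased decisions reproduces that bias even though the code never mentions any group explicitly.

    # Hypothetical records: (neighborhood, past_approval). The historical
    # approvals reflect a period in which Neighborhood B was routinely denied.
    history = [
        ("A", True), ("A", True), ("A", True), ("A", False),
        ("B", False), ("B", False), ("B", False), ("B", True),
    ]

    # "Training": approve a neighborhood if its historical approval rate is
    # at least 50%. The arithmetic is neutral; the data is not.
    rates = {}
    for hood, approved in history:
        total, yes = rates.get(hood, (0, 0))
        rates[hood] = (total + 1, yes + (1 if approved else 0))

    policy = {hood: yes / total >= 0.5 for hood, (total, yes) in rates.items()}

    print(policy)  # {'A': True, 'B': False} -- the historical denial becomes a rule

Even a toy example like this gives students something concrete to interrogate: nothing in the code looks prejudiced, yet the output encodes the history it was fed.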

Active learning is critical here because ethical reasoning is not a formula. Students need to hear from each other, defend positions, and encounter perspectives they had not considered. Structured debate and role-play activities surface the genuine complexity in ways that passive instruction cannot.

Key Questions

  1. How can biases be introduced during the problem decomposition phase?
  2. Why is it important to consider ethical implications early in the design process?
  3. What societal impact could a solution have if it overlooks ethical considerations?

Learning Objectives

  • Analyze how implicit assumptions in problem decomposition can lead to biased algorithmic outcomes.
  • Evaluate the ethical trade-offs of a proposed technological solution by considering potential harms to specific user groups.
  • Design a mitigation strategy to address identified biases in a computational problem-solving process.
  • Justify the inclusion of diverse perspectives during the problem-solving lifecycle to prevent unintended negative societal impacts.

Before You Start

Introduction to Computational Thinking

Why: Students need a foundational understanding of how problems are broken down and represented computationally before analyzing bias within that process.

Basic Data Representation

Why: Understanding how data is collected and structured is essential for recognizing how biases can be encoded within datasets used for solutions.

Key Vocabulary

Algorithmic Bias: Systematic and repeatable errors in a computer system that create unfair outcomes, such as privileging one arbitrary group of users over others.
Problem Decomposition: The process of breaking down a complex problem into smaller, more manageable sub-problems. This stage can introduce bias through the choices made about what data is considered relevant or irrelevant.
Societal Impact: The effect of a technology or solution on the structure, behavior, and values of a society, including both intended and unintended consequences.
Fairness in AI: The principle that artificial intelligence systems should treat all individuals and groups equitably, avoiding discrimination or prejudice in their decision-making processes.

Watch Out for These Misconceptions

Common Misconception: Ethical issues in computing only matter after the product is shipped.

What to Teach Instead

Ethical problems are far harder to fix after deployment. The cost of retraining a biased model or redesigning a flawed system is orders of magnitude higher than addressing bias in the design phase. Role-play activities that put students in the shoes of affected stakeholders make the early-design stakes concrete.

Common Misconception: Algorithms are neutral because they follow math, not opinions.

What to Teach Instead

Algorithms reflect the choices of the people who built them: what data to use, what outcome to optimize for, what edge cases to handle. Group analysis of real-world algorithmic failures shows students the human decisions embedded in every line of code.


Real-World Connections

  • Facial recognition software used by law enforcement agencies has shown documented disparities in accuracy across different racial and gender groups, leading to wrongful accusations.
  • Hiring algorithms designed to screen resumes have been found to perpetuate historical gender biases by favoring candidates with profiles similar to previously successful, often male, employees.
  • Social media platforms use algorithms to curate content, which can inadvertently create echo chambers or amplify misinformation, impacting public discourse and individual perceptions.

Assessment Ideas

Discussion Prompt

Present students with a scenario: 'A city wants to use an AI system to predict where to allocate resources for after-school programs. What are two potential biases that could be introduced during problem decomposition, and who might be negatively impacted?' Facilitate a class discussion on their responses.

Quick Check

Provide students with a brief description of a hypothetical app designed to help people find local volunteer opportunities. Ask them to identify one ethical consideration and one potential bias, and write one sentence explaining why it matters for the app's users.

Exit Ticket

Students write down one question they would ask a software developer to ensure their product is ethically designed. They should also explain in one sentence why asking this question is important for preventing bias.

Frequently Asked Questions

How can bias get into a computer program?
Bias enters code through the data used to train it, the decisions about what to measure, and the assumptions built into the algorithm's rules. If historical data reflects discrimination, a model trained on it will reproduce that discrimination regardless of how neutral the math appears.
What does it mean to consider ethics during problem decomposition?
When you break a problem into parts, you decide what matters and what does not. Those decisions encode values. Decomposing 'who gets a loan' by credit score versus zip code versus personal history leads to systems with very different fairness profiles, even when the math in each case is technically correct.
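A short illustrative Python sketch of that point (all applicant names, zip codes, and rates are hypothetical): the same four applicants are scored two ways, once on their own payment history and once on their zip code's historical default rate, and the decisions come out differently.

    # Hypothetical applicants and rates, for illustration only.
    applicants = [
        {"name": "P1", "zip": "11111", "on_time_payments": 0.95},
        {"name": "P2", "zip": "11111", "on_time_payments": 0.40},
        {"name": "P3", "zip": "22222", "on_time_payments": 0.95},
        {"name": "P4", "zip": "22222", "on_time_payments": 0.40},
    ]

    # Hypothetical historical default rates by zip code.
    zip_default_rate = {"11111": 0.05, "22222": 0.30}

    def approve_by_history(a):
        return a["on_time_payments"] >= 0.8        # judged as an individual

    def approve_by_zip(a):
        return zip_default_rate[a["zip"]] <= 0.10  # judged by neighborhood

    for a in applicants:
        print(a["name"], approve_by_history(a), approve_by_zip(a))
    # P3 has the same payment record as P1 but is denied under the zip-code rule.

Both rules are "technically correct" given their inputs; the decomposition choice, not the math, is what determines who gets approved.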
Why is it important for 9th graders to think about ethics in computing?
High school students will build, use, and be affected by software systems for the rest of their lives. Understanding that technical choices have social consequences prepares them to be more responsible builders and more informed citizens when evaluating technology's role in society.
How does active learning help students engage with computing ethics?
Ethical dilemmas do not have right answers a teacher can simply deliver. When students argue different stakeholder positions in a fishbowl or map bias entry points collaboratively on a whiteboard, they encounter the genuine difficulty of the problems. That productive struggle builds the habit of ethical questioning more durably than reading about it.