Ethical Considerations in Problem Solving
Students will discuss the ethical implications of designing solutions and the potential for bias.
About This Topic
Ethics in computing is not a separate module -- it is woven into every design decision. For 9th graders, this topic introduces the idea that the way a problem is framed, decomposed, and solved can either reflect or amplify existing societal biases. In the United States context, students can examine real cases: facial recognition systems that perform worse on darker-skinned faces, or predictive policing tools that disproportionately flag historically over-policed neighborhoods. These are not hypothetical -- they are documented failures with documented consequences.
Bias can enter a solution at multiple stages. During problem decomposition, the choice of what to measure and what to ignore encodes assumptions about who the system is for. During data collection, if the training data reflects historical inequalities, the solution will reproduce those inequalities regardless of how neutral the math appears. Students learn to ask 'who benefits from this design?' and 'who might be harmed?' at every stage, not just at the end.
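For classes with some Python experience, a minimal hypothetical sketch (invented data and function names) can make this concrete: a rule that only 'follows the data' still reproduces the skew baked into that data.

```python
# Hypothetical illustration: a rule that simply mirrors historical outcomes
# reproduces the skew in those outcomes, even though the code never mentions
# any group explicitly.

# Toy historical loan decisions: (neighborhood, approved)
historical_decisions = [
    ("north", True), ("north", True), ("north", True), ("north", False),
    ("south", False), ("south", False), ("south", False), ("south", True),
]

def approval_rate(records, neighborhood):
    """Fraction of past applicants from this neighborhood who were approved."""
    outcomes = [approved for place, approved in records if place == neighborhood]
    return sum(outcomes) / len(outcomes)

def predict(neighborhood):
    """'Neutral' rule: approve if similar past applicants were usually approved."""
    return approval_rate(historical_decisions, neighborhood) >= 0.5

for place in ("north", "south"):
    rate = approval_rate(historical_decisions, place)
    print(f"{place}: historical rate {rate:.2f} -> model approves: {predict(place)}")
# north: historical rate 0.75 -> model approves: True
# south: historical rate 0.25 -> model approves: False
```

Nothing in the arithmetic is 'opinionated,' yet the south neighborhood is denied purely because it was denied in the past -- exactly the pattern students should learn to question during decomposition and data collection.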
Active learning is critical here because ethical reasoning is not a formula. Students need to hear from each other, defend positions, and encounter perspectives they had not considered. Structured debate and role-play activities surface the genuine complexity in ways that passive instruction cannot.
Key Questions
- How can biases be introduced during the problem decomposition phase?
- Why is it important to consider ethical implications early in the design process?
- What societal impacts could a solution have if it overlooks ethical considerations?
Learning Objectives
- Analyze how implicit assumptions in problem decomposition can lead to biased algorithmic outcomes.
- Evaluate the ethical trade-offs of a proposed technological solution by considering potential harms to specific user groups.
- Design a mitigation strategy to address identified biases in a computational problem-solving process.
- Justify the inclusion of diverse perspectives during the problem-solving lifecycle to prevent unintended negative societal impacts.
Before You Start
- Students need a foundational understanding of how problems are broken down and represented computationally before analyzing bias within that process.
- Understanding how data is collected and structured is essential for recognizing how biases can be encoded within datasets used for solutions.
Key Vocabulary
| Term | Definition |
| --- | --- |
| Algorithmic Bias | Systematic and repeatable errors in a computer system that create unfair outcomes, such as privileging one arbitrary group of users over others. |
| Problem Decomposition | The process of breaking down a complex problem into smaller, more manageable sub-problems. This stage can introduce bias through the choices made about what data is considered relevant or irrelevant. |
| Societal Impact | The effect of a technology or solution on the structure, behavior, and values of a society, including both intended and unintended consequences. |
| Fairness in AI | The principle that artificial intelligence systems should treat all individuals and groups equitably, avoiding discrimination or prejudice in their decision-making processes. |
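To ground the 'Fairness in AI' entry above, a simple selection-rate comparison works well. The sketch below is a hypothetical illustration with invented data; the 0.8 cutoff is the informal 'four-fifths' rule of thumb, not a definition the class must adopt.

```python
# Hypothetical fairness audit: compare how often a system selects members of
# two groups, then compute the ratio of the lower rate to the higher rate.

decisions = [
    {"group": "A", "selected": True},  {"group": "A", "selected": True},
    {"group": "A", "selected": True},  {"group": "A", "selected": False},
    {"group": "B", "selected": True},  {"group": "B", "selected": False},
    {"group": "B", "selected": False}, {"group": "B", "selected": False},
]

def selection_rate(records, group):
    """Fraction of people in this group the system selected."""
    outcomes = [r["selected"] for r in records if r["group"] == group]
    return sum(outcomes) / len(outcomes)

rate_a = selection_rate(decisions, "A")            # 0.75
rate_b = selection_rate(decisions, "B")            # 0.25
ratio = min(rate_a, rate_b) / max(rate_a, rate_b)  # 0.33

print(f"Group A: {rate_a:.2f}  Group B: {rate_b:.2f}  ratio: {ratio:.2f}")
if ratio < 0.8:  # informal "four-fifths" rule of thumb
    print("Selection rates differ enough to warrant a closer look at the design.")
```

A single number like this never settles whether a system is fair, but it gives students a concrete starting point for the 'who benefits, who is harmed' questions above.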
Watch Out for These Misconceptions
Common Misconception: Ethical issues in computing only matter after the product is shipped.
What to Teach Instead
Ethical problems are far harder to fix after deployment. The cost of retraining a biased model or redesigning a flawed system is orders of magnitude higher than addressing bias in the design phase. Role-play activities that put students in the shoes of affected stakeholders make the early-design stakes concrete.
Common Misconception: Algorithms are neutral because they follow math, not opinions.
What to Teach Instead
Algorithms reflect the choices of the people who built them: what data to use, what outcome to optimize for, what edge cases to handle. Group analysis of real-world algorithmic failures shows students the human decisions embedded in every line of code.
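For students with some programming background, a short hypothetical sketch (invented features and weights) makes those embedded decisions visible: the two scoring functions below perform only arithmetic, yet they rank the same applicant differently because their designers chose different signals to count.

```python
# Hypothetical resume-scoring sketch: the "algorithm" is plain arithmetic,
# but which features count, and how much, are human choices.

applicant = {"years_experience": 2, "gap_years": 1, "referral": False}

def score_v1(a):
    # Designer 1 rewards experience, penalizes career gaps, and rewards referrals.
    return 2 * a["years_experience"] - 3 * a["gap_years"] + (5 if a["referral"] else 0)

def score_v2(a):
    # Designer 2 ignores gaps (often caregiving or illness) and referrals,
    # which tend to track existing professional networks.
    return 2 * a["years_experience"]

print("v1 score:", score_v1(applicant), " v2 score:", score_v2(applicant))
# v1 score: 1   v2 score: 4 -- same applicant, same arithmetic,
# different outcome, because the designers chose different things to count.
```

Either version could seed a quick pair discussion, with students defending or challenging each designer's choices.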
Active Learning Ideas
Fishbowl Discussion: Algorithmic Bias Case Study
The inner circle of four or five students debates a real case of algorithmic bias, such as Amazon's resume-screening tool that downgraded resumes containing the word 'women's,' as in 'women's chess club captain.' The outer circle observes and takes notes on the reasoning used. Groups then swap roles and continue the discussion.
Role-Play: Stakeholder Mapping
Groups of four each receive a different stakeholder card (student, parent, teacher, school administrator, student with a disability) for the same AI-graded homework system. Each person advocates for their stakeholder's perspective, then the group identifies where the design most needs ethical scrutiny.
Think-Pair-Share: Bias Entry Points
Students individually identify three points where bias could enter the design of a recommendation algorithm. They pair up to compare their lists, then contribute to a class-wide map on the whiteboard organized by design phase.
Gallery Walk: Ethical Frameworks
Post four posters representing different ethical frameworks: utilitarian, rights-based, fairness and equity, and care ethics. Students rotate and write one computing example for each framework, then the class discusses which framework is most commonly applied in industry and which is most often ignored.
Real-World Connections
- Facial recognition software used by law enforcement agencies has shown documented disparities in accuracy across different racial and gender groups, leading to wrongful accusations.
- Hiring algorithms designed to screen resumes have been found to perpetuate historical gender biases by favoring candidates with profiles similar to previously successful, often male, employees.
- Social media platforms use algorithms to curate content, which can inadvertently create echo chambers or amplify misinformation, impacting public discourse and individual perceptions.
Assessment Ideas
- Present students with a scenario: 'A city wants to use an AI system to predict where to allocate resources for after-school programs. What are two potential biases that could be introduced during problem decomposition, and who might be negatively impacted?' Facilitate a class discussion on their responses.
- Provide students with a brief description of a hypothetical app designed to help people find local volunteer opportunities. Ask them to identify one ethical consideration and one potential bias, and write one sentence explaining why it matters for the app's users.
- Students write down one question they would ask a software developer to ensure their product is ethically designed. They should also explain in one sentence why asking this question is important for preventing bias.
Frequently Asked Questions
How can bias get into a computer program?
What does it mean to consider ethics during problem decomposition?
Why is it important for 9th graders to think about ethics in computing?
How does active learning help students engage with computing ethics?
More in Computational Thinking and Problem Solving
- Problem Decomposition Strategies: Students will practice breaking down large problems into manageable sub-problems using various techniques.
- Identifying and Applying Patterns: Students will identify recurring themes across different scenarios and apply known solutions.
- Flowcharts and Pseudocode for Logic: Students will create step-by-step instructions using flowcharts and pseudocode to solve logical puzzles.
- Algorithm Efficiency and Correctness: Students will analyze different algorithmic approaches to the same problem, focusing on efficiency and correctness.
- Identifying and Debugging Logic Errors: Students will learn to identify and correct logic errors in algorithms before writing code.
- Levels of Abstraction in Computing: Students will explore how abstraction reduces complexity by hiding unnecessary details in computing systems.