Computer Science · 11th Grade

Active learning ideas

Autonomous Systems and Ethical Dilemmas

Active learning helps students confront the messy realities of ethical decision-making in autonomous systems, where abstract theories meet real-world consequences. By testing ethical frameworks in structured debates, case studies, and design tasks, students move from passive acceptance of rules to active ownership of the values embedded in technology.

Common Core State Standards · CSTA: 3B-IC-26 · CSTA: 3B-IC-27
30–40 min · Pairs → Whole Class · 3 activities

Activity 01

Structured Academic Controversy: Self-Driving Car Decision Rules

Pairs argue that autonomous vehicles should be programmed to minimize total casualties (utilitarian), then switch and argue that the vehicle should always protect its occupant regardless of third-party risk. After both rounds, pairs draft their own proposed decision rule and present the tradeoffs they accepted.

Analyze the ethical dilemmas inherent in the design and deployment of autonomous systems.

Facilitation Tip: During the Structured Academic Controversy, assign roles strictly so students must argue positions they may not personally hold, deepening their engagement with alternative viewpoints.

What to look for: Present students with a scenario: An autonomous delivery drone carrying medicine must choose between landing in a crowded park to save a life or avoiding the crowd and failing its mission. Ask students to debate: Which action is more ethically justifiable? What ethical framework supports their choice? What are the potential negative consequences of each decision?
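To make the debate concrete, it can help to show students that each position in the controversy is literally a different line of code. The sketch below is a hypothetical teaching prop (the maneuver names and risk numbers are invented, not a real vehicle policy) contrasting the two decision rules the pairs argue for:

```python
# Toy contrast between two candidate decision rules for an autonomous
# vehicle. All maneuvers and risk estimates are invented classroom
# values, not a real control policy.

def utilitarian_rule(options):
    """Pick the maneuver minimizing total expected casualties."""
    return min(options, key=lambda o: o["occupant_risk"] + o["bystander_risk"])

def occupant_first_rule(options):
    """Pick the maneuver minimizing occupant risk, ignoring bystanders."""
    return min(options, key=lambda o: o["occupant_risk"])

options = [
    {"name": "swerve", "occupant_risk": 0.5, "bystander_risk": 0.0},
    {"name": "brake",  "occupant_risk": 0.1, "bystander_risk": 0.6},
]

# The same scenario yields different choices under each rule:
print(utilitarian_rule(options)["name"])     # swerve (lowest total risk)
print(occupant_first_rule(options)["name"])  # brake (lowest occupant risk)
```

Students can then argue over the numbers themselves: who estimates the risks, and whose risk counts for how much, are exactly the value judgments the activity is meant to surface.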

Analyze · Evaluate · Create · Social Awareness · Relationship Skills

Activity 02

Case Analysis · 35 min · Small Groups

Case Analysis: Real Autonomous System Incidents

Assign groups one of three documented autonomous system incidents (Uber 2018 fatality, Tesla Autopilot misuse cases, drone strike collateral damage cases). Groups identify the decision point, the ethical question it raised, and what accountability structure was applied, then present to the class.

Justify decision-making frameworks for AI in situations with conflicting values.

Facilitation Tip: For the Case Analysis activity, provide students with primary sources like the NTSB report on the Uber incident to ground their discussions in evidence rather than speculation.

What to look for: Provide students with a short case study about an autonomous medical diagnostic tool that shows a slight bias against a specific demographic. Ask them to identify the type of ethical issue presented and suggest two concrete steps engineers could take to mitigate this bias.
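One concrete mitigation step students often propose is "measure the bias first." A minimal sketch of that idea, with an invented dataset for illustration, compares the tool's miss rate across demographic groups:

```python
# Hypothetical fairness audit: compare a diagnostic tool's miss rate
# (sick patients it failed to flag) across demographic groups.
# The records below are invented for illustration.
from collections import defaultdict

def miss_rate_by_group(records):
    """records: (group, actually_sick, flagged_sick) tuples."""
    misses, sick = defaultdict(int), defaultdict(int)
    for group, actual, flagged in records:
        if actual:
            sick[group] += 1
            if not flagged:
                misses[group] += 1  # a sick patient the tool missed
    return {g: misses[g] / sick[g] for g in sick}

records = [
    ("A", True, True), ("A", True, True), ("A", True, False), ("A", True, True),
    ("B", True, True), ("B", True, False), ("B", True, False), ("B", True, True),
]
# Group A misses 1 of 4 sick patients (0.25); group B misses 2 of 4 (0.5).
print(miss_rate_by_group(records))
```

A gap between groups like this turns a vague claim of "bias" into a measurable engineering defect, which is the shift the activity is after.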

Analyze · Evaluate · Create · Decision-Making · Social Awareness

Activity 03

Design Challenge · 30 min · Small Groups

Design Challenge: Ethical Guidelines for Autonomous Drones

Groups are given a scenario where a delivery company wants to deploy autonomous drones in a mixed residential and commercial area. Teams draft three ethical guidelines for the system's behavior, anticipate failure scenarios, and present their guidelines for class critique.

Predict the legal and societal implications of increasing autonomy in machines.

Facilitation Tip: In the Design Challenge, require students to document their ethical guidelines in a decision-tree format before prototyping, ensuring their reasoning is explicit and testable.
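A decision tree written as nested conditionals makes each guideline explicit and testable. The sketch below is one hypothetical example (the guideline wording, altitude threshold, and battery cutoff are invented classroom values, not real aviation rules):

```python
# Hypothetical decision tree encoding two drone guidelines:
#   1. Never overfly a crowd below a safe altitude.
#   2. Land safely before the battery runs out.
# All thresholds are invented for classroom discussion.

def choose_action(crowd_below, altitude_m, battery_pct):
    if crowd_below:                    # Guideline 1: avoid crowds
        if altitude_m >= 60:
            return "continue at high altitude"
        return "reroute around crowd"
    if battery_pct < 20:               # Guideline 2: fail safely
        return "land at nearest safe zone"
    return "continue on route"
```

Because each branch is explicit, students can probe the tree with failure scenarios ("what if the battery is low *and* a crowd is below?") and discover that the order of the checks is itself an ethical priority.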

What to look for: Ask students to write down one key difference between a deontological and a utilitarian approach to programming an autonomous vehicle in an unavoidable crash. Then, have them briefly explain which approach they find more compelling and why.

Analyze · Evaluate · Create · Decision-Making · Social Awareness

A few notes on teaching this unit

Teachers should frame ethical dilemmas as design problems, not philosophical puzzles. Research shows that students grasp ethical complexity better when they see how engineers encode values into algorithms, such as through loss functions in machine learning. Avoid presenting ethics as a binary of 'right vs. wrong'—instead, emphasize the trade-offs between competing principles like safety, fairness, and accountability. Use real incidents to anchor discussions, but balance their dramatic appeal with technical constraints, such as sensor limitations or computational trade-offs.
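The loss-function point can be made concrete in a few lines. The sketch below is illustrative only (the weights are invented): choosing to penalize a missed hazard ten times more than a false alarm is a value judgment written directly into the math, not a neutral default.

```python
# Illustrative sketch (hypothetical weights): a weighted loss encodes
# an ethical trade-off. Penalizing a missed detection 10x more than a
# false alarm is a value judgment, not a neutral engineering default.

def weighted_loss(y_true, y_pred, miss_weight=10.0, false_alarm_weight=1.0):
    loss = 0.0
    for t, p in zip(y_true, y_pred):
        if t == 1 and p == 0:
            loss += miss_weight         # missed a real hazard
        elif t == 0 and p == 1:
            loss += false_alarm_weight  # flagged a non-hazard
    return loss

# One miss (10.0) plus one false alarm (1.0):
print(weighted_loss([1, 0, 0], [0, 1, 0]))  # 11.0
```

Asking students to justify the weights, rather than the verdicts of individual scenarios, reframes ethics as a design problem in exactly the way this unit intends.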

Successful learning looks like students recognizing that ethical dilemmas in autonomous systems are not about finding perfect answers but about making transparent, defensible choices. They should be able to articulate trade-offs, identify stakeholder impacts, and connect technical constraints to ethical principles.


Watch Out for These Misconceptions

  • During the Structured Academic Controversy activity, watch for students assuming that removing human emotion from autonomous systems ensures ethical perfection.

    Use the Structured Academic Controversy to redirect this idea: Have students examine the decision rules provided by engineers in the Uber case study. Ask them to identify where value judgments (e.g., prioritizing passenger safety over pedestrian safety) were embedded in the system’s design, even without human operators.

  • During the Case Analysis activity, watch for students assuming manufacturers are always legally responsible when autonomous systems cause harm.

Use the Case Analysis activity to address this: After reviewing the NTSB report on the Uber incident, ask students to map the chain of liability, from the safety driver to the manufacturer to the city, and discuss how international law on autonomous weapons remains unsettled, as highlighted in the case studies.

  • During the Design Challenge activity, watch for students focusing only on trolley-problem style dilemmas for their autonomous drones.

    Use the Design Challenge to shift focus: Require students to document three ethical risks beyond edge cases: systematic biases in their training data, cybersecurity vulnerabilities, and equitable access to the drone’s services. Provide examples like pedestrian detection biases to guide their analysis.


Methods used in this brief