
Autonomous Systems and Ethical Dilemmas: Activities & Teaching Strategies

Active learning helps students confront the messy realities of ethical decision-making in autonomous systems, where abstract theories meet real-world consequences. By testing ethical frameworks in structured debates, case studies, and design tasks, students move from passive acceptance of rules to active ownership of the values embedded in technology.

11th Grade · Computer Science · 3 activities · 30–40 min

Learning Objectives

  1. Analyze the ethical trade-offs inherent in programming autonomous vehicles to respond to unavoidable accident scenarios.
  2. Compare and contrast utilitarian and deontological ethical frameworks as applied to AI decision-making in autonomous systems.
  3. Design a set of decision-making criteria for a hypothetical autonomous system, justifying choices based on ethical principles.
  4. Evaluate the potential legal consequences of deploying autonomous systems that make life-or-death decisions.
  5. Synthesize arguments for and against the development of lethal autonomous weapons systems.


Structured Academic Controversy: Self-Driving Car Decision Rules

Pairs argue that autonomous vehicles should be programmed to minimize total casualties (utilitarian), then switch and argue that the vehicle should always protect its occupant regardless of third-party risk. After both rounds, pairs draft their own proposed decision rule and present the tradeoffs they accepted.
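To make the two debate positions concrete for students, each can be written as a toy decision rule. This is a hypothetical sketch for classroom discussion (the outcome labels and harm scores are invented for illustration), not how a real autonomous vehicle is controlled:

```python
# Toy model of the two debate positions. Real AVs do not choose among
# labeled "outcomes" like this; the sketch only makes the value judgment
# embedded in each rule explicit.

def utilitarian_rule(outcomes):
    """Pick the outcome minimizing total expected casualties."""
    return min(outcomes, key=lambda o: o["occupant_harm"] + o["third_party_harm"])

def occupant_first_rule(outcomes):
    """Pick the outcome minimizing harm to occupants, ignoring third parties."""
    return min(outcomes, key=lambda o: o["occupant_harm"])

options = [
    {"name": "swerve",   "occupant_harm": 2, "third_party_harm": 0},
    {"name": "continue", "occupant_harm": 0, "third_party_harm": 3},
]

print(utilitarian_rule(options)["name"])     # "swerve": total harm 2 vs 3
print(occupant_first_rule(options)["name"])  # "continue": occupant harm 0
```

Seeing that the two rules select different outcomes for the same scenario helps pairs pinpoint exactly which trade-off their own proposed rule accepts.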

Prepare & details

Analyze the ethical dilemmas inherent in the design and deployment of autonomous systems.

Facilitation Tip: During the Structured Academic Controversy, assign roles strictly so students must argue positions they may not personally hold, deepening their engagement with alternative viewpoints.

Setup: Pairs of desks facing each other

Materials: Position briefs (both sides), Note-taking template, Consensus statement template

Analyze · Evaluate · Create · Social Awareness · Relationship Skills
35 min · Small Groups

Case Analysis: Real Autonomous System Incidents

Assign groups one of three documented autonomous system incidents (Uber 2018 fatality, Tesla Autopilot misuse cases, drone strike collateral damage cases). Groups identify the decision point, the ethical question it raised, and what accountability structure was applied, then present to the class.

Prepare & details

Justify decision-making frameworks for AI in situations with conflicting values.

Facilitation Tip: For the Case Analysis activity, provide students with primary sources like the NTSB report on the Uber incident to ground their discussions in evidence rather than speculation.

Setup: Desks rearranged into courtroom layout

Materials: Role cards, Evidence packets, Verdict form for jury

Analyze · Evaluate · Create · Decision-Making · Social Awareness
30 min · Small Groups

Design Challenge: Ethical Guidelines for Autonomous Drones

Groups are given a scenario where a delivery company wants to deploy autonomous drones in a mixed residential and commercial area. Teams draft three ethical guidelines for the system's behavior, anticipate failure scenarios, and present their guidelines for class critique.

Prepare & details

Predict the legal and societal implications of increasing autonomy in machines.

Facilitation Tip: In the Design Challenge, require students to document their ethical guidelines in a decision-tree format before prototyping, ensuring their reasoning is explicit and testable.

Setup: Small groups at shared work tables

Materials: Scenario brief, Guideline drafting template, Failure-scenario worksheet

Analyze · Evaluate · Create · Decision-Making · Social Awareness
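The decision-tree documentation that the facilitation tip asks for can also be mirrored directly in code, which makes students' reasoning testable. This is a hypothetical sketch (the conditions and behaviors are invented for illustration) showing how each branch of a drone's landing rule corresponds to a box in the students' decision tree:

```python
# Hypothetical decision tree for a delivery drone's landing behavior.
# Each branch corresponds to a box students would draw in their
# decision-tree worksheet, making every embedded value judgment explicit.

def landing_decision(people_nearby: bool, battery_critical: bool,
                     clear_backup_zone: bool) -> str:
    if not people_nearby:
        return "land"                             # no bystander risk: proceed
    if clear_backup_zone:
        return "divert_to_backup_zone"            # bystander risk, safe alternative
    if battery_critical:
        return "controlled_descent_with_warning"  # forced trade-off
    return "hold_and_wait"                        # default: avoid imposing risk

print(landing_decision(people_nearby=True, battery_critical=False,
                       clear_backup_zone=False))  # hold_and_wait
```

Asking students which branch they would change, and why, surfaces the same failure scenarios the activity asks them to anticipate.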

Teaching This Topic

Teachers should frame ethical dilemmas as design problems, not philosophical puzzles. Research shows that students grasp ethical complexity better when they see how engineers encode values into algorithms, such as through loss functions in machine learning. Avoid presenting ethics as a binary of 'right vs. wrong'—instead, emphasize the trade-offs between competing principles like safety, fairness, and accountability. Use real incidents to anchor discussions, but balance their dramatic appeal with technical constraints, such as sensor limitations or computational trade-offs.
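The claim that engineers encode values through loss functions can be shown in a few lines. In this hypothetical sketch (the weights are invented for illustration), choosing to penalize one error type more heavily than another is itself an ethical decision:

```python
# Hypothetical weighted loss for a pedestrian detector. Choosing the
# weights IS the value judgment: here a missed pedestrian (false negative)
# is treated as 10x worse than a false alarm (false positive).
FALSE_NEGATIVE_WEIGHT = 10.0  # missed pedestrian
FALSE_POSITIVE_WEIGHT = 1.0   # unnecessary braking event

def detection_loss(false_negatives: int, false_positives: int) -> float:
    return (FALSE_NEGATIVE_WEIGHT * false_negatives
            + FALSE_POSITIVE_WEIGHT * false_positives)

# Two systems with the same total error count score very differently:
print(detection_loss(false_negatives=2, false_positives=0))  # 20.0
print(detection_loss(false_negatives=0, false_positives=2))  # 2.0
```

Asking students to defend a choice of weights turns the abstract "safety vs. convenience" trade-off into a concrete design decision.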

What to Expect

Successful learning looks like students recognizing that ethical dilemmas in autonomous systems are not about finding perfect answers but about making transparent, defensible choices. They should be able to articulate trade-offs, identify stakeholder impacts, and connect technical constraints to ethical principles.

These activities are a starting point. A full mission is the experience.

  • Complete facilitation script with teacher dialogue
  • Printable student materials, ready for class
  • Differentiation strategies for every learner

Watch Out for These Misconceptions

Common Misconception: During the Structured Academic Controversy activity, watch for students assuming that removing human emotion from autonomous systems ensures ethical perfection.

What to Teach Instead

Use the Structured Academic Controversy to redirect this idea: Have students examine the decision rules provided by engineers in the Uber case study. Ask them to identify where value judgments (e.g., prioritizing passenger safety over pedestrian safety) were embedded in the system’s design, even without human operators.

Common Misconception: During the Case Analysis activity, watch for students assuming manufacturers are always legally responsible when autonomous systems cause harm.

What to Teach Instead

Use the Case Analysis activity to address this: After reviewing the NTSB report on the Uber incident, ask students to map the chain of liability (from the safety driver to the manufacturer to the city) and discuss how international law on autonomous weapons remains unsettled, as highlighted in the drone case studies.

Common Misconception: During the Design Challenge activity, watch for students focusing only on trolley-problem style dilemmas for their autonomous drones.

What to Teach Instead

Use the Design Challenge to shift focus by requiring students to document three ethical risks beyond edge cases: systematic biases in their training data, cybersecurity vulnerabilities, and inequitable access to the drone's services. Provide examples such as documented pedestrian-detection biases to guide their analysis.

Assessment Ideas

Discussion Prompt

After the Structured Academic Controversy activity, present students with the scenario of an autonomous delivery drone carrying medicine. Ask them to debate the ethical justifiability of landing in a crowded park versus avoiding the crowd. Assess their ability to articulate the ethical framework they used (e.g., utilitarian vs. deontological) and the potential consequences of each decision.

Quick Check

During the Case Analysis activity, provide students with a short case study about an autonomous medical diagnostic tool showing bias against a specific demographic. Ask them to identify the ethical issue (e.g., data bias) and suggest two concrete steps engineers could take to mitigate it. Assess their responses for specificity and connection to real-world practices like dataset audits.

Exit Ticket

After the Design Challenge activity, ask students to write down one key difference between a deontological and a utilitarian approach to programming an autonomous vehicle in an unavoidable crash. Then, have them briefly explain which approach they find more compelling and why. Collect these to assess their understanding of core ethical frameworks and their ability to apply them to a technical scenario.

Extensions & Scaffolding

  • Challenge: Ask students to research and present a real-world policy or industry standard for autonomous systems (e.g., ISO 26262, EU AI Act) and evaluate how well it addresses the ethical dilemmas from the design challenge.
  • Scaffolding: For students struggling with abstract ethical frameworks, provide a scaffolded worksheet that maps deontological and utilitarian principles to concrete decision points in their drone design.
  • Deeper exploration: Invite a local ethicist or engineer working on autonomous systems to join the class for a Q&A, focusing on how they navigate ethical dilemmas in their daily work.

Key Vocabulary

Algorithmic bias: Systematic and repeatable errors in a computer system that create unfair outcomes, such as prioritizing certain groups over others.
Trolley problem: A thought experiment in ethics where a person must choose between allowing a trolley to kill several people or diverting it to kill one person.
Lethal Autonomous Weapons Systems (LAWS): Weapons systems that can independently search for, identify, decide to engage, and engage targets without direct human intervention.
Deontology: An ethical theory that judges the morality of an action based on rules or duties, emphasizing that some actions are inherently right or wrong regardless of consequences.
Utilitarianism: An ethical theory that holds that the best action is the one that maximizes utility, often defined as maximizing happiness and minimizing suffering for the greatest number of people.
