CCE · Secondary 2 · Global Citizenship and Future Challenges · Semester 2

Technological Advancements and Ethics

Discussing the ethical implications of emerging technologies like AI and biotechnology on society and governance.

MOE Syllabus Outcomes

  • MOE: Moral Reasoning and Ethics - S2
  • MOE: Global Awareness - S2

About This Topic

Technological Advancements and Ethics guides Secondary 2 students to examine the moral questions surrounding AI, biotechnology, and automation. They analyze how these innovations affect privacy, jobs, equity, and governance. Key discussions include the ethical challenges of rapid change, predictions about AI's societal impacts, and the government's duty to regulate for public benefit. Students practice moral reasoning by weighing benefits against risks, such as AI bias or gene-editing dilemmas.

This topic fits MOE CCE standards for Moral Reasoning and Ethics, and Global Awareness at Secondary 2. It builds skills in critical evaluation and civic responsibility within the Global Citizenship and Future Challenges unit. Students connect personal values to broader societal implications, preparing them for informed participation in a tech-driven world.

Active learning suits this topic well. Role-plays and debates let students embody stakeholders, making ethics personal and vivid. Collaborative case studies reveal diverse viewpoints, while structured reflections solidify ethical decision-making through peer dialogue and evidence-based arguments.

Key Questions

  1. What ethical challenges do rapid technological advancements pose?
  2. What societal impacts might artificial intelligence and automation have?
  3. What role should government play in regulating emerging technologies for the public good?

Learning Objectives

  • Analyze the ethical dilemmas presented by AI-driven decision-making in areas like hiring or loan applications.
  • Evaluate the potential societal impacts of widespread automation on employment sectors and economic inequality.
  • Compare the ethical frameworks used to guide the development of biotechnology, such as gene editing.
  • Propose governance strategies for regulating emerging technologies to ensure public good and mitigate risks.
  • Critique the balance between technological innovation and individual privacy rights in digital surveillance.

Before You Start

Introduction to Digital Citizenship

Why: Students need a foundational understanding of responsible online behavior and digital footprints before exploring the ethical implications of advanced technologies.

Basic Principles of Governance

Why: Understanding how societies are organized and governed provides context for discussing the role of government in regulating new technologies.

Key Vocabulary

Artificial Intelligence (AI): Computer systems designed to perform tasks that typically require human intelligence, such as learning, problem-solving, and decision-making.
Biotechnology: The use of living organisms or their products to develop new technologies and products, often applied in medicine and agriculture.
Automation: The use of technology to perform tasks with minimal human intervention, often involving robots or software.
Algorithmic Bias: Systematic and repeatable errors in a computer system that create unfair outcomes, such as privileging one arbitrary group of users over others.
Gene Editing: A group of technologies that give scientists the ability to change an organism's DNA, allowing them to add, remove, or alter genetic material at particular locations in the genome.
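For teachers comfortable with a little code, the algorithmic bias definition above can be made concrete with a short classroom demonstration. The sketch below is purely illustrative (the groups, data, and threshold are invented): a screening rule "learned" from skewed historical hiring decisions simply reproduces the skew, showing students that a system built from biased data is not a neutral tool.

```python
# Toy demonstration of algorithmic bias: a screening rule derived
# from skewed historical hiring data reproduces the original skew.
# All data here is invented for illustration.
historical = [
    # (group, hired) — past decisions favoured group "A"
    ("A", True), ("A", True), ("A", True), ("A", False),
    ("B", True), ("B", False), ("B", False), ("B", False),
]

def hire_rate(group):
    """Fraction of past applicants from this group who were hired."""
    outcomes = [hired for g, hired in historical if g == group]
    return sum(outcomes) / len(outcomes)

def screen(group, threshold=0.5):
    """The 'model': pass applicants whose group historically cleared the bar."""
    return hire_rate(group) >= threshold

print(hire_rate("A"))  # 0.75
print(hire_rate("B"))  # 0.25
print(screen("A"))     # True  — group A passes screening
print(screen("B"))     # False — group B is filtered out, mirroring past bias
```

No real machine learning is needed to make the point: the "model" is just the historical hire rate, yet it already embeds the values of the decisions it was built from.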

Watch Out for These Misconceptions

Common Misconception: Technology always benefits society without harm.

What to Teach Instead

Students often overlook trade-offs like job loss or privacy erosion. Group debates expose these by requiring evidence from both sides, helping them adopt balanced views. Peer challenges during discussions reveal hidden costs.

Common Misconception: AI and biotech are neutral tools with no ethics needed.

What to Teach Instead

Many assume tech lacks inherent bias or moral weight. Role-plays as affected parties show how design choices embed values, like biased algorithms. This active empathy-building corrects views through lived scenarios.

Common Misconception: Government regulation stifles innovation completely.

What to Teach Instead

Teens may see rules as barriers only. Simulations of regulation processes demonstrate protections alongside innovation incentives. Collaborative negotiation helps students value balanced governance.


Real-World Connections

  • The development of self-driving cars by companies like Waymo and Tesla raises ethical questions about accident liability and the programming of 'trolley problem' scenarios.
  • The use of facial recognition technology by law enforcement agencies in cities like London and New York prompts debates about privacy, surveillance, and potential for misidentification.
  • CRISPR technology, a powerful gene editing tool, is being explored for treating genetic diseases like sickle cell anemia, but also raises concerns about designer babies and unintended ecological consequences.

Assessment Ideas

Discussion Prompt

Pose the following scenario: 'An AI system is developed that can predict a student's likelihood of dropping out of school with 90% accuracy. The school wants to use this to offer early interventions. What are the ethical benefits and risks of using this AI? Who should have access to this prediction data?' Facilitate a class debate on these questions.
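The "90% accuracy" figure in this scenario hides a base-rate subtlety that can sharpen the debate. The sketch below assumes, for illustration only, a cohort of 1,000 students, a 5% true dropout rate, and reads "90% accuracy" as 90% sensitivity and 90% specificity; under those assumptions, most flagged students would not actually drop out.

```python
# Base-rate arithmetic for the dropout-prediction scenario.
# Assumed numbers for illustration: 1,000 students, 5% dropout rate,
# and "90% accuracy" read as 90% sensitivity and 90% specificity.
students = 1000
base_rate = 0.05
sensitivity = 0.90   # dropouts correctly flagged
specificity = 0.90   # non-dropouts correctly cleared

dropouts = round(students * base_rate)                      # 50
non_dropouts = students - dropouts                          # 950

true_positives = round(dropouts * sensitivity)              # 45
false_positives = round(non_dropouts * (1 - specificity))   # 95

flagged = true_positives + false_positives                  # 140
precision = true_positives / flagged                        # ~0.32

print(f"Flagged: {flagged}, of whom {true_positives} would actually drop out")
print(f"Precision: {precision:.0%}")  # about 32%
```

So under these assumptions, roughly two in three flagged students are false alarms, which gives the ethical questions about labelling and data access real bite.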

Exit Ticket

Ask students to write down one emerging technology discussed in class. Then, have them list one potential societal benefit and one potential ethical challenge associated with it. Finally, ask them to suggest one role for the government in managing this technology.

Quick Check

Present students with short case studies (e.g., a company using AI to monitor employee productivity, a biotech firm developing drought-resistant crops). Ask them to identify the primary ethical issue in each case and briefly explain why it is a concern.

Frequently Asked Questions

How can teachers introduce ethics of AI in CCE?
Start with relatable scenarios like social media algorithms influencing opinions. Use short videos of real AI cases, followed by think-pair-share to surface initial thoughts. Guide with ethical frameworks like utilitarianism versus rights-based approaches, ensuring discussions link to Singapore's Smart Nation initiatives for local relevance.
What active learning strategies work for tech ethics?
Role-plays, debates, and case study carousels engage students deeply. In role-plays, students represent diverse stakeholders to negotiate biotech policies, fostering empathy and perspective-taking. Debates on AI impacts build argumentation skills, while reflections connect personal ethics to global challenges. These methods make abstract concepts concrete and memorable.
How can teachers address the societal impacts of automation in class?
Focus on predictions through data on job shifts in Singapore. Have students map winners and losers in small groups, then propose retraining policies. Link to ethical duties of government and companies, using news articles for evidence. This builds predictive thinking and civic solutions.
What is government's role in regulating emerging tech?
Government balances innovation with the public good, as in Singapore's Model AI Governance Framework. Students evaluate through simulations: protect privacy via data laws, ensure equity in AI deployment, and fund ethical research. Discussions highlight tensions between speed and safety, preparing students for informed citizenship.