Human Rights and Technology: Activities & Teaching Strategies

This topic asks students to move beyond abstract definitions of human rights and see them as lived realities shaped by the technology they use every day. Active learning works here because students must confront their own digital habits, analyze real-world cases, and debate trade-offs to grasp that rights are not fixed but are negotiated through design choices and corporate policies.

2nd Year · Active Citizenship and the Democratic State · 4 activities · 30–50 min

Learning Objectives

  1. Analyze how specific digital platforms, such as social media or messaging apps, can be used to either promote or restrict freedom of expression.
  2. Evaluate the ethical considerations for companies collecting user data, weighing privacy rights against the potential for misuse.
  3. Compare the impact of government censorship versus corporate content moderation on the dissemination of information online.
  4. Predict potential human rights challenges that may arise from the development and deployment of artificial intelligence technologies such as facial recognition.

Want a complete lesson plan with these objectives? Generate a Mission

35 min·Pairs

Debate Pairs: Surveillance Trade-offs

Pair students to debate the trade-off between surveillance for security and privacy rights, using prepared evidence cards. Have pairs switch sides after 10 minutes and note shifts in perspective. End with a whole-class synthesis of common ground.

Prepare & details

Analyze how digital technologies can both promote and threaten human rights.

Facilitation Tip: During Surveillance Trade-offs, assign pairs with opposing roles (e.g., government vs. activist) to force concrete argument-building rather than vague opinions.

Setup: Desks or chairs arranged in facing pairs

Materials: Evidence cards, debate prompt (projected), timer for the 10-minute side switch

Analyze · Evaluate · Create · Social Awareness · Relationship Skills
50 min·Small Groups

Jigsaw: Real-World Censorship

Assign each small group one case, such as social media blocks during protests or the TikTok data scandals. Groups research key facts, impacts on rights, and possible solutions, then rotate to teach their peers. Compile a class chart of patterns across cases.

Prepare & details

Evaluate the ethical implications of data collection and surveillance.

Facilitation Tip: For Real-World Censorship, assign each group a unique case and require them to present the context, the censorship mechanism, and the human rights impact to the class.

Setup: Flexible seating for regrouping

Materials: Expert group reading packets, Note-taking template, Summary graphic organizer

Understand · Analyze · Evaluate · Relationship Skills · Self-Management
30 min·Individual

Digital Footprint Mapping: Individual Audit

Students list their online activities and the data they share across apps. Each student individually maps potential risks to their privacy rights, then shares anonymized examples in small groups for peer feedback on mitigation steps.

Prepare & details

Predict future human rights challenges arising from emerging technologies.

Facilitation Tip: In Digital Footprint Mapping, provide a template with clear categories (e.g., apps used, permissions granted, data shared) to guide students toward specific evidence.

Setup: Individual desks, with space to form small groups for peer feedback

Materials: Footprint audit template (apps used, permissions granted, data shared)

Analyze · Evaluate · Create · Social Awareness · Relationship Skills
40 min·Whole Class

Future Tech Prediction Wall: Whole Class

Brainstorm emerging technologies such as AI companions or metaverse platforms. Students post sticky notes predicting rights impacts, then the class votes on and discusses the top predictions to form action recommendations.

Prepare & details

Analyze how digital technologies can both promote and threaten human rights.

Facilitation Tip: On the Future Tech Prediction Wall, use a visible timeline to show how predictions evolve as new evidence emerges during the activity.

Setup: Open wall or board space for posting and clustering notes

Materials: Sticky notes, markers, voting dots

Analyze · Evaluate · Create · Social Awareness · Relationship Skills

Teaching This Topic

Approach this topic by grounding abstract rights like privacy and expression in students' lived experiences with social media, messaging apps, and gaming platforms. Avoid starting with lectures on principles; instead, let students uncover the complexities through structured debates, audits, and case studies. Research shows that when students analyze real cases (e.g., platform censorship, data breaches) and connect them to their own data footprints, their understanding of rights shifts from theoretical to practical and personal.

What to Expect

Successful learning looks like students recognizing technology as both a tool for justice and a potential threat to rights, articulating nuanced arguments using evidence from discussions or audits, and developing a critical lens they apply to their own online practices. They should leave able to identify multiple actors (governments, corporations, users) and explain their roles in either protecting or violating rights.

These activities are a starting point. A full mission is the experience.

  • Complete facilitation script with teacher dialogue
  • Printable student materials, ready for class
  • Differentiation strategies for every learner
Generate a Mission

Watch Out for These Misconceptions

Common Misconception: During Digital Footprint Mapping, watch for students who assume all data sharing is voluntary.

What to Teach Instead

Use the audit template to prompt students to list permissions they granted unknowingly (e.g., location tracking turned on by default) and discuss how design choices shape consent.

Common Misconception: During Surveillance Trade-offs, watch for students who frame all surveillance as inherently negative.

What to Teach Instead

Require them to present both harms (e.g., chilling effect on protest) and benefits (e.g., crime prevention) using evidence from assigned roles and real-world examples.

Common Misconception: During Real-World Censorship, watch for students who attribute censorship solely to governments.

What to Teach Instead

Use the jigsaw structure to assign each group a platform or corporate policy (e.g., TikTok’s moderation guidelines) and have them present the corporation’s role in shaping what is censored.

Assessment Ideas

Discussion Prompt

After Surveillance Trade-offs, pose the question: 'If a social media platform removes content it deems harmful, is that censorship or responsible moderation?' Ask students to provide one argument for each side, citing examples from their debate or case studies discussed in class.

Exit Ticket

After Digital Footprint Mapping, provide students with a scenario: 'A new app promises to connect you with friends but requires access to your contacts, location, and microphone.' Ask them to write two sentences explaining a potential privacy risk and one question they would ask the app developers, using evidence from their audit.

Quick Check

During Real-World Censorship, display a list of technologies (e.g., encrypted messaging, smart home devices, online gaming platforms). Ask students to write next to each one whether it primarily presents an opportunity or a threat to human rights, and to briefly explain why, referencing the cases they studied.

Extensions & Scaffolding

  • Challenge early finishers to design a digital rights campaign poster targeting a specific platform or app they use, including evidence from a case study and steps users can take to protect their rights.
  • Scaffolding for struggling students: Provide sentence starters for debates (e.g., 'One risk of this app is... because...') or pre-fill part of the footprint audit with common apps they likely use.
  • Deeper exploration: Invite students to research and present on an underreported case of algorithmic bias or surveillance, connecting it to global citizenship themes like inequality or migration.

Key Vocabulary

Digital Privacy: The right of individuals to control their personal information online, including what data is collected and how it is used.
Censorship: The suppression or prohibition of books, films, news, or other content considered obscene, politically unacceptable, or a threat to security; online, it can be imposed by governments or platforms.
Algorithmic Bias: Systematic and repeatable errors in a computer system that create unfair outcomes, such as favoring one arbitrary group of users over others in content moderation.
Surveillance: The close observation of a person or group, especially one suspected of wrongdoing, which can occur through digital means such as tracking online activity.
Data Breach: An incident in which sensitive, protected, or confidential data is copied, transmitted, viewed, stolen, or used by an unauthorized individual.

Ready to teach Human Rights and Technology?

Generate a full mission with everything you need

Generate a Mission