Active Citizenship and the Democratic State · 2nd Year · Human Rights and Global Responsibility · Spring Term

Human Rights and Technology

Explore the new challenges and opportunities for human rights in the digital age, including privacy and censorship.

NCCA Curriculum Specifications
  • NCCA: Junior Cycle - Rights and Responsibilities
  • NCCA: Junior Cycle - Global Citizenship

About This Topic

Human Rights and Technology guides 2nd Year students through the dual role of digital tools in upholding or undermining rights like privacy and freedom of expression. They examine challenges such as mass surveillance via apps, algorithmic bias in content moderation, and data breaches that expose personal information. Opportunities include global activism platforms that amplify marginalized voices during events like protests. This fits NCCA Junior Cycle emphases on Rights and Responsibilities and Global Citizenship, linking abstract principles to students' daily use of smartphones and social media.

Students address key questions by analyzing how technologies promote rights through connectivity yet threaten them via censorship and unethical data practices. They evaluate surveillance ethics and predict issues from AI, such as facial recognition misuse or deepfake misinformation. These explorations build critical thinking and foresight essential for democratic participation.

Active learning suits this topic particularly well because the concepts feel immediate and personal. Role-plays of data scandals, paired debates on privacy laws, or group timelines of technological change turn passive facts into engaged analysis. Students connect rights to their own lives, strengthening ethical reasoning and advocacy skills.

Key Questions

  1. How can digital technologies both promote and threaten human rights?
  2. What are the ethical implications of data collection and surveillance?
  3. What future human rights challenges might arise from emerging technologies?

Learning Objectives

  • Analyze how specific digital platforms, such as social media or messaging apps, can be used to either promote or restrict freedom of expression.
  • Evaluate the ethical considerations for companies collecting user data, considering privacy rights and potential misuse.
  • Compare the impact of government censorship versus corporate content moderation on the dissemination of information online.
  • Predict potential human rights challenges that may arise from the development and deployment of artificial intelligence technologies like facial recognition.

Before You Start

Introduction to Human Rights

Why: Students need a foundational understanding of basic human rights, such as the right to privacy and freedom of expression, before exploring how technology impacts them.

Digital Literacy Basics

Why: Familiarity with using digital devices and common online platforms is necessary to engage with the topic of technology's role in human rights.

Key Vocabulary

Digital Privacy: The right of individuals to control their personal information when they are online, including what data is collected and how it is used.
Censorship: The suppression or prohibition of any parts of books, films, news, etc., that are considered obscene, politically unacceptable, or a threat to security, often occurring online.
Algorithmic Bias: Systematic and repeatable errors in a computer system that create unfair outcomes, such as favoring one arbitrary group of users over others in content moderation.
Surveillance: The close observation of a person or group, especially one suspected of wrongdoing, which can occur through digital means like tracking online activity.
Data Breach: An incident where sensitive, protected, or confidential data is copied, transmitted, viewed, stolen, or used by an individual unauthorized to do so.

Watch Out for These Misconceptions

Common Misconception: Digital technology always promotes human rights by connecting people.

What to Teach Instead

Technology can enable censorship and surveillance that suppress expression. Group case studies reveal how platforms throttle content, helping students see nuances through shared evidence and discussion.

Common Misconception: Privacy is outdated in the digital age since sharing is normal.

What to Teach Instead

Privacy safeguards dignity and prevents exploitation. Personal footprint audits make students confront real risks, fostering peer conversations that clarify its ongoing relevance.

Common Misconception: Only governments threaten rights online through censorship.

What to Teach Instead

Corporations wield power via algorithms and terms of service. Jigsaw activities expose platform roles, building collaborative understanding of multiple actors.


Real-World Connections

  • Students can examine the privacy policies of popular apps like TikTok or Instagram, analyzing what personal data is collected and how it is shared with third parties, similar to how the European Union's GDPR regulates data handling.
  • Investigate news reports about government-imposed internet shutdowns during political protests in countries like Iran or Myanmar, connecting these actions to restrictions on freedom of assembly and expression.
  • Discuss the implications of facial recognition technology used by law enforcement agencies in cities like London or New York, considering its potential for both public safety and the violation of privacy rights.

Assessment Ideas

Discussion Prompt

Pose the question: 'If a social media platform removes content it deems harmful, is that censorship or responsible moderation?' Ask students to provide one argument for each side, citing examples discussed in class.

Exit Ticket

Provide students with a scenario: 'A new app promises to connect you with friends but requires access to your contacts, location, and microphone.' Ask them to write two sentences explaining a potential privacy risk and one question they would ask the app developers.

Quick Check

Display a list of technologies (e.g., encrypted messaging, smart home devices, online gaming platforms). Ask students to write next to each one whether it primarily presents an opportunity or a threat to human rights, and to briefly explain why.

Frequently Asked Questions

How does technology threaten human rights like privacy?
Surveillance tools collect vast amounts of data without consent, enabling profiling and manipulation, as in Cambridge Analytica's election interference. Censorship algorithms silence dissent, as seen in content blocks during the Hong Kong protests. Students analyze these cases to grasp the erosion of autonomy and expression, connecting to NCCA global citizenship goals.
What active learning strategies work for human rights and technology?
Use debates, role-plays, and digital audits to engage students directly. Pairs debating surveillance build empathy for opposing views, while group jigsaws on cases like data breaches promote deep analysis. These methods link abstract rights to personal tech use, enhancing retention and ethical skills over lectures.
What are ethical issues in data collection and surveillance?
Data practices often lack transparency, leading to bias and discrimination in AI decisions. Surveillance chills free speech as people self-censor. Ethical teaching involves evaluating consent models and regulations like GDPR, helping students predict societal harms and advocate for balanced policies.
How can emerging technologies challenge human rights?
AI deepfakes undermine truth and elections, while biometric surveillance risks mass tracking without recourse. Students predict such challenges using brainstorming walls, weighing benefits like crime detection against rights violations. This forward-thinking approach aligns with NCCA standards, preparing informed citizens.