
Fundamentals of Machine Learning: Supervised Learning
Activities & Teaching Strategies

Supervised learning can feel abstract to students, but active learning turns the theoretical into the tangible. Labs, discussions, and comparisons help students see how labeled data guides a model’s decisions, making the ‘supervised’ process visible and meaningful.

12th Grade · Computer Science · 4 activities · 15-45 min

Learning Objectives

  1. Compare and contrast classification and regression tasks within supervised machine learning.
  2. Explain the fundamental process of training a supervised learning model using labeled data.
  3. Evaluate the performance of a trained supervised learning model using appropriate metrics.
  4. Design a simple supervised learning experiment to predict a categorical or numerical outcome.

Want a complete lesson plan with these objectives? Generate a Mission

45 min·Pairs

Hands-On Lab: Train Your First Classifier

Students use Google's Teachable Machine or a simple scikit-learn notebook to train an image or text classifier on a dataset they collect themselves. They deliberately include mislabeled examples and observe how this degrades accuracy. The lab closes with each pair reporting their accuracy and one insight about what made their training data better or worse.
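For teachers previewing the scikit-learn route, the lab's label-noise experiment can be sketched in a few lines. This is a minimal illustration using a synthetic dataset in place of student-collected data; the 20% noise rate and the logistic regression model are arbitrary choices, not part of the lesson plan.

```python
# Sketch of the lab's label-noise experiment (synthetic stand-in data;
# model and noise rate are illustrative choices).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Stand-in for a student-collected dataset: 2 classes, 300 examples.
X, y = make_classification(n_samples=300, n_features=5, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Deliberately mislabel 20% of the training examples, as the lab asks.
rng = np.random.default_rng(0)
noisy = y_train.copy()
flip = rng.choice(len(noisy), size=len(noisy) // 5, replace=False)
noisy[flip] = 1 - noisy[flip]

# Train once on clean labels, once on corrupted labels; evaluate on the
# same untouched test set so only data quality differs.
clean_acc = LogisticRegression().fit(X_train, y_train).score(X_test, y_test)
noisy_acc = LogisticRegression().fit(X_train, noisy).score(X_test, y_test)
print(f"clean labels: {clean_acc:.2f}, 20% mislabeled: {noisy_acc:.2f}")
```

Pairs reporting out can run exactly this comparison on their own data: hold the test set fixed and vary only the training labels.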

Prepare & details

How does a machine learning model differ from a traditional rule-based program?

Facilitation Tip: During Train Your First Classifier, circulate to ensure students are labeling data correctly before training, as errors here propagate through the entire process.

Setup: Flexible space for group stations

Materials: Devices with internet access, Teachable Machine or a prepared scikit-learn notebook, Accuracy reporting sheet

Apply · Analyze · Evaluate · Create · Social Awareness · Decision-Making
15 min·Pairs

Think-Pair-Share: Classification or Regression?

Present eight real-world prediction problems and ask pairs to categorize each as classification or regression and justify the choice. Include ambiguous cases like predicting customer satisfaction (score 1-10 versus positive/negative). Whole-class discussion reveals that the distinction sometimes depends on how you frame the business problem, not just the data.
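The activity's ambiguous case can be demonstrated in code: the same synthetic satisfaction data trained both ways with scikit-learn. The "support hours" feature and every number below are invented for illustration; the point is only that framing, not the data, chooses the task.

```python
# Illustrative sketch: one prediction problem framed two ways
# (feature, data, and models are all hypothetical examples).
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

rng = np.random.default_rng(1)
support_hours = rng.uniform(0, 10, size=(200, 1))  # single invented feature
score = np.clip(2 + 0.7 * support_hours[:, 0] + rng.normal(0, 1, 200), 1, 10)

# Framing 1: regression -- predict the 1-10 satisfaction score directly.
reg = LinearRegression().fit(support_hours, score)

# Framing 2: classification -- collapse the score into positive/negative.
label = (score >= 6).astype(int)
clf = LogisticRegression().fit(support_hours, label)

new_customer = [[5.0]]
print("predicted score:", reg.predict(new_customer)[0])
print("predicted class:", clf.predict(new_customer)[0])  # 1 = positive
```

The whole-class discussion point falls out directly: the target column is identical, and only the business framing decides whether a regressor or a classifier is the right tool.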

Prepare & details

Differentiate between classification and regression tasks in supervised learning.

Facilitation Tip: For Classification or Regression?, give students one minute to jot down their individual thoughts before pairing, ensuring all voices contribute to the discussion.

Setup: Standard classroom seating; students turn to a neighbor

Materials: Discussion prompt (projected or printed), Optional: recording sheet for pairs

Understand · Apply · Analyze · Self-Awareness · Relationship Skills
25 min·Whole Class

Socratic Seminar: What Does 'Learning' Mean?

Open with the question: 'Is a model that scores 99% accuracy on training data but 60% on new data actually learning?' Students draw on their lab experience to discuss generalization, memorization, and the purpose of the train/test split. The teacher facilitates without providing answers, letting student reasoning drive the conversation toward overfitting.
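The seminar's opening scenario (near-perfect training accuracy, much weaker test accuracy) is easy to reproduce as a teacher demo. This sketch uses synthetic, deliberately noisy data and an unconstrained decision tree as a stand-in for a memorizing model; the specific numbers are illustrative.

```python
# Minimal overfitting demonstration for the seminar's opening question
# (synthetic data with 15% label noise; an unconstrained decision tree
# stands in for a model that memorizes its training set).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=200, n_features=20, n_informative=3,
                           flip_y=0.15, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

tree = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
print("training accuracy:", tree.score(X_train, y_train))  # memorization
print("test accuracy:    ", tree.score(X_test, y_test))    # generalization
```

Showing the two numbers side by side grounds the seminar question without the teacher having to supply the answer.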

Prepare & details

Explain the process of training and evaluating a supervised learning model.

Facilitation Tip: In Socratic Seminar, step back after posing a question and let silence linger to give students space to formulate responses based on the readings and labs.

Setup: Chairs arranged in two concentric circles

Materials: Discussion question/prompt (projected), Observation rubric for outer circle

Analyze · Evaluate · Create · Social Awareness · Relationship Skills
20 min·Small Groups

Gallery Walk: Algorithm Comparison

Post four posters around the room: linear regression, decision trees, k-nearest neighbors, and naive Bayes. Each poster includes a brief description, a sample use case, and a blank section labeled 'when this would struggle.' Groups rotate, add sticky notes to the struggle section, then rotate again to critique and extend each other's entries.
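To seed the posters with concrete numbers, the four model families can be compared on one toy task. In this sketch, logistic regression stands in for the linear model because the toy task is classification; the dataset and every parameter are illustrative choices, not prescriptions.

```python
# Comparing the poster's four model families on one synthetic task
# (logistic regression substitutes for linear regression, since the
# toy problem here is classification; all settings are illustrative).
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, n_features=8, n_informative=4,
                           random_state=0)
models = {
    "linear model (logistic)": LogisticRegression(),
    "decision tree": DecisionTreeClassifier(random_state=0),
    "k-nearest neighbors": KNeighborsClassifier(),
    "naive Bayes": GaussianNB(),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)  # 5-fold cross-validation
    print(f"{name:25s} mean CV accuracy: {scores.mean():.2f}")
```

Swapping in datasets with different shapes (few samples, irrelevant features, nonlinear boundaries) gives students evidence for their 'when this would struggle' sticky notes.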

Prepare & details

How does a machine learning model differ from a traditional rule-based program?

Facilitation Tip: During Gallery Walk, set a two-minute timer per station so students stay on task and absorb comparisons efficiently.

Setup: Wall space or tables arranged around room perimeter

Materials: Large paper/poster boards, Markers, Sticky notes for feedback

Understand · Apply · Analyze · Create · Relationship Skills · Social Awareness

Teaching This Topic

Focus on concrete examples and hands-on work rather than abstract formulas. Research shows students grasp the role of labels best when they create or curate their own datasets. Avoid diving too deep into algorithm mechanics early on; prioritize understanding the data-model relationship first. Use frequent low-stakes checks to catch misconceptions before they solidify.

What to Expect

Students will leave able to explain how labeled data trains a model, distinguish between classification and regression tasks, and articulate why evaluation on unseen data matters. They will also recognize common pitfalls like overfitting and misinterpreting model capabilities.

These activities are a starting point. A full mission is the experience.

  • Complete facilitation script with teacher dialogue
  • Printable student materials, ready for class
  • Differentiation strategies for every learner
Generate a Mission

Watch Out for These Misconceptions

Common Misconception: During Train Your First Classifier, watch for students assuming more training examples automatically improve the model.

What to Teach Instead

Interrupt the lab to have students deliberately introduce errors into a portion of the training labels. After retraining, ask them to compare performance on a held-out test set and observe how data quality, not just quantity, drives results.

Common Misconception: During Train Your First Classifier, watch for students equating high training accuracy with a good model.

What to Teach Instead

Have students train on the full dataset and then test on the same data, noting the near-perfect accuracy. Next, give them a separate test set and ask them to explain why performance drops, linking this to overfitting.

Common Misconception: During Socratic Seminar, watch for students attributing understanding or meaning to the model's decisions.

What to Teach Instead

Use the discussion to contrast the model’s statistical patterns with human understanding. Ask students to describe what the model ‘knows’ about the data versus what a human would know, grounding the conversation in their lab results.

Assessment Ideas

Exit Ticket

After Classification or Regression?, ask students to categorize two prediction problems, predicting house prices and identifying cat/dog images, as classification or regression, and to explain their reasoning in 2-3 sentences.

Quick Check

During Train Your First Classifier, ask students to explain aloud how they used the labels to train their model, focusing on the role of the labels in guiding the learning process.

Discussion Prompt

After Gallery Walk, pose the question: 'Why is it crucial to evaluate a supervised learning model on data it has not seen during training?' Facilitate a discussion where students explain overfitting and the importance of the test set for real-world performance.

Extensions & Scaffolding

  • Challenge: Ask students to design a new labeled dataset (e.g., predicting student grades based on study hours) and train a model, then write a reflection on how they ensured data quality.
  • Scaffolding: Provide a partially labeled dataset and ask students to complete the labeling process before training, discussing how missing labels might affect the model.
  • Deeper exploration: Have students research and present on how biases in training data can lead to unfair models, connecting back to their hands-on lab experiences.

Key Vocabulary

Labeled Data: A dataset where each data point is paired with a correct output or 'label', used to train supervised learning models.
Classification: A supervised learning task that predicts a discrete category or class label, such as 'spam' or 'not spam'.
Regression: A supervised learning task that predicts a continuous numerical value, such as a house price or temperature.
Training Set: The portion of labeled data used to teach the machine learning model by adjusting its parameters.
Test Set: A separate portion of labeled data, unseen during training, used to evaluate the model's generalization ability.

Ready to teach Fundamentals of Machine Learning: Supervised Learning?

Generate a full mission with everything you need

Generate a Mission