Computer Science · Grade 10

Active learning ideas

Machine Learning Basics

Active learning transforms abstract machine learning concepts into concrete experiences students can touch, sort, and discuss. When students physically manipulate data or debate algorithm choices, they build durable mental models of how predictions actually work. This hands-on approach counters the common misconception that models 'think' like humans by letting learners see predictions emerge directly from examples.

Ontario Curriculum Expectations: CS.HS.D.9, CS.HS.D.10
20–40 min · Pairs → Whole Class · 4 activities

Activity 01

Flipped Classroom · 25 min · Pairs

Pairs Sort: Learning Type Scenarios

Provide cards describing real-world tasks, such as 'predict house prices from sizes and locations' or 'group songs by listener habits'. Pairs sort the cards into supervised or unsupervised piles and write a one-sentence justification for each. Follow with a whole-class share-out to refine the categories.

Differentiate between supervised and unsupervised machine learning.

Facilitation Tip: For Pairs Sort, provide sticky notes so pairs can physically move scenario cards between supervised and unsupervised columns, forcing verbal reasoning as they place each card.

What to look for: Provide students with two scenarios: one describing a system that predicts house prices based on square footage and number of bedrooms, and another describing a system that groups customers by shopping habits. Ask students to identify which scenario uses supervised learning and which uses unsupervised learning, and to briefly explain why. (An optional code sketch of both scenarios follows this activity.)

Understand · Apply · Analyze · Self-Management · Self-Awareness
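For teachers who want to show the two scenarios running live, here is a minimal sketch, assuming Python with scikit-learn and NumPy installed; the house sizes, prices, and shopper figures are invented purely for illustration.

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.cluster import KMeans

# Supervised: house features come WITH price labels to learn from.
house_features = np.array([[1000, 2], [1500, 3], [2000, 3], [2500, 4]])  # sq ft, bedrooms
prices = np.array([200_000, 280_000, 340_000, 420_000])
price_model = LinearRegression().fit(house_features, prices)
print("Predicted price, 1800 sq ft / 3 bedrooms:", int(price_model.predict([[1800, 3]])[0]))

# Unsupervised: shopper features have NO labels, so the algorithm can only group similar rows.
shoppers = np.array([[2, 15], [3, 20], [12, 90], [10, 110], [1, 10]])  # visits/month, avg spend
groups = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(shoppers)
print("Shopper groups found without labels:", groups)

The contrast for students to notice: the first model is given the right answers (prices) to learn from, while the second is only told how many groups to look for.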

Activity 02

Flipped Classroom · 35 min · Small Groups

Small Groups: Mock Training Data

Give groups a simple dataset, such as animal features without labels. First, have them predict categories intuitively, then add labels for a supervised pass and cluster the unlabeled data for an unsupervised pass. Groups compare prediction accuracy and discuss how the training data shaped the results. (An optional code sketch of both passes follows this activity.)

Analyze simple examples of how machine learning algorithms make predictions.

Facilitation Tip: During Mock Training Data, circulate with colored pens so you can quickly sketch or annotate student datasets on the board to highlight patterns or gaps in their labeling.

What to look for: Present students with a small, simplified dataset (e.g., fruit images labeled 'apple' or 'orange'). Ask them to explain what 'training data' means in this context and how they would use it to teach a computer to identify apples. Then, ask them to describe a scenario where they might use unlabeled data to find patterns in fruit types.

Understand · Apply · Analyze · Self-Management · Self-Awareness
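If a short live-coding moment fits the group, a minimal sketch of the two passes is below, assuming scikit-learn is installed; the animal measurements and labels are invented for illustration.

import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.cluster import KMeans

# Each row is one animal: [number of legs, typical body mass in kg]
features = np.array([[4, 500], [4, 30], [2, 4], [2, 1], [4, 250], [2, 0.02]])
labels = ["mammal", "mammal", "bird", "bird", "mammal", "bird"]

# Supervised pass: the labels act as the answer key the model learns from.
classifier = KNeighborsClassifier(n_neighbors=1).fit(features, labels)
print("Predicted label for a 2-legged, 3 kg animal:", classifier.predict([[2, 3]])[0])

# Unsupervised pass: same rows with the labels removed; the algorithm only finds groups.
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
print("Clusters found without labels:", clusters)

Groups can compare the cluster numbers the algorithm finds with the labels they assigned themselves and discuss where the two disagree.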

Activity 03

Flipped Classroom · 20 min · Individual

Individual: Prediction Journal

Students receive printed examples of input data and model outputs. Individually, they journal how changing one training example alters the predictions, then pair up to verify each other’s entries. Collect journals for feedback. (An optional code sketch of this comparison follows the activity.)

Explain the role of training data in machine learning models.

Facilitation Tip: In the Prediction Journal, model the first entry yourself to show how to connect dataset choices to prediction outcomes before students write independently.

What to look for: Pose the question: 'Imagine you are building a spam email filter. What kind of data would you need for training, and would this be supervised or unsupervised learning? Explain your reasoning.' Facilitate a class discussion where students share their answers and justify their choices.

Understand · Apply · Analyze · Self-Management · Self-Awareness
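One way to model the first journal entry is to train the same simple model twice and change a single training example between runs. This is a minimal sketch, assuming scikit-learn is installed; the fruit weights and labels are invented, and a 1-nearest-neighbour classifier is used only because it makes the effect of one example easy to see.

from sklearn.neighbors import KNeighborsClassifier

weights = [[100], [110], [150], [160]]                     # grams, one feature per fruit
original_labels = ["apple", "apple", "orange", "orange"]
edited_labels = ["apple", "orange", "orange", "orange"]    # exactly one label changed

mystery_fruit = [[128]]  # the input students are asked to predict

for description, labels in [("original data", original_labels),
                            ("one example relabelled", edited_labels)]:
    model = KNeighborsClassifier(n_neighbors=1).fit(weights, labels)
    print("Prediction with", description + ":", model.predict(mystery_fruit)[0])

Relabelling the single 110 g example flips the prediction for the 128 g fruit, mirroring the cause-and-effect reasoning students record in their journals.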

Activity 04

Flipped Classroom · 40 min · Whole Class

Whole Class: Visual Algorithm Demo

Use slides or a free online tool such as Teachable Machine to demo live predictions. The class votes on inputs, observes how the model updates as new training data is added, and notes how the supervised and unsupervised approaches differ.

Differentiate between supervised and unsupervised machine learning.

Facilitation Tip: Set a two-minute timer during the Visual Algorithm Demo to keep students focused on observing the algorithm’s steps rather than tuning out partway through.

What to look for: Provide students with two scenarios: one describing a system that predicts house prices based on square footage and number of bedrooms, and another describing a system that groups customers by shopping habits. Ask students to identify which scenario uses supervised learning and which uses unsupervised learning, and to briefly explain why.

Understand · Apply · Analyze · Self-Management · Self-Awareness

A few notes on teaching this unit

Start with small, tangible datasets so students grasp that models learn from examples, not reasoning. Avoid analogies about 'computer brains' because they reinforce the misconception that models understand context. Use side-by-side comparisons of supervised and unsupervised tasks so students notice how labels change the game. Research shows students learn best when they manipulate data themselves, so prioritize activities where they curate, clean, or sort data rather than just watch a simulation.

Students will confidently distinguish supervised and unsupervised learning by explaining the role of labeled data versus pattern discovery. They will articulate why data quality matters and how training data shapes a model’s predictions. Successful learning appears when students justify choices using dataset examples from the activities.


Watch Out for These Misconceptions

  • During Pairs Sort, listen for students claiming that a system 'knows' the difference between apples and oranges because it 'understands' the fruit.

    Interrupt with a concrete redirect: have the pair circle the labels on their scenario cards and ask, 'If the labels were removed, could the system still make the prediction? Why or why not?' to refocus on statistical patterns.

  • During Mock Training Data, watch for groups assuming unsupervised learning is 'inferior' because it lacks labels.

    Ask each group to explain what their unlabeled clusters reveal that labels might have hidden, then have them compare results with another group that added labels to similar data.

  • During Prediction Journal, note students who treat any dataset as equally valid for training.

    Prompt them to circle a data point in their journal and ask, 'What if this example were missing key features? How would that change your prediction?' to highlight the impact of data quality.


Methods used in this brief