- Start: Monday, October 20
- End: Saturday, October 25
Summary
This week, we will look at two extensions of decision trees: random forests and boosted models. These are both ensemble methods that combine the predictions of many trees to improve model performance.
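As a brief preview of these ideas, the sketch below fits a single decision tree, a random forest, and a boosted model on simulated regression data and compares their test errors. It assumes a Python workflow with scikit-learn; the dataset, model settings, and library choice are illustrative and may differ from what the course actually uses.

```python
# A minimal sketch comparing a single decision tree with two tree ensembles.
# Assumes Python with scikit-learn installed; data and settings are illustrative.
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

# Simulated regression data, split into training and test sets.
X, y = make_regression(n_samples=1000, n_features=20, noise=10.0, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

models = {
    "single tree": DecisionTreeRegressor(random_state=42),
    "random forest": RandomForestRegressor(n_estimators=500, random_state=42),
    "boosted trees": GradientBoostingRegressor(n_estimators=500, random_state=42),
}

# The ensembles typically achieve a lower test error than the single tree,
# because they combine the predictions of many trees.
for name, model in models.items():
    model.fit(X_train, y_train)
    rmse = mean_squared_error(y_test, model.predict(X_test)) ** 0.5
    print(f"{name:>14}: test RMSE = {rmse:.2f}")
```

Classification works the same way with the corresponding classifier variants (for example, `RandomForestClassifier` and `GradientBoostingClassifier` in scikit-learn).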
Learning Objectives
After completing this week, you are expected to be able to:
- Understand how averaging the predictions from many trees (for example, using a random forest) can improve model performance.
- Use a random forest to perform regression and classification.
- Use boosting to perform regression and classification.
Topics
- Ensemble Methods
- Random Forests
- Boosted Models
Activities
Upcoming Deadlines
2025-10-25
- Assessment: Homework 07
- Assessment: Lab Model 03