- Start: Monday, September 22
- End: Saturday, September 27
Summary
This week, we will introduce another nonparametric method for supervised learning: decision trees.
Learning Objectives
After completing this week, you are expected to be able to:
- Understand how decision trees differ from KNN in how they determine the similarity of data points.
- Find and evaluate decision tree splits for regression.
- Find and evaluate decision tree splits for classification.
- Use decision trees to make predictions for regression tasks using `sklearn`.
- Use decision trees to make predictions for classification tasks using `sklearn`.
- Use decision trees to estimate conditional probabilities for classification tasks using `sklearn`.
- Tune the parameters of decision trees to avoid overfitting (a short `sklearn` sketch follows this list).
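
As a preview of the `sklearn` workflow referenced in the objectives above, here is a minimal sketch of fitting a regression tree and a classification tree. The built-in toy datasets and the specific parameter values (`max_depth=3`, `min_samples_leaf=5`) are illustrative assumptions, not part of the course materials.

```python
# Minimal sketch: decision trees for regression and classification with sklearn.
# The datasets and hyperparameter values below are placeholders for illustration.
from sklearn.datasets import load_diabetes, load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor, DecisionTreeClassifier

# Regression tree: predict a continuous target.
X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
reg = DecisionTreeRegressor(max_depth=3)  # limiting depth is one way to curb overfitting
reg.fit(X_train, y_train)
print("regression predictions:", reg.predict(X_test[:5]))

# Classification tree: predict class labels and conditional class probabilities.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = DecisionTreeClassifier(max_depth=3, min_samples_leaf=5)
clf.fit(X_train, y_train)
print("class predictions:", clf.predict(X_test[:5]))
print("class probabilities:", clf.predict_proba(X_test[:5]))
```

Note that `predict_proba` returns the fraction of training samples of each class in the leaf a test point falls into, which is how a tree estimates conditional probabilities; parameters such as `max_depth` and `min_samples_leaf` control tree size and are the main knobs for avoiding overfitting.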
Topics
- Regression Trees
- Classification Trees