Schedule
Readings are listed below for each module: those from ISLR are required, while those from ESL (in parentheses) are optional and supplemental.
1 The Learning Procedure - Models, Fitting, Model Selection
Topics: learning through statistical and algorithmic lenses; model selection; cross-validation
Learning Objectives:
- Formulate learning problems in terms of statistical models, estimators, and model selection
- Identify criteria for good statistical models, estimators, and model selection metrics (see the cross-validation sketch after this list)
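
As a concrete companion to the model selection objective, here is a minimal sketch of k-fold cross-validation in base R. The simulated data, the candidate polynomial models, and all variable names are illustrative assumptions, not course code.

```r
# Minimal k-fold cross-validation sketch (illustrative; simulated data).
set.seed(1)
n <- 100
x <- runif(n, -2, 2)
y <- sin(x) + rnorm(n, sd = 0.3)
dat <- data.frame(x = x, y = y)

k <- 5
folds <- sample(rep(1:k, length.out = n))  # random fold assignment
degrees <- 1:8                             # candidate polynomial degrees

cv_mse <- sapply(degrees, function(d) {
  fold_mse <- sapply(1:k, function(i) {
    fit <- lm(y ~ poly(x, d), data = dat[folds != i, ])  # train on k-1 folds
    pred <- predict(fit, newdata = dat[folds == i, ])    # predict held-out fold
    mean((dat$y[folds == i] - pred)^2)
  })
  mean(fold_mse)                           # average held-out error across folds
})

degrees[which.min(cv_mse)]  # the degree with the lowest estimated test error
```

Each candidate model is scored by its average held-out error, so the selection criterion estimates test error rather than training error.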
Handouts and Resources:
- Programming in R (.Rmd, .pdf)
- Using RMarkdown (.Rmd, .pdf)
- Overview of R and Tidyverse (slides from last term)
- Overview of git and version control (slides from last term)
Date | Topic | Readings | Deadlines |
---|---|---|---|
(no class, Imagine UBC) | | | |
Sep 4 | Class Overview (slides), Probability Review (notes) | | |
Sep 9 | Introduction to Learning, Regression (notes) | ISLR 2.1 (ESL 2.4, 2.6) | |
Sep 11 | Learning (cont.), Classification (notes) | ISLR 4.3 (ESL 4.4) | |
Lab 00 (Sep 12) | | | |
Sep 16 | Model Selection, Cross Validation | ISLR 5.1 (ESL 2.9, 7.10) | |
2 Bias-Variance Tradeoff, Linear Methods
Topics: bias-variance tradeoff; regularized regression (ridge and lasso); non-linearities via basis functions; advanced model selection and analysis
Learning Objectives:
- Decompose prediction error into bias and variance components
- Implement regularized versions of linear regression (ridge, lasso) and understand their impact on bias and variance (see the sketch after this list)
- Implement basis expansions for linear regression and understand their impact on bias and variance
- Apply closed-form model selection techniques to linear methods, and identify the factors in these formulas that affect bias and variance
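
As a companion to the ridge/lasso objective above: expected prediction error decomposes into squared bias, variance, and irreducible noise, and both penalties below accept some bias in exchange for lower variance. A hedged sketch using the glmnet package (assumed installed); the simulated sparse setup is an illustrative assumption, not course code.

```r
# Ridge vs lasso with glmnet (illustrative; simulated sparse data).
library(glmnet)

set.seed(1)
n <- 100; p <- 20
X <- matrix(rnorm(n * p), n, p)
beta <- c(3, -2, rep(0, p - 2))        # only two truly active coefficients
y <- drop(X %*% beta) + rnorm(n)

ridge <- cv.glmnet(X, y, alpha = 0)    # alpha = 0: ridge (L2 penalty)
lasso <- cv.glmnet(X, y, alpha = 1)    # alpha = 1: lasso (L1 penalty)

# Lasso tends to set irrelevant coefficients exactly to zero;
# ridge only shrinks them toward zero.
coef(lasso, s = "lambda.min")
coef(ridge, s = "lambda.min")
```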
Date | Topic | Readings | Deadlines |
---|---|---|---|
Sep 18 | Bias-Variance Tradeoff | ISLR 2.2 (ESL 7.1-7.3) | |
Lab 01 (Sep 19) | | | |
Sep 23 | Ridge Regression | ISLR 6.2.1 (ESL 3.4.0-3.4.1) | HW 1 due |
Sep 25 | Lasso Regression, Optimization | ISLR 6.2.2-6.2.3 (ESL 3.4.2-3.4.3) | |
Lab 02 (Sep 26) | | | |
(no class, Truth and Reconciliation) | | | |
Oct 2 | Basis Functions | ISLR 7.1, 7.4 (ESL 5.1-5.3) | |
Lab 03 (Oct 3) | | | |
Oct 7 | Model Selection for Linear Methods | (ESL 7.6-7.7) | |
3 Nonparametric Methods, Curse of Dimensionality
Topics: kNN; trees; kernel machines; curse of dimensionality
Learning Objectives:
- Analyze how dimensionality affects the performance of parametric vs nonparametric methods
- Implement nonparametric methods (kNN, kernel smoothing, kernel machines) and analyze their properties (a kernel smoothing sketch follows this list)
- Write the parametric version of nonparametric methods (e.g. kernel ridge regression) and vice versa
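
For the kernel smoothing piece of these objectives, here is a minimal Nadaraya-Watson smoother in base R; the Gaussian kernel, the bandwidths, and the simulated data are illustrative choices, not course code.

```r
# Nadaraya-Watson kernel smoother from scratch (illustrative).
gauss_kernel <- function(u) exp(-u^2 / 2)

nw_smooth <- function(x_train, y_train, x_new, h) {
  sapply(x_new, function(x0) {
    w <- gauss_kernel((x_train - x0) / h)  # weights decay with distance
    sum(w * y_train) / sum(w)              # locally weighted average
  })
}

set.seed(1)
x <- runif(200, 0, 10)
y <- sin(x) + rnorm(200, sd = 0.3)
grid <- seq(0, 10, length.out = 100)

plot(x, y, col = "grey")
lines(grid, nw_smooth(x, y, grid, h = 0.3), col = "red")   # small h: low bias, high variance
lines(grid, nw_smooth(x, y, grid, h = 2.0), col = "blue")  # large h: high bias, low variance
```

The bandwidth h plays the same role as k in kNN: it controls how local the fit is, and therefore where the method sits on the bias-variance tradeoff.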
Date | Topic | Readings | Deadlines |
---|---|---|---|
Oct 9 | kNN, Parametric vs Nonparametric | ISLR 3.5 (ESL 2.3.2, 5.4.1) | HW 2 due |
Lab 04 (Oct 10) | | | |
Oct 14 | Approximate kNN, Trees | ISLR 8.1 (ESL 9.2) | |
Oct 16 | Kernel Machines | | |
Lab 05 (Oct 17) | | | |
Oct 21 | Curse of Dimensionality, Review | ISLR 8.1 (ESL 9.2) | |
Midterm Exam
Date | Topic |
---|---|
Oct 23 | MIDTERM EXAM (In Class) |
- In-person attendance is required (per Faculty of Science guidelines)
- You must bring your computer, as the exam will be given through Canvas
- If you do not have your own computer, please arrange to borrow one from the library; let me know ASAP if this may pose a problem
- You may bring 2 sheets of 8.5 × 11 inch paper (front and back) with handwritten notes. No other materials will be allowed.
- There will be no required coding, but I may show code or output and ask questions about it.
- Questions will be entirely multiple choice, true/false, matching, and similar formats
4 Unsupervised Learning, Generative Modelling
Topics: dimension reduction and clustering; generative vs discriminative modelling
Learning Objectives:
- Differentiate between generative and discriminative modelling approaches and identify when each is most appropriate
- Implement dimensionality reduction techniques (PCA, kernel PCA) and analyze their impact on data representation
- Apply clustering algorithms (k-means, Gaussian mixture models) and evaluate their performance using appropriate metrics (see the PCA and k-means sketch after this list)
- Connect unsupervised learning methods to their generative/discriminative modelling framework
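
As one concrete workflow for the dimensionality reduction and clustering objectives, the sketch below runs PCA and then k-means on the built-in iris data; this particular pipeline and its settings are illustrative assumptions, not the course's prescribed approach.

```r
# PCA then k-means on iris (illustrative pipeline; base R only).
X <- scale(iris[, 1:4])                 # standardize the four measurements

pca <- prcomp(X)                        # principal components of the scaled data
summary(pca)                            # proportion of variance per component
scores <- pca$x[, 1:2]                  # project onto the first two PCs

set.seed(1)
km <- kmeans(scores, centers = 3, nstart = 20)  # 20 random restarts

# Compare the unsupervised clusters against the held-out species labels
table(cluster = km$cluster, species = iris$Species)
```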
Date | Topic | Readings | Deadlines |
---|---|---|---|
Oct 28 | Generative vs Discriminative Modelling | ISLR 4.2.0, 12.1 | |
Oct 30 | Dimensionality Reduction | ISLR 12.2 (ESL 14.5.1, 14.5.4) | |
Lab 06 (Oct 31) | | | |
Nov 4 | Clustering 1 | ISLR 12.4.1 (ESL 14.3) | |
Nov 6 | Clustering 2 | | HW 3 due |
Lab 07 (Nov 7) | | | |
5 Ensembles, Black-Box Methods
Topics: ensembles; bootstrap; bagging; boosting; random forests
Learning Objectives:
- Implement bootstrap and ensembling methods, and reason through their computational tradeoffs (see the bagging sketch after this list)
- Differentiate ensemble methods that reduce bias or variance
- Utilize the “hidden advantages” of ensembles, such as feature importance and uncertainty quantification
- Identify assumptions in black-box methods of uncertainty quantification, variance reduction, and bias reduction
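
To make the bootstrap and bagging objectives concrete, here is a sketch that fits one regression tree per bootstrap resample and averages the predictions; the use of rpart and the simulated data are assumptions for illustration, not course code.

```r
# Bagged regression trees via the bootstrap (illustrative; simulated data).
library(rpart)

set.seed(1)
n <- 200
x <- runif(n, 0, 10)
y <- sin(x) + rnorm(n, sd = 0.4)
dat <- data.frame(x = x, y = y)

B <- 100
grid <- data.frame(x = seq(0, 10, length.out = 100))

# One tree per bootstrap resample; columns of `preds` are per-tree predictions
preds <- replicate(B, {
  idx <- sample(n, replace = TRUE)          # bootstrap resample of the rows
  fit <- rpart(y ~ x, data = dat[idx, ])
  predict(fit, newdata = grid)
})
bagged <- rowMeans(preds)   # averaging reduces the variance of a single tree

# The spread across bootstrap fits doubles as a rough uncertainty band,
# one of the "hidden advantages" mentioned above
band <- apply(preds, 1, quantile, probs = c(0.05, 0.95))
```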
Date | Topic | Readings | Deadlines |
---|---|---|---|
(no class, Midterm Break) | | | |
Nov 13 | The Bootstrap | ISLR 5.2 (ESL 7.11, 8.2) | |
Nov 18 | Bagging and Random Forests | ISLR 8.2.0-8.2.2 (ESL 8.7, 15.1-15.3) | |
Nov 20 | Boosting | ISLR 8.2.3 (ESL 10.1-10.5, 10.9) | |
Lab 08 (Nov 21) | | | |
6 Deep Learning
Topics: neural networks; deep learning architectures; generative AI
Learning Objectives:
- Construct a basic neural network architecture from simple mathematical building blocks
- Articulate the effects of depth and width on the representational capacity and generalization of neural networks
- Connect neural networks to other methods covered in the course (basis functions, kernel methods, boosting methods)
- Derive the backpropagation algorithm (a from-scratch sketch follows this list)
- Evaluate modern neural network architectures for different problem types
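
As a companion to the backpropagation objective, here is a from-scratch one-hidden-layer network in base R trained by gradient descent with hand-derived gradients; the architecture, learning rate, and simulated data are illustrative choices, not course code.

```r
# One-hidden-layer network with manual backpropagation (illustrative).
set.seed(1)
n <- 200
x <- matrix(runif(n, -3, 3), ncol = 1)
y <- sin(x) + rnorm(n, sd = 0.1)

H <- 10                                  # hidden units
W1 <- matrix(rnorm(H), nrow = 1); b1 <- rnorm(H)
W2 <- matrix(rnorm(H), ncol = 1); b2 <- rnorm(1)
lr <- 0.01                               # learning rate

for (step in 1:5000) {
  # Forward pass: x -> tanh hidden layer -> linear output
  z <- sweep(x %*% W1, 2, b1, "+")       # n x H pre-activations
  a <- tanh(z)                           # n x H activations
  yhat <- a %*% W2 + b2                  # n x 1 predictions

  # Backward pass for mean squared error, via the chain rule
  d_yhat <- 2 * (yhat - y) / n           # dL/dyhat
  dW2 <- t(a) %*% d_yhat
  db2 <- sum(d_yhat)
  d_a <- d_yhat %*% t(W2)                # back through the output weights
  d_z <- d_a * (1 - a^2)                 # tanh'(z) = 1 - tanh(z)^2
  dW1 <- t(x) %*% d_z
  db1 <- colSums(d_z)

  # Gradient descent updates
  W1 <- W1 - lr * dW1; b1 <- b1 - lr * db1
  W2 <- W2 - lr * dW2; b2 <- b2 - lr * db2
}

mean((yhat - y)^2)                       # final training MSE
```

With tanh activations the hidden layer acts as an adaptive basis expansion, connecting back to the basis functions of Module 2.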
Date | Topic | Readings | Deadlines |
---|---|---|---|
Nov 25 | Introduction to Neural Networks | ISLR 10.1-10.2 (ESL 11.1, 11.3) | |
Nov 27 | Neural Network Optimization, Generalization | ISLR 10.7-10.8 (ESL 11.4) | HW 4 due |
Lab 09 (Nov 28) | | | |
Dec 2 | Neural Net Architectures, Generative AI | | |
Dec 4 | Review | | |
Final Exam
Do not make any plans to leave Vancouver before the final exam date is announced.
- In-person attendance is required (per Faculty of Science guidelines)
- You may bring 2 sheets of 8.5 × 11 inch paper (front and back) with handwritten notes. No other materials will be allowed.