MVA Course - Introduction to Statistical Learning

2024/2025

***Instructors:
Prof. Nicolas VAYATIS
TA: Gaëtan SERRE

***Emails:
<firstname.name/-at-/ens-paris-saclay.fr>

***Course schedule and location:


Date | Time | Room number | Instructor | Session | Topics | Material
Tuesday October 1 | 08:30-10:30 | 1Z77 | N. Vayatis | Lecture #1 | Chapter 1 - Optimality in binary classification: Data / Objectives / Optimal elements / ERM | Slides, LDA
Tuesday October 8 | 08:30-10:30 | 1Z77 | N. Vayatis | Lecture #2 | Chapter 1 - Optimality in statistical learning: Other problems / Complexity of learning | Slides
Tuesday October 15 | 08:30-10:30 | 1Z77 | G. Serré | Exercise session #1 | Optimal elements | Set
Tuesday October 22 | 08:30-10:30 | 1Z77 | N. Vayatis | Lecture #3 | Chapter 2 - Mathematical foundations (I): Probabilistic inequalities, complexity measures | Slides
Tuesday October 29 | 08:30-10:30 | 1Z77 | G. Serré | Exercise session #2 | Inequalities, Rademacher complexity, VC dimension | Set
Tuesday November 5 | 08:30-10:30 | 1Z77 | | Partial exam - Mandatory | No documents | 
Tuesday November 12 | 08:30-10:30 | 1Z77 | N. Vayatis | Lecture #4 | Chapter 2 - Mathematical foundations (II): Regularization and stability | Slides, Link
Tuesday November 19 | 08:30-10:30 | 1Z77 | N. Vayatis | Lecture #5 | Chapter 3 - Consistency of Machine Learning methods (I): Margin bounds and application to SVM | Slides
Tuesday November 26 | 08:30-10:30 | 1Z77 | G. Serré | Lecture #6 | Chapter 3 - Consistency of Machine Learning methods (II): Ensemble methods (Bagging, Random Forests, Boosting) | Slides
Tuesday December 3 | 08:30-10:30 | 1Z77 | N. Vayatis | Exercise session #3 | Consistency and convergence bounds | Set
Tuesday December 10 | 08:30-10:30 | 1Z77 | N. Vayatis | Lecture #7 | Chapter 3 - Consistency of Machine Learning methods (III): Neural networks, Mirror Descent | Slides
Tuesday December 17 | 08:30-10:30 | 1Z77 | G. Serré | Exercise session #4 | Course wrap-up | Set
Tuesday January 7 | 08:30-10:30 | 1Z77 | | Final exam - Mandatory | No documents | 


***Office hours (on demand):
Tuesdays, 10:30am-11:30am


***Evaluation:
Partial exam on November 5 (08:30am-10:30am) - MANDATORY - No documents
Final exam on January 7 (08:30am-10:30am) - MANDATORY - No documents

***Grading:
Course grade = max(final, (partial + final) / 2)
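
For illustration (assuming the usual 0-20 grading scale, which is not stated above): a student with partial = 8 and final = 14 gets max(14, (8 + 14)/2) = max(14, 11) = 14. Since the course grade is never below the final exam grade, the partial exam can only improve the overall grade.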

 *************

Contents

Chapter 1 - Optimality in statistical learning:
    From information theory to statistical learning
    The probabilistic view on numerical data
    Optimality and the bias-variance dilemma
Chapter 2 - The mathematical foundations of statistical learning:
    Risk minimization
    Concentration inequalities
    Complexity measures
    Regularization and Stability
Chapter 3 - Theory: Consistency theorems and error bounds of learning algorithms

    Part 1 - SVM
    Part 2 - Ensemble methods: Bagging, Random Forests, Boosting
    Part 3 - Neural Networks, Mirror Descent

Past exams


Final exam 2020 / Partial exam 2020 / Final exam 2019 / Partial exam 2019 / Final exam 2018 / Partial exam 2018 / Final exam 2017 / Partial exam 2017


References