MVA Course - Introduction to Statistical Learning

2019/2020

***Instructors:
Prof. Nicolas VAYATIS
Teaching assistant: Xavier FONTAINE

***Emails:
<name/-at-/cmla.ens-cachan.fr>

***Syllabus 2019-2020

***Course schedule and location:

Room Condorcet, Building d'Alembert, ENS Paris-Saclay (Cachan campus - 61, avenue du Président Wilson - 94230 Cachan)

Date | Time | Instructor | Session | Topics | Files/Readings
Tuesday October 1 | 11:00-13:00 | N. Vayatis | Lecture #1 | Course introduction and main setup; Chapter 1 - Optimality in statistical learning | Slides
Tuesday October 8 | 11:00-13:00 | N. Vayatis | Lecture #2 | Chapter 1 - Optimality in statistical learning | Slides
Tuesday October 15 | 11:00-13:00 | X. Fontaine | Exercise session #1 | Optimal elements and excess risk bounds | Sheet
Tuesday October 22 | 11:00-13:00 | N. Vayatis | Lecture #3 | Chapter 2 - Probabilistic inequalities, complexity measures | Slides
Tuesday October 29 | 11:00-13:00 | X. Fontaine | Exercise session #2 | Inequalities, Rademacher complexity, VC dimension | Sheet
Tuesday November 5 | 11:00-13:00 | - | Partial exam (mandatory) | No documents allowed | -
Tuesday November 12 | 11:00-13:00 | N. Vayatis | Lecture #4 | Chapter 2 - Regularization, stability | Slides
Tuesday November 19 | 11:00-13:00 | N. Vayatis | Lecture #5 | Chapter 3 - Consistency of Machine Learning methods (I): Boosting | Slides
Tuesday November 26 | 11:00-13:00 | X. Fontaine | Exercise session #3 | Consistency and convergence bounds | Sheet
Tuesday December 3 | 11:00-13:00 | N. Vayatis | Lecture #6 | Chapter 3 - Consistency of Machine Learning methods (II): SVM, Feedforward NN | Slides
Tuesday December 10 | 11:00-13:00 | X. Fontaine | Exercise session #4 | Course wrap-up | Sheet
Wednesday January 8 | 13:30-15:30 | - | Final exam | Documents allowed; Rooms C406 and C415 - Cournot building, 4th floor | -


***Office hours:
Tuesdays, 1pm-2pm - Room Laplace 126

***Evaluation:
Partial exam on November 5 (11:00-13:00) - MANDATORY
Final exam on January 8 (1:30pm-3:30pm) - MANDATORY

***Grading:
Course grade = max(final, (partial + final)/2)
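
As a worked example of this rule (hypothetical scores, not taken from the course page): with partial = 10 and final = 14, the grade is max(14, (10 + 14)/2) = max(14, 12) = 14, so a weak partial mark can never lower the course grade. A minimal Python sketch of the rule:

    def course_grade(partial: float, final: float) -> float:
        # Grading rule from the syllabus: max(final, (partial + final) / 2).
        return max(final, (partial + final) / 2)

    print(course_grade(10.0, 14.0))  # 14.0 - the weaker partial is dropped
    print(course_grade(16.0, 12.0))  # 14.0 - the stronger partial lifts the grade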

 *************

Contents

Chapter 1 - Optimality in statistical learning:
    From information theory to statistical learning
    The probabilistic view on numerical data
    Optimality and the bias-variance dilemma
Chapter 2 - The mathematical foundations of statistical learning:
    Risk minimization
    Concentration inequalities
    Complexity measures
    Regularization
    (Stability)
Chapter 3 - Theory: Consistency theorems and error bounds of learning algorithms
    Part 1 - SVM, Neural Networks, Boosting
    Part 2 - Bagging, Random Forests
(Chapter 4 - Advanced topics)

Some lecture notes

Unofficial (and confidential!) lecture notes available here: Link (password required)
Disclaimer: these notes were contributed by a former student and may contain mistakes.


Past exams

Final exam 2017 / Partial exam 2017 / Final exam 2016 / Partial exam 2016 / Final exam 2015 / Partial exam 2015


References