Learning theory
Weekly outline
-
This Master's-level class on learning theory covers the classical PAC framework for learning, stochastic gradient descent (together with recent research papers), and tensor methods and/or graphical model learning.
Teachers: Ruediger Urbanke: ruediger.urbanke@epfl.ch and Nicolas Macris: nicolas.macris@epfl.ch
Teaching Assistant: Farzad Pourkamali: farzad.pourkamali@epfl.ch
Lectures: Mondays 8h15-10h, in presence, Room INM202; Exercises: Tuesdays 17h15-19h, in presence, Room INR219.
We will use this Moodle page to distribute homework and solutions, to collect graded homework, and for the discussion and questions forum. Don't hesitate to use the forum actively.
Lectures are in presence. If you miss a lecture, an old recorded version is accessible here: https://mediaspace.epfl.ch/channel/CS-526+Learning+theory/29761. However, the material or the order of lectures might be slightly different this year.
EXAM: it is open book. You may bring your notes, printed material, and book(s), but no electronic material! Date and location: Monday 19 June, 15h15-18h15, in room INJ 218.
-
If you have a question or want to start a discussion on a topic, post it here.
-
Chapters 3 and 4 in UML
Homework 1: exercises 1, 3, 7, 8 of Chapter 3.
-
Chapter 5 in UML
Homework 2: exercises 1 and 2 of Chapter 4.
-
Chapter 6 in UML
Graded homework 3: due Monday 20 March, 23h59.
-
Chapter 6 continued
Graded homework 3, continued: due Monday 20 March, 23h59.
-
Chapter 7 in UML
Homework 4: exercise 3 of Chapter 6 and exercise 3 of Chapter 7.
-
Chapter 14 in UML: Gradient descent (convexity, Lipschitzness, approach to the optimal solution)
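As a complement (not part of the graded material), here is a minimal NumPy sketch of projected subgradient descent in the convex-Lipschitz-bounded setting of Chapter 14, outputting the averaged iterate. The specific objective, the radius B, and the step size are illustrative choices, not taken from the lecture notes.

```python
import numpy as np

# Minimal sketch: projected (sub)gradient descent on a convex, rho-Lipschitz
# function, returning the averaged iterate, in the spirit of UML Chapter 14.
# The objective, radius B, and step size are illustrative choices.

rng = np.random.default_rng(0)
d = 5
w_star = rng.normal(size=d)
w_star /= np.linalg.norm(w_star)           # keep the minimizer inside the ball

def f(w):                                   # convex and Lipschitz (rho = 1)
    return np.linalg.norm(w - w_star)

def subgrad(w):                             # a subgradient of f at w
    diff = w - w_star
    n = np.linalg.norm(diff)
    return diff / n if n > 0 else np.zeros_like(diff)

B, rho, T = 2.0, 1.0, 5000
eta = B / (rho * np.sqrt(T))                # step size from the standard analysis
w = np.zeros(d)
avg = np.zeros(d)
for t in range(T):
    w = w - eta * subgrad(w)
    nrm = np.linalg.norm(w)
    if nrm > B:                             # projection onto the ball of radius B
        w *= B / nrm
    avg += w / T

print("f(average iterate) =", f(avg))       # should be O(B * rho / sqrt(T))
```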
-
Chapter 14 continued: Stochastic gradient descent, application to learning (see the sketch below)
Mean field approach for two-layer neural networks
Graded homework 6: due Tuesday 18 April, 23h59.
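As an illustration of stochastic gradient descent applied to learning (again not graded material), here is a minimal NumPy sketch of SGD for least-squares regression on synthetic data, sampling one example uniformly at random per step and outputting the averaged iterate. The data model, step size, and iteration count are illustrative assumptions.

```python
import numpy as np

# Minimal sketch: SGD for least-squares regression on synthetic data,
# one uniformly sampled example per step, averaged iterate as output.

rng = np.random.default_rng(1)
n, d = 1000, 10
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=n)

T, eta = 20000, 0.01
w = np.zeros(d)
avg = np.zeros(d)
for t in range(T):
    i = rng.integers(n)                      # sample one example uniformly
    g = (X[i] @ w - y[i]) * X[i]             # stochastic gradient of 0.5*(x.w - y)^2
    w -= eta * g
    avg += w / T

print("||w_avg - w_true|| =", np.linalg.norm(avg - w_true))
print("training MSE:", np.mean((X @ avg - y) ** 2))
```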
-
Easter week break
-
Mean field approach for two-layer neural networks, continued
Graded homework 6, continued (due Tuesday 18 April, 23h59)
-
Tensors 1. Motivations and examples, multi-dimensional arrays, tensor product, tensor rank.
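For concreteness, a small NumPy sketch of tensors as multi-dimensional arrays: a rank-1 tensor as the tensor (outer) product of three vectors, and a rank-at-most-r tensor as a sum of r such terms. The dimensions are arbitrary illustrative choices.

```python
import numpy as np

# Minimal sketch: third-order tensors as multi-dimensional arrays,
# rank-1 tensors as tensor (outer) products, and low-rank sums of them.

rng = np.random.default_rng(2)
a, b, c = rng.normal(size=3), rng.normal(size=4), rng.normal(size=5)

T1 = np.einsum('i,j,k->ijk', a, b, c)       # rank-1 tensor a ⊗ b ⊗ c, shape (3, 4, 5)

# A sum of r rank-1 terms has rank at most r; the tensor (CP) rank is the
# smallest such r.
r = 2
A = rng.normal(size=(3, r))
B = rng.normal(size=(4, r))
C = rng.normal(size=(5, r))
T2 = np.einsum('ir,jr,kr->ijk', A, B, C)    # tensor of rank at most 2

print(T1.shape, T2.shape)
```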
-
Tensors 2. Tensor decompositions and rank, Jennrich's theorem
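A minimal sketch of the simultaneous-diagonalization argument behind Jennrich's theorem, under its standard assumptions (factor matrices A and B of full column rank, no two columns of C parallel). The dimensions and variable names are illustrative, not taken from the lecture notes.

```python
import numpy as np

# Minimal sketch of Jennrich's simultaneous diagonalization: contract the
# tensor with two random vectors, then read off the factors from eigenvectors.

rng = np.random.default_rng(3)
n, r = 6, 3
A = rng.normal(size=(n, r)); B = rng.normal(size=(n, r)); C = rng.normal(size=(n, r))
T = np.einsum('ir,jr,kr->ijk', A, B, C)         # T = sum_r a_r ⊗ b_r ⊗ c_r

x, y = rng.normal(size=n), rng.normal(size=n)
Mx = np.einsum('ijk,k->ij', T, x)               # = A diag(C^T x) B^T
My = np.einsum('ijk,k->ij', T, y)               # = A diag(C^T y) B^T

# The columns of A are eigenvectors of Mx My^+ (nonzero eigenvalues
# (C^T x)_r / (C^T y)_r), up to scaling and permutation.
vals, vecs = np.linalg.eig(Mx @ np.linalg.pinv(My))
order = np.argsort(-np.abs(vals))
A_hat = np.real(vecs[:, order[:r]])

# Check recovery: each estimated column is parallel to some column of A.
cos = np.abs((A / np.linalg.norm(A, axis=0)).T @ (A_hat / np.linalg.norm(A_hat, axis=0)))
print(np.round(cos, 2))                         # close to a permutation matrix
```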
-
Tensors 3. Matricizations and the Alternating Least Squares (ALS) algorithm
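A minimal sketch of the Alternating Least Squares algorithm for a rank-r CP decomposition, written in terms of mode-n matricizations and the Khatri-Rao product. The synthetic noiseless tensor and the fixed iteration count are illustrative choices; ALS is not guaranteed to reach the exact decomposition from every initialization.

```python
import numpy as np

# Minimal sketch: ALS for a rank-r CP decomposition of a third-order tensor.
# Each factor update solves a least-squares problem against a matricization.

rng = np.random.default_rng(4)
I, J, K, r = 8, 9, 10, 3
A0, B0, C0 = rng.normal(size=(I, r)), rng.normal(size=(J, r)), rng.normal(size=(K, r))
T = np.einsum('ir,jr,kr->ijk', A0, B0, C0)      # noiseless rank-3 tensor

def khatri_rao(U, V):                           # column-wise Kronecker product
    return np.einsum('jr,kr->jkr', U, V).reshape(U.shape[0] * V.shape[0], -1)

# Random initialization of the factors.
A = rng.normal(size=(I, r))
B = rng.normal(size=(J, r))
C = rng.normal(size=(K, r))

for it in range(50):
    # Mode-n matricization: mode-n fibers as rows, remaining indices flattened.
    A = np.linalg.lstsq(khatri_rao(B, C), T.reshape(I, -1).T, rcond=None)[0].T
    B = np.linalg.lstsq(khatri_rao(A, C), T.transpose(1, 0, 2).reshape(J, -1).T, rcond=None)[0].T
    C = np.linalg.lstsq(khatri_rao(A, B), T.transpose(2, 0, 1).reshape(K, -1).T, rcond=None)[0].T

err = np.linalg.norm(T - np.einsum('ir,jr,kr->ijk', A, B, C)) / np.linalg.norm(T)
print("relative reconstruction error:", err)
```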
-
Tensors 4. Multilinear rank, Tucker decomposition, higher-order singular value decomposition (HOSVD)
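A minimal sketch of the (truncated) higher-order SVD: each factor U_n consists of the leading left singular vectors of the mode-n matricization, and the Tucker core is obtained by contracting every mode with U_n^T. The tensor and the truncation ranks are arbitrary illustrative choices.

```python
import numpy as np

# Minimal sketch: truncated HOSVD of a third-order tensor.

rng = np.random.default_rng(5)
T = rng.normal(size=(6, 7, 8))
ranks = (3, 4, 5)                               # illustrative truncation ranks

def unfold(X, mode):                            # mode-n matricization
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

# Factor U_n: leading left singular vectors of the mode-n matricization.
U = [np.linalg.svd(unfold(T, n), full_matrices=False)[0][:, :r_n]
     for n, r_n in enumerate(ranks)]

# Core tensor: contract each mode with the corresponding U_n^T.
G = np.einsum('ijk,ia,jb,kc->abc', T, U[0], U[1], U[2])
T_hat = np.einsum('abc,ia,jb,kc->ijk', G, U[0], U[1], U[2])

print("truncated HOSVD relative error:", np.linalg.norm(T - T_hat) / np.linalg.norm(T))
```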
-
Tensors 5. Power method and applications: Gaussian mixture models, topic models of documents
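A minimal sketch of the tensor power method with deflation on a symmetric, orthogonally decomposable tensor, which is the kind of tensor one obtains after whitening in the Gaussian mixture and topic-model applications. The components and eigenvalues below are synthetic.

```python
import numpy as np

# Minimal sketch: tensor power iteration with deflation on a symmetric
# orthogonally decomposable tensor T = sum_r lambda_r v_r ⊗ v_r ⊗ v_r.

rng = np.random.default_rng(6)
d, r = 5, 3
V, _ = np.linalg.qr(rng.normal(size=(d, r)))           # orthonormal components
lam = np.array([3.0, 2.0, 1.0])
T = np.einsum('r,ir,jr,kr->ijk', lam, V, V, V)

def power_iteration(T, n_iter=100):
    u = rng.normal(size=T.shape[0])
    u /= np.linalg.norm(u)
    for _ in range(n_iter):
        u = np.einsum('ijk,j,k->i', T, u, u)           # u <- T(I, u, u)
        u /= np.linalg.norm(u)
    return u, np.einsum('ijk,i,j,k->', T, u, u, u)     # (eigenvector, eigenvalue)

# Deflation: extract one (eigenvector, eigenvalue) pair at a time; the
# eigenvalues 3, 2, 1 should be recovered approximately, in some order.
for _ in range(r):
    u, l = power_iteration(T)
    print("recovered eigenvalue:", round(l, 3))
    T = T - l * np.einsum('i,j,k->ijk', u, u, u)
```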
-
Monday 29 holiday
Tuesday 30: exercise session - Q&A.
Reminder: homework 10 is the fourth graded homework; the deadline is Friday June 2nd. Upload below.
Homework 11 below is an extra homework that reviews the tensor whitening process.
-
Here are old exams with solutions. Note that most, if not all, of the problems have already been covered in this year's material. In the 2019 exam, ignore the problems on graphical models, which we did not treat this year.
-