Laboratory of Computational Neuroscience LCN - Brain Mind - I&C - Life Sciences - EPFL
slides for Introduction - Lecture 0: Classification, Supervised Learning, Simple Perceptron
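As a pointer to what "Simple Perceptron" refers to, here is a minimal NumPy sketch of the classic perceptron learning rule; the toy data and function names are illustrative, not taken from the slides.

```python
import numpy as np

def perceptron_train(X, y, epochs=10, lr=1.0):
    """Classic perceptron rule: update weights only on misclassified points.

    X: (n_samples, n_features) inputs; y: labels in {-1, +1}.
    """
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            # Misclassified if the signed prediction disagrees with the label.
            if yi * (xi @ w + b) <= 0:
                w += lr * yi * xi
                b += lr * yi
    return w, b

# Toy linearly separable data (illustrative only).
X = np.array([[1.0, 1.0], [2.0, 0.5], [-1.0, -1.0], [-2.0, -0.5]])
y = np.array([1, 1, -1, -1])
w, b = perceptron_train(X, y)
print(np.sign(X @ w + b))  # -> [ 1.  1. -1. -1.]
```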
slides for DeepLearning - Lecture 1: MultiLayer Networks and BackProp
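For orientation, a compact NumPy sketch of backpropagation in a two-layer network, the topic of this lecture; the architecture, loss, and variable names are my own illustration rather than the lecture's notation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny 2-layer network: x -> tanh hidden layer -> linear output, squared loss.
x = rng.normal(size=(4, 3))          # batch of 4 inputs, 3 features
t = rng.normal(size=(4, 1))          # regression targets
W1 = rng.normal(size=(3, 5)) * 0.1
W2 = rng.normal(size=(5, 1)) * 0.1

# Forward pass.
h = np.tanh(x @ W1)                  # hidden activations
yhat = h @ W2                        # network output
loss = 0.5 * np.mean((yhat - t) ** 2)

# Backward pass: apply the chain rule layer by layer.
dy = (yhat - t) / len(x)             # dLoss/dyhat
dW2 = h.T @ dy                       # gradient for the output weights
dh = dy @ W2.T                       # error backpropagated to the hidden layer
dW1 = x.T @ (dh * (1 - h ** 2))      # tanh'(a) = 1 - tanh(a)^2

# One gradient-descent step.
lr = 0.1
W1 -= lr * dW1
W2 -= lr * dW2
```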
slides for DeepLearning - Lecture 2: Tricks of the Trade
slides for DeepLearning - Lecture 3: Loss Landscape and Optimization Methods for Deep Learning
Part 1 - Questions and aims of this lecture (6 min)
Part 2 - Loss Landscape: minima and saddle points (14 min)
Part 3 - Why are there so many saddle points? (28 min)
Part 4 - Momentum (15 min)
Part 5 - RMSprop and ADAM (21 min)
Part 6 - No Free Lunch theorem (8 min)
Part 7 - Deep networks versus shallow networks (9 min)
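Parts 4 and 5 name the two optimizer families covered here; below is a minimal NumPy sketch of the standard textbook update rules (hyperparameter values and the toy objective are illustrative, not taken from the lecture).

```python
import numpy as np

def sgd_momentum(w, g, v, lr=0.01, beta=0.9):
    """Heavy-ball momentum: accumulate a velocity, then step along it."""
    v = beta * v + g                     # exponentially weighted gradient history
    w = w - lr * v
    return w, v

def adam(w, g, m, s, t, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
    """Adam: momentum on the gradient plus an RMSprop-style
    per-parameter rescaling by the running second moment."""
    m = b1 * m + (1 - b1) * g            # first moment (mean)
    s = b2 * s + (1 - b2) * g ** 2       # second moment (uncentered variance)
    m_hat = m / (1 - b1 ** t)            # bias correction for the zero init
    s_hat = s / (1 - b2 ** t)
    w = w - lr * m_hat / (np.sqrt(s_hat) + eps)
    return w, m, s

# Toy usage: minimize f(w) = (w - 3)^2 with Adam; the gradient is 2*(w - 3).
w, m, s = 0.0, 0.0, 0.0
for t in range(1, 2001):
    g = 2 * (w - 3)
    w, m, s = adam(w, g, m, s, t, lr=0.05)
print(w)  # converges to approximately 3.0
```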
slides for DeepLearning - Lecture 4: Statistical Classification by Deep Networks
Part 1 - The statistical view: generative models (6 min)
Part 2 - The likelihood of data under a model (12 min)
Part 3A - Statistical interpretation of artificial neural networks: the cross-entropy loss function (19 min)
Part 3B - Statistical interpretation of artificial neural networks: can we interpret the output as a probability? (18 min)
Part 4 - Sigmoidal units as natural output functions (9 min)
Part 5 - Multi-class problems (19 min)
Part 6 - Statistical approach: Summary and Quiz (7 min)
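Parts 3A and 5 refer to the cross-entropy loss and its multi-class (softmax) form; the following is a small NumPy sketch of that standard pairing, with illustrative inputs rather than anything from the slides.

```python
import numpy as np

def softmax(a):
    """Numerically stable softmax over the last axis."""
    a = a - a.max(axis=-1, keepdims=True)
    e = np.exp(a)
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(p, y):
    """Mean negative log-likelihood of one-hot targets y under outputs p."""
    return -np.mean(np.sum(y * np.log(p + 1e-12), axis=-1))

# Logits for a batch of 2 examples, 3 classes; one-hot targets.
a = np.array([[2.0, 0.5, -1.0], [0.1, 0.2, 3.0]])
y = np.array([[1, 0, 0], [0, 0, 1]])
p = softmax(a)
print(cross_entropy(p, y))

# The softmax + cross-entropy combination has the simple gradient p - y
# with respect to the logits, one reason it is the natural pairing.
grad_logits = (p - y) / len(a)
```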
slides for DeepLearning - Lecture 5: Convolutional Networks
Part 1 - Inductive bias in machine learning (11 min)
Part 2 - Convolution filters as inductive bias for images (16 min)
Part 3 - MaxPooling as inductive bias for images (20 min)
Part 3* - Quiz (6 min)
Part 4 - The gradient of a convolutional layer (16 min)
Part 5 - Automatic differentiation: BackProp revisited (9 min)
Part 6 - Reducing the number of parameters: outer-product (8 min)
Part 7 - Modern ConvNets and image recognition tasks (17 min)
Part 8 - Applications beyond object recognition (9 min)
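Parts 2 and 3 concern convolution filters and MaxPooling; here is a minimal NumPy sketch of both operations ("valid" cross-correlation, which is what convolutional layers actually compute, and non-overlapping pooling). The image and filter are illustrative, not from the lecture.

```python
import numpy as np

def conv2d(x, k):
    """Valid cross-correlation of a 2-D input with a 2-D kernel
    (what deep-learning layers call 'convolution')."""
    H, W = x.shape
    kh, kw = k.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # The same kernel weights are reused at every position.
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def maxpool2d(x, size=2):
    """Non-overlapping max pooling; keeps the largest activation per patch."""
    H, W = x.shape
    H2, W2 = H // size, W // size
    return x[:H2 * size, :W2 * size].reshape(H2, size, W2, size).max(axis=(1, 3))

img = np.arange(36, dtype=float).reshape(6, 6)
edge = np.array([[1.0, -1.0]])             # horizontal difference filter
print(conv2d(img, edge).shape)             # (6, 5)
print(maxpool2d(conv2d(img, edge)).shape)  # (3, 2)
```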