During the past decade, Machine Learning, the science of getting computers to act without being explicitly programmed, has made huge progress in methods, models, and practices. Machine Learning has given us practical speech recognition, effective web search, Netflix and Amazon recommendations, and much more; it is so pervasive today that most people use it many times a day without realizing it. In this course you will learn the most widely used Machine Learning techniques. Topics include: gradient descent, logistic regression, support vector machines, and unsupervised learning (clustering, feature engineering).
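As a quick illustration of the first topic in the list, here is a minimal gradient-descent sketch in Python/NumPy on a one-parameter least-squares problem. The toy data, learning rate, and function names are illustrative assumptions for this page, not material taken from the course labs.

```python
import numpy as np

# Toy data: y is roughly 3*x plus noise (illustrative only, not from the course labs).
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=100)
y = 3.0 * x + 0.1 * rng.normal(size=100)

def loss(w):
    """Mean squared error of the one-parameter model y_hat = w * x."""
    return np.mean((w * x - y) ** 2)

def grad(w):
    """Derivative of the mean squared error with respect to w."""
    return np.mean(2 * (w * x - y) * x)

# Plain gradient descent: repeatedly step against the gradient.
w, lr = 0.0, 0.1
for step in range(200):
    w -= lr * grad(w)

print(f"estimated slope: {w:.3f}, final loss: {loss(w):.4f}")
```

The estimated slope converges to roughly 3, the value used to generate the toy data.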
Lecture Slides and Labs (2019)
- Introduction (slides)
- Gradient Descent (slides, lab)
- Logistic Regression (slides, lab)
- Support Vector Machines (slides, lab)
- Unsupervised learning (slides, lab)
- Neural Networks (slides, lab)
Lecture Slides (previous editions)
Resources
Videos from 3Blue1Brown
- Chapter 1. But what is a Neural Network?
- Chapter 2. Gradient descent, how neural networks learn.
- Chapter 3. What is backpropagation really doing?
- Chapter 4. Backpropagation calculus.
Martin Görner (Google)
Books (theory)
- Shai Shalev-Shwartz and Shai Ben-David, Understanding Machine Learning: From Theory to Algorithms, Cambridge University Press.
- Trevor Hastie, Robert Tibshirani, and Jerome Friedman, The Elements of Statistical Learning: Data Mining, Inference, and Prediction, Springer.
- Gareth James, Daniela Witten, Trevor Hastie, and Robert Tibshirani, An Introduction to Statistical Learning, Springer.
- Ian Goodfellow, Yoshua Bengio, and Aaron Courville, Deep Learning, MIT Press.
Specialized Books
- Richard S. Sutton and Andrew G. Barto, Reinforcement Learning: An Introduction, MIT Press.