The aim of this series of lectures is to provide the basic background for dealing with optimization problems in deterministic and stochastic environments. More specifically, we address the main features of smooth optimization algorithms with and without constraints: in addition to the theoretical material, we describe deterministic and stochastic gradient algorithms, Newton-type algorithms, and least-squares algorithms. This part is complemented by an introduction to nonsmooth optimization algorithms (subgradient and proximal algorithms). All these optimization algorithms will be implemented during practice classes, with applications to image processing. The second part of the lectures is devoted to statistical issues related to machine learning.
To be able to choose and implement a suitable algorithm for solving a given optimization problem, especially in the context of machine learning.
- Optimality conditions
- Algorithms for differentiable optimization without constraints
- Gradient algorithms
- Stochastic gradient algorithm
- Newton-type algorithms
- Least-squares problems
- First algorithms for nondifferentiable optimization
- LASSO and proximal algorithms
- Introduction to Statistical Learning: Ridge Regression, Lasso, Support Vector Machines
- Imaging applications: image registration, compressive sampling, dictionary learning
- Independent project
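As an illustration of the first topic block, a minimal sketch of fixed-step gradient descent is given below (function names, the quadratic test problem, and the step size are illustrative choices, not part of the course material):

```python
import numpy as np

def gradient_descent(grad, x0, step=0.1, n_iters=100):
    """Fixed-step gradient descent: x_{k+1} = x_k - step * grad(x_k)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        x = x - step * grad(x)
    return x

# Example: minimize f(x) = 0.5 * ||x - c||^2, whose gradient is x - c.
c = np.array([1.0, -2.0])
x_star = gradient_descent(lambda x: x - c, x0=np.zeros(2), step=0.5, n_iters=50)
```

On this strongly convex quadratic the iterates contract toward the minimizer `c` at a fixed geometric rate, which is the behavior studied in the convergence analysis of the lectures.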
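The stochastic gradient topic can be sketched on a least-squares objective, updating with the gradient of one randomly chosen term at a time (the function, step size, and toy data are illustrative assumptions):

```python
import numpy as np

def sgd_least_squares(A, b, step=0.01, n_epochs=100, seed=0):
    """Stochastic gradient for min_x (1/2n) ||A x - b||^2, one row per update."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(n_epochs):
        for i in rng.permutation(n):  # one pass over shuffled rows per epoch
            x -= step * (A[i] @ x - b[i]) * A[i]  # gradient of the i-th term
    return x

# Consistent toy problem: b = A @ x_true, so the method should recover x_true.
rng = np.random.default_rng(1)
A = rng.standard_normal((50, 3))
x_true = np.array([1.0, -2.0, 0.5])
x_hat = sgd_least_squares(A, A @ x_true)
```

Because each update only touches one row of `A`, the per-iteration cost is independent of the number of samples, which is the motivation for stochastic gradient methods in machine learning.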
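For the nonsmooth part, a minimal sketch of the proximal gradient (ISTA) iteration for the LASSO is shown below; the soft-thresholding operator is the proximal map of the l1 norm (the function names and default step choice are assumptions for illustration):

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, step=None, n_iters=500):
    """Proximal gradient (ISTA) for min_x 0.5 * ||A x - b||^2 + lam * ||x||_1."""
    if step is None:
        step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L with L = ||A||_2^2
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        x = soft_threshold(x - step * A.T @ (A @ x - b), step * lam)
    return x

# With A = I the LASSO solution is soft-thresholding of b at level lam.
x_hat = ista(np.eye(3), np.array([3.0, 0.5, -2.0]), lam=1.0)
```

Each iteration alternates a gradient step on the smooth least-squares term with the proximal step on the l1 penalty, which is what produces sparse solutions.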
Differential calculus and basic statistics.