Lecture: Optimization Methods for Machine Learning
News

The lecture (LSF) will be taught in English and addresses Master and PhD students in Mathematics or related fields. We will discuss the format in our first meeting. Slides and videos of the lectures are available.

Information
  • Lecture with 4+2 SWS and 9 ECTS credits
  • Lecturer: Prof. Dr. Sebastian Sager; accompanying practical exercises
Content

The lecture will survey optimization problems in machine learning and the methods used to solve them. Table of contents:

  • Introduction to Machine Learning
  • Machine Learning Case Studies: text classification and perceptual tasks
  • Problem Definitions and Methods Overview
  • Efficient Calculation of Derivatives
  • Problem reformulations
  • Stochastic Gradient Methods
  • Noise Reduction Methods
  • Second-Order Methods
  • Other Popular Methods
  • Ethical, Philosophical, and Economic Aspects of Artificial Intelligence

Practical exercises will complement the lecture and focus on the Python libraries Keras and scikit-learn.
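To give a flavor of the hands-on part, the following is a minimal, self-contained sketch of a text-classification pipeline in scikit-learn (one of the case studies listed above). It is for illustration only and not part of the graded exercises; the toy corpus and labels are invented.

  # Minimal sketch: text classification with scikit-learn.
  # The toy corpus and labels below are invented for illustration.
  from sklearn.feature_extraction.text import TfidfVectorizer
  from sklearn.linear_model import SGDClassifier
  from sklearn.pipeline import make_pipeline

  texts = [
      "gradient descent converges under convexity assumptions",
      "line search and trust region methods",
      "neural networks classify images and text",
      "training data and validation accuracy",
  ]
  labels = ["optimization", "optimization", "machine learning", "machine learning"]

  # TF-IDF features plus a linear classifier trained by stochastic gradient descent
  model = make_pipeline(TfidfVectorizer(), SGDClassifier(random_state=0))
  model.fit(texts, labels)
  print(model.predict(["stochastic gradient methods for training neural networks"]))

The same pipeline structure carries over to the larger text corpora used in the case studies.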

Requirements

Mathematical basics (Analysis and Linear Algebra), programming skills, and the lecture Introduction to Optimization. The lecture Nonlinear Optimization is highly recommended, but not absolutely necessary.

Module description
The lecture is a master lecture in the mathematics curriculum and is described in the module handbook (currently page 31) as a Wahlpflicht (compulsory elective) module:
  • WPF MA (Module 12, 13, 14)
  • WPF MA;M 1-3 (Module M3D)

A translation of the module description:
  • Goals and competences: The students acquire competences with respect to modeling and algorithmically solving optimization problems that are at the basis of modern machine learning techniques. A rigorous mathematical analysis of convergence theory and implementation aspects of different algorithms is the guiding theme of the lecture. In the exercises the students learn how to implement algorithms efficiently on a computer and to apply them to concrete problem instances.
  • Content: An introduction to mathematically formulating machine learning problems in a generalized way, calculating derivatives, stochastic and deterministic derivative-based algorithms, convergence theory. See above for a table of contents.
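As an illustration of the stochastic derivative-based algorithms mentioned in the module description, here is a minimal sketch of stochastic gradient descent applied to a linear least-squares problem. It is for illustration only and not course material; the data, step size, and iteration count are arbitrary choices.

  # Minimal sketch: stochastic gradient descent (SGD) for the least-squares
  # problem min_w (1/(2n)) * ||X w - y||^2. Data and hyperparameters are arbitrary.
  import numpy as np

  rng = np.random.default_rng(0)
  n, d = 200, 5
  X = rng.standard_normal((n, d))
  w_true = rng.standard_normal(d)
  y = X @ w_true + 0.01 * rng.standard_normal(n)

  w = np.zeros(d)
  step = 0.05                               # constant step size (learning rate)
  for k in range(2000):
      i = rng.integers(n)                   # draw one data point uniformly at random
      grad_i = (X[i] @ w - y[i]) * X[i]     # gradient of the i-th summand
      w -= step * grad_i

  print("distance to true weights:", np.linalg.norm(w - w_true))

The convergence behavior of exactly this kind of iteration, and of its noise-reduced and second-order variants, is one of the guiding themes of the lecture.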

The lecture is also open to other master and PhD students of OVGU. In particular, there is an agreement that ORBA students may choose the lecture as a Wahlpflicht module (with 10 CP to motivate the independent study of the mathematical foundations necessary to follow the lecture). However, please note that the lecture is addressed to mathematics master students and assumes a good understanding of mathematical basics, especially in the second part of the lecture. If you are mainly interested in applying machine learning and not so much in analyzing the training process, other lectures might be better suited for you. Note that the lecture Concepts and Algorithms of Optimization is not sufficient as a requirement; you will have to invest more time to acquire additional mathematical knowledge.

Material: mathematical background
Material: machine learning
Material: optimization and machine learning
Material: AI and the future of mankind
Material: hands on
Questions?

Feel free to send me an email with general questions:

Prof. Dr. rer.nat. habil. Sebastian Sager
Head of MathOpt group
at the Institute of Mathematical Optimization
at the Faculty of Mathematics
at the Otto von Guericke University Magdeburg

Universitätsplatz 2, G02-224
39106 Magdeburg, Germany

Tel.: +49 391 67 58745
Fax: +49 391 67 11171

Susanne Heß

Universitätsplatz 2, G02-206
39106 Magdeburg, Germany

Tel.: +49 391 67-58756
Fax: +49 391 67-11171
