Convex Analysis and Optimization

Dr. Peter Ochs

Winter Term 2015 / 2016
Lecture (2h) with exercises (2h)
6 credit points

Lecture: Thursday 10-12 c.t., Building E1.3, Lecture Hall 003
First lecture: Thursday, October 22, 2015.

Tutorial: Tuesday 16-18 c.t., Building E1.3, Lecture Hall 003
First tutorial: Tuesday, October 27, 2015.






06/04/2016: Time schedule for second oral exam online.
01/02/2016: Time schedule for oral exam online.
25/01/2016: Exam registration reminder sent to qualified students by e-mail.
14/01/2016: Convergence theorem for the subgradient method clarified on the slides.
08/12/2015: Exercise Sheet T5 updated.
08/12/2015: There is no counterexample for Exercise 3(a) on sheet T5.
08/12/2015: Part 4 of the lecture notes is online.
08/12/2015: The proof of Proposition 6.9 in the notes is more explicit than the version presented in the lecture.
24/11/2015: Date for the second exam corrected.
19/11/2015: Information about the exams is available.
10/11/2015: First part of the lecture notes is available.
29/10/2015: Submission deadline for Exercise Sheet 1 extended to 12/11/2015.
05/10/2015: Website is online.


Many problems in image processing, computer vision, and machine learning can be formulated as convex optimization problems and solved efficiently. The development of fast optimization algorithms relies on the knowledge of convex analysis. In this lecture, the basics of convex analysis are introduced, with an emphasis on the geometric interpretation. Moreover, the connection between theory and applications will be explored in programming exercises from image processing and machine learning.
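
For example, a simple one-dimensional denoising problem can be written as a smooth convex quadratic and minimized by plain gradient descent. The following Matlab sketch is purely illustrative; the model, the parameter values, and all variable names are chosen here for demonstration and are not taken from the course material:

    % Illustrative sketch (not course material): gradient descent for the
    % convex quadratic denoising model
    %   min_u  0.5*||u - f||^2 + 0.5*lambda*||D*u||^2,
    % where f is a noisy 1D signal and D is a forward-difference operator.
    n      = 100;
    f      = linspace(0, 1, n)' + 0.1*randn(n, 1);  % noisy signal
    D      = diff(eye(n));                          % forward differences, (n-1) x n
    lambda = 5;                                     % regularization weight (arbitrary choice)
    L      = 1 + lambda*norm(D'*D);                 % Lipschitz constant of the gradient
    tau    = 1/L;                                   % step size
    u      = f;                                     % initialization
    for k = 1:500
        grad = (u - f) + lambda*(D'*(D*u));         % gradient of the objective
        u    = u - tau*grad;                        % gradient descent step
    end

Choosing the step size as 1/L, where L is the Lipschitz constant of the gradient, guarantees convergence; the reasoning behind this choice is among the topics of the lecture.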

Prerequisites: Basic mathematics (such as Mathematik für Informatiker I-III, or calculus and linear algebra). An understanding of English is necessary.


The tutorials include homework assignments. Homework assignments are handed in and graded. To qualify for the final exam, you need 2/3 of all possible points. Working together in groups of up to 3 people is permitted and highly encouraged.

If you have questions concerning the tutorials, please do not hesitate to contact Peter Ochs.


In order to register for the lecture, write an e-mail to Peter Ochs. The subject line must begin with the tag [CAO15]. Please use the following template for the e-mail:

First name: [myFirstName]
Last name: [myLastName]
Date of birth: [dd.mm.yyyy]
Student ID number: [...]
Course of study: [Bachelor/Master/...]
Subject: [Computer science/Mathematics/...]

Note that the e-mail address from which you send this information will be used to provide you with urgent information concerning the lecture.

This registration is for internal purposes at our chair only and is completely independent of systems like LSF/HISPOS, which require a separate registration.


First exam: February 11, 2016
Second exam: April 14, 2016

  • You can attend both exams.
  • Each exam counts as one try.
  • The second exam can be taken to improve your grade.

  • Exams can be taken in English (default) or German.

Registration:

  • HISPOS (currently only for the first exam; second will be available later)
  • E-mail Peter Ochs stating which exam date you would like to take
    (Deadline for the first exam: January 31, 2016)
    (Deadline for the second exam: April 3, 2016)
    Subject: [CAO15] exams
  • If you prefer to take the exam in German, please mention this in your e-mail; otherwise the exam will be held in English.
  • I will arrange the time slots and let you know.

Time schedule for the second exam (April 14):

Time slot Name
8.30 – 9.00 Janis Kalofolias
9.00 – 9.30 Stalin Varanasi
9.30 – 10.00 Sreenivas Narasimha Murali
10.00 – 10.30 Debjit Paul
10.30 – 11.00 Atanas Poibrenski


Participants of the course can download the lecture materials here after the lecture (access is password-protected). However, be aware that these slides are only provided to support the classroom teaching, not to replace it. Additional organisational information, as well as examples and explanations that may be helpful or necessary to understand the content of the course (and thus relevant for the exam), will be provided in the lectures. It is solely your responsibility, not ours, to make sure that you receive this information.

The topics given here are preliminary and might change.

Date  Title  Assignments
22/10 Introduction and basic concepts of complexity
29/10 Terminology of variational analysis [T01][P01]
05/11 Lipschitz continuity and the gradient descent method
12/11 Convex sets and the Projection Theorem [T02][P02]
19/11 Convex functions [T03][P03]
26/11 Moreau envelope and Proximal Point Algorithm [T04][P04]
03/12 The subdifferential and Fermat's rule [T05]
10/12 Lower complexity bounds [T06]
17/12 The gradient method for smooth optimization
07/01 Optimal gradient methods for smooth optimization [T07]
14/01 Nonsmooth convex optimization [T08][P08]
21/01 Geometric introduction to convex duality [T09]
28/01 Convex duality II
04/02 Summary and outlook


Lecture Notes
Notes 01
Notes 02
Notes 03
Notes 04
Notes 05
Notes 06
Notes 07
Notes 08


There is no specific book that covers the complete content of this class.

  • R. T. Rockafellar: Convex Analysis. Princeton University Press, 1970.
  • Y. Nesterov: Introductory Lectures on Convex Optimization: A Basic Course. Kluwer Academic Publishers, 2004.
  • D. P. Bertsekas, A. Nedić, and A. E. Ozdaglar: Convex Analysis and Optimization. Athena Scientific, 2003.
  • S. Boyd and L. Vandenberghe: Convex Optimization. Cambridge University Press, 2004.
  • H. H. Bauschke and P. L. Combettes: Convex Analysis and Monotone Operator Theory in Hilbert Spaces. Springer, 2011.

Further references will be given during the lecture.


For the programming exercise, we use Matlab.

  • Matlab is accessible via our campus license. Details on how to use it can be found here.
  • Access from outside should be possible via ssh:
    ssh -X username@computername.studcs.uni-sb.de
  • Material for Matlab:


