2020 Faculty Courses School of Computing Department of Mathematical and Computing Science Graduate major in Mathematical and Computing Science
Mathematical Optimization: Theory and Algorithms
- Academic unit or major
- Graduate major in Mathematical and Computing Science
- Instructor(s)
- Mituhiro Fukuda / Makoto Yamashita
- Class Format
- Lecture (Zoom)
- Media-enhanced courses
- -
- Day of week/Period (Classrooms)
- 3-4 Mon (W834) / 3-4 Thu (W834)
- Class
- -
- Course Code
- MCS.T402
- Number of credits
- 2
- Course offered
- 2020
- Offered quarter
- 2Q
- Syllabus updated
- Jul 10, 2025
- Language
- English
Syllabus
Course overview and goals
This course covers the basic notions needed to understand gradient-based methods for convex optimization problems arising in mathematical optimization, machine learning, and image processing. It starts from the basics, beginning with the definition of convex sets, and gradually focuses on continuously differentiable convex functions. The lectures also cover the characterization of solutions of optimization problems (optimality conditions) and numerical methods for general problems such as the steepest descent method, the Newton method, conjugate gradient methods, and quasi-Newton methods. In the latter part, Nesterov's accelerated gradient method for convex functions with Lipschitz continuous gradients is presented in detail.
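As a concrete illustration of the simplest of these gradient-based methods, the following is a minimal sketch (not taken from the course materials) of the steepest descent method with the classical 1/L step size on a convex quadratic; the objective, matrix, and step-size choice are illustrative assumptions:

```python
import numpy as np

# Illustrative sketch: steepest descent with a fixed step size 1/L for
# f(x) = 0.5 x^T A x - b^T x, whose gradient grad f(x) = A x - b is
# Lipschitz continuous with constant L = lambda_max(A).
def steepest_descent(A, b, x0, num_iters=1000):
    L = np.linalg.eigvalsh(A).max()   # Lipschitz constant of the gradient
    x = x0.astype(float)
    for _ in range(num_iters):
        grad = A @ x - b              # gradient of the quadratic objective
        x = x - grad / L              # classical 1/L step
    return x

A = np.array([[3.0, 1.0], [1.0, 2.0]])   # symmetric positive definite
b = np.array([1.0, 1.0])
x_star = steepest_descent(A, b, np.zeros(2))
# x_star approximates the unique minimizer, the solution of A x = b
```

Since A is positive definite here, each coordinate mode of the error contracts by a factor |1 - lambda/L| per iteration, so the iterates converge linearly to the minimizer.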
Course description and aims
Objectives: Learn, from the basics, the mathematical concepts and notions necessary for numerical methods for convex optimization problems. Definitions and proofs of theorems will be carefully explained. The objective is to understand the role that the basic theorems of convex optimization play in scientific articles, and to be prepared to apply them to other problems in mathematical optimization and machine learning.
Theme: In the first part, important theorems for analyzing convex optimization problems will be introduced. In the second part, Nesterov's accelerated gradient method, which has received a lot of attention in recent years, will be explained from a mathematical point of view.
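To give a feel for the second part's theme, here is a hedged sketch of an accelerated gradient method in the style of Nesterov, using a common momentum-parameter recursion; the quadratic objective and parameter choices are assumptions for illustration, not the course's own formulation:

```python
import numpy as np

# Illustrative sketch: accelerated gradient method with the momentum
# recursion t_{k+1} = (1 + sqrt(1 + 4 t_k^2)) / 2, applied to the L-smooth
# convex quadratic f(x) = 0.5 x^T A x - b^T x. For general L-smooth convex
# functions this scheme attains an O(1/k^2) rate in function value, versus
# O(1/k) for plain gradient descent.
def accelerated_gradient(A, b, x0, num_iters=500):
    L = np.linalg.eigvalsh(A).max()   # Lipschitz constant of the gradient
    x = y = x0.astype(float)
    t = 1.0
    for _ in range(num_iters):
        x_next = y - (A @ y - b) / L                       # gradient step at y
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0  # momentum parameter
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)   # extrapolation step
        x, t = x_next, t_next
    return x

A = np.array([[3.0, 1.0], [1.0, 2.0]])   # symmetric positive definite
b = np.array([1.0, 1.0])
x_acc = accelerated_gradient(A, b, np.zeros(2))
# x_acc approximates the solution of A x = b
```

The gradient step is taken at the extrapolated point y rather than at x; this "looking ahead" is what distinguishes the accelerated method from steepest descent.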
Keywords
Convex function, algorithm analysis, convex optimization problem, numerical methods in optimization, differentiable convex functions with Lipschitz continuous gradients, accelerated gradient method
Competencies
- Specialist skills
- Intercultural skills
- Communication skills
- Critical thinking skills
- Practical and/or problem-solving skills
Class flow
Every lecture includes proofs of theorems and explanations of the concepts behind each method or definition.
Course schedule/Objectives
Class | Course schedule | Objectives
---|---|---
Class 1 | Convex sets and related results | Grading criteria will be explained
Class 2 | Lipschitz continuous differentiable functions |
Class 3 | Optimality conditions for differentiable functions |
Class 4 | Minimization algorithms for unconstrained optimization problems; steepest descent method |
Class 5 | Newton method, conjugate gradient methods, quasi-Newton methods |
Class 6 | Convex differentiable functions |
Class 7 | General assignment to check comprehension |
Class 8 | Differentiable convex functions with Lipschitz continuous gradients |
Class 9 | Worst-case analysis for gradient-based methods |
Class 10 | Steepest descent method for differentiable convex functions |
Class 11 | Estimate sequences in accelerated gradient methods for differentiable convex functions |
Class 12 | Accelerated gradient method for differentiable convex functions |
Class 13 | Accelerated gradient methods for min-max problems |
Class 14 | Extensions of the accelerated gradient methods |
Study advice (preparation and review)
To enhance effective learning, students are encouraged to spend approximately 100 minutes preparing for each class and another 100 minutes reviewing the class content afterwards (including assignments), referring to textbooks and other course materials.
Textbook(s)
None.
Reference books, course materials, etc.
D. P. Bertsekas, Nonlinear Programming, 2nd edition, (Athena Scientific, Belmont, Massachusetts, 2003).
Y. Nesterov, Introductory Lectures on Convex Optimization: A Basic Course, (Kluwer Academic Publishers, Boston, 2004).
Y. Nesterov, Lectures on Convex Optimization, 2nd edition, (Springer, Cham, Switzerland, 2018).
J. Nocedal and S. J. Wright, Numerical Optimization, 2nd edition, (Springer, New York, 2006).
Evaluation methods and criteria
Students should understand the basic theorems related to convex sets and convex functions, as well as the basic numerical methods for solving mathematical optimization problems. Grades will be based on mid-term and final exams, or on reports assigned during the course.
Related courses
- MCS.T506 : Mathematical Models and Computer Science
- IEE.A430 : Numerical Optimization
Prerequisites
Attendees must have basic knowledge of mathematics, such as linear algebra and calculus, and should understand basic proof techniques.