Lecture: Recap: Conjugate Gradient Method

Stephen Boyd - Stanford



Lecture Description

Recap: Conjugate Gradient Method, Recap: Krylov Subspace, Spectral Analysis Of Krylov Sequence, A Bound On Convergence Rate, Convergence, Residual Convergence, CG Algorithm, Efficient Matrix-Vector Multiply, Shifting, Preconditioned Conjugate Gradient Algorithm, Choice Of Preconditioner, CG Summary, Truncated Newton Method, Approximate Or Inexact Newton Methods, CG Initialization, Hessian And Gradient, Methods, Convergence Versus Iterations, Convergence Versus Cumulative CG Steps, Truncated PCG Newton Method, Extensions
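Since the lecture's own pseudocode is not reproduced on this page, here is a minimal NumPy sketch of the preconditioned conjugate gradient iteration the lecture covers. The function name cg, the stopping rule, and the Jacobi preconditioner in the demo are illustrative assumptions, not the lecture's notation. Accepting a callable for A reflects the "Efficient Matrix-Vector Multiply" topic above: CG only ever needs the product v -> Av, never the matrix A itself.

```python
import numpy as np

def cg(A, b, x0=None, tol=1e-8, max_iter=None, M_inv=None):
    """Conjugate gradient for Ax = b with A symmetric positive definite.

    A may be a matrix or a callable computing the product v -> Av.
    M_inv, if given, is a callable applying a preconditioner M^{-1};
    with M_inv=None this reduces to plain CG.
    """
    n = b.shape[0]
    matvec = A if callable(A) else (lambda v: A @ v)
    precond = M_inv if M_inv is not None else (lambda v: v)
    x = np.zeros(n) if x0 is None else x0.astype(float).copy()
    if max_iter is None:
        max_iter = n
    r = b - matvec(x)                 # residual b - Ax
    z = precond(r)                    # preconditioned residual
    p = z.copy()                      # search direction
    rz = r @ z
    for _ in range(max_iter):
        Ap = matvec(p)
        alpha = rz / (p @ Ap)         # exact line search step
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            break                     # relative residual small enough
        z = precond(r)
        rz_new = r @ z
        beta = rz_new / rz            # makes new direction A-conjugate
        p = z + beta * p
        rz = rz_new
    return x

# Quick check on a random SPD system with a diagonal (Jacobi) preconditioner.
rng = np.random.default_rng(0)
G = rng.standard_normal((50, 50))
A = G @ G.T + 50 * np.eye(50)         # symmetric positive definite
b = rng.standard_normal(50)
x = cg(A, b, M_inv=lambda v: v / np.diag(A))
print(np.linalg.norm(A @ x - b))      # residual norm, should be tiny
```

In the truncated (inexact) Newton methods listed above, the same loop is applied to the Newton system H dx = -g, with a loose tolerance and a Hessian-vector product supplied in place of A.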

Course Description

Continuation of Convex Optimization I.

Topics include: Subgradient, cutting-plane, and ellipsoid methods. Decentralized convex optimization via primal and dual decomposition. Alternating projections. Exploiting problem structure in implementation. Convex relaxations of hard problems, and global optimization via branch & bound. Robust optimization. Selected applications in areas such as control, circuit design, signal processing, and communications.

From the course: Convex Optimization II
