Lecture: Methods (Truncated Newton Method)

Tim Papandreou - Stanford

 

Lecture Description

Methods (Truncated Newton Method), Convergence Versus Iterations, Convergence Versus Cumulative CG Steps, Truncated PCG Newton Method, Truncated Newton Interior-Point Methods, Network Rate Control, Dual Rate Control Problem, Primal-Dual Search Direction (BV Section 11.7), Truncated Newton Primal-Dual Algorithm, Primal And Dual Objective Evolution, Relative Duality Gap Evolution, Relative Duality Gap Evolution (N = 10^6), L_1-Norm Methods For Convex-Cardinality Problems, L_1-Norm Heuristics For Cardinality Problems, Cardinality, General Convex-Cardinality Problems, Solving Convex-Cardinality Problems, Boolean LP As Convex-Cardinality Problem, Sparse Design, Sparse Modeling / Regressor Selection, Estimation With Outliers, Minimum Number Of Violations, Linear Classifier With Fewest Errors, Smallest Set Of Mutually Infeasible Inequalities, Portfolio Investment With Linear And Fixed Costs, Piecewise Constant Fitting, Piecewise Linear Fitting, L_1-Norm Heuristic, Example: Minimum Cardinality Problem, Polishing, Regressor Selection
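The two main threads in this lecture, truncated Newton methods and the L_1-norm heuristic for convex-cardinality problems, are illustrated by the rough sketches below. These are not taken from the lecture slides; the function names, problem sizes, and parameter values are illustrative assumptions.

A truncated (inexact) Newton method approximates the Newton direction by running only a few conjugate-gradient iterations on H d = -g rather than solving the system exactly. A minimal numpy sketch, assuming the Hessian is accessible only through a Hessian-vector product hess_vec:

import numpy as np

def truncated_newton_step(grad, hess_vec, max_cg_iters=10, tol=1e-8):
    """Approximate the Newton direction d solving H d = -grad
    using at most max_cg_iters conjugate-gradient iterations."""
    d = np.zeros_like(grad)
    r = -grad.copy()              # residual of H d = -grad at d = 0
    p = r.copy()
    rs_old = r @ r
    for _ in range(max_cg_iters):
        Hp = hess_vec(p)
        alpha = rs_old / (p @ Hp)
        d += alpha * p
        r -= alpha * Hp
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:   # residual small enough: stop early
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return d

# Tiny usage check on a quadratic f(x) = 0.5 x^T A x - b^T x:
# from x = 0 the gradient is -b, and the exact Newton step is A^{-1} b.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
step = truncated_newton_step(grad=-b, hess_vec=lambda v: A @ v)

The L_1-norm heuristic replaces card(x) with the convex surrogate ||x||_1; "polishing" then fixes the sparsity pattern found by the heuristic and re-solves over the nonzero entries only. A sketch of this idea for a regressor-selection-style problem using CVXPY, with randomly generated data and an illustrative regularization weight:

import numpy as np
import cvxpy as cp

np.random.seed(0)
m, n = 30, 100
A = np.random.randn(m, n)
b = np.random.randn(m)
lam = 0.1                          # illustrative trade-off weight

# L_1-norm heuristic: penalize ||x||_1 as a convex surrogate for card(x).
x = cp.Variable(n)
heuristic = cp.Problem(cp.Minimize(cp.sum_squares(A @ x - b) + lam * cp.norm(x, 1)))
heuristic.solve()

# Polishing: keep the sparsity pattern found by the heuristic, force the
# remaining entries to zero, and re-minimize the fitting error alone.
support = np.abs(x.value) > 1e-4
z = cp.Variable(n)
constraints = [z[i] == 0 for i in np.where(~support)[0]]
polish = cp.Problem(cp.Minimize(cp.sum_squares(A @ z - b)), constraints)
polish.solve()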

Course Description

Continuation of Convex Optimization I.

Topics include: Subgradient, cutting-plane, and ellipsoid methods. Decentralized convex optimization via primal and dual decomposition. Alternating projections. Exploiting problem structure in implementation. Convex relaxations of hard problems, and global optimization via branch & bound. Robust optimization. Selected applications in areas such as control, circuit design, signal processing, and communications.

From the course: Convex Optimization II
