The Many Faces of Degeneracy in Conic Optimization (Foundations and Trends® in Optimization)

Dmitriy Drusvyatskiy, Henry Wolkowicz

  • Publisher: Now Publishers Inc
  • Publication date: 2017-12-19
  • List price: $2,930
  • VIP price: $2,784 (95% of list)
  • Language: English
  • Pages: 116
  • Binding: Paperback
  • ISBN: 1680833901
  • ISBN-13: 9781680833904

Restocked upon order (approximately 1–2 weeks)

Product Description

Slater’s condition – the existence of a “strictly feasible solution” – is a common assumption in conic optimization. Without strict feasibility, first-order optimality conditions may be meaningless, the dual problem may yield little information about the primal, and small changes in the data may render the problem infeasible. Hence, failure of strict feasibility can negatively impact off-the-shelf numerical methods, primal-dual interior point methods in particular. New optimization modeling techniques and convex relaxations for hard nonconvex problems have shown that the loss of strict feasibility is a more pronounced phenomenon than previously realized.
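As a minimal numerical sketch of how strict feasibility can fail, consider a toy 2×2 semidefinite feasible set (an illustrative example, not taken from the book): all positive semidefinite X with X[0,0] = 0. A zero diagonal entry forces the entire first row and column of a PSD matrix to vanish, so every feasible point is singular and no positive definite (strictly feasible) point exists.

```python
import numpy as np

# Toy SDP feasible set (illustrative, not the book's example):
#   F = { X in S^2 : X is PSD, X[0,0] = 0 }.
# Slater's condition asks for a positive definite X in F, but
# X[0,0] = 0 forces X[0,1] = X[1,0] = 0 for any PSD X, so every
# feasible point has minimum eigenvalue 0: strict feasibility fails.

def is_feasible(X, tol=1e-9):
    """Check membership in F: symmetric, PSD, with X[0,0] = 0."""
    symmetric = np.allclose(X, X.T, atol=tol)
    psd = np.linalg.eigvalsh((X + X.T) / 2).min() >= -tol
    return symmetric and psd and abs(X[0, 0]) <= tol

# The feasible set is exactly { diag(0, t) : t >= 0 }: for
# X = [[0, b], [b, c]], PSD requires det(X) = -b**2 >= 0, so b = 0.
for t in [0.0, 1.0, 5.0]:
    X = np.diag([0.0, t])
    assert is_feasible(X)
    assert abs(np.linalg.eigvalsh(X).min()) < 1e-12  # never positive definite
```

Every point in this family sits on the boundary of the PSD cone, which is precisely the degeneracy the book analyzes.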

The Many Faces of Degeneracy in Conic Optimization describes various reasons for the loss of strict feasibility, whether due to poor modeling choices or (more interestingly) rich underlying structure, and discusses ways to cope with it and, in many pronounced cases, how to use it to advantage. In large part, it emphasizes the facial reduction preprocessing technique, due to its mathematical elegance, geometric transparency, and computational potential.
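The facial reduction idea can be sketched on a toy constraint of the same kind; the matrices Z and V below are illustrative assumptions, not the book's notation. Given the constraint X[0,0] = 0 over the 2×2 PSD cone, the matrix Z = e1·e1ᵀ acts as an exposing vector: Z is PSD and ⟨Z, X⟩ = 0 for every feasible X, so the feasible set lies in the face {X PSD : ZX = 0}. Restricting to that face yields a smaller problem on which strict feasibility holds.

```python
import numpy as np

# One-step facial reduction sketch (illustrative setup, not the
# book's notation) for the constraint X[0,0] = 0 over the 2x2 PSD cone.
# Z = e1 e1^T exposes the face containing the feasible set.
Z = np.array([[1.0, 0.0],
              [0.0, 0.0]])

# Columns of V span the null space of Z; the exposed face is
# { V S V^T : S in the 1x1 PSD cone }, i.e. V s V^T with s >= 0.
eigvals, eigvecs = np.linalg.eigh(Z)
V = eigvecs[:, eigvals < 1e-9]          # here V = +-e2, shape (2, 1)

# In the reduced coordinate s, the problem IS strictly feasible:
# s = 1 maps to a feasible point of the original problem.
s = 1.0
X = s * (V @ V.T)                        # lifts back to the original cone
assert abs(X[0, 0]) < 1e-12              # original constraint holds
assert np.linalg.eigvalsh(X).min() >= -1e-12   # X is PSD
```

The reduced problem lives over a lower-dimensional PSD cone where Slater's condition is restored; in general the procedure may need several such steps, a count the book formalizes as the singularity degree.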

The Many Faces of Degeneracy in Conic Optimization is divided into two parts. Part I presents the necessary theoretical grounding in conic optimization, including basic optimality and duality theory, connections of Slater’s condition to the distance to infeasibility and sensitivity theory, the facial reduction procedure, and the singularity degree. Part II focuses on illustrative examples and applications, including matrix completion problems (semidefinite, low-rank, and Euclidean distance), relaxations of hard combinatorial problems (quadratic assignment and max-cut), and sum of squares relaxations of polynomial optimization problems.