This book is an advanced graduate-level text on the mathematical theory of optimization. The author primarily considers the minimization of smooth functions on **R**^{n} subject to a finite number of inequality constraints defined by smooth constraint functions. In a few places the author goes beyond this setting to consider optimization problems posed on Banach spaces or involving infinitely many constraints. The main results of the theory concern necessary and sufficient conditions for a solution to be optimal.

The theory of optimization depends critically on background in analysis. The author begins the book with a chapter reviewing Gâteaux and Fréchet differentiability and Taylor’s theorem. There is also a chapter on the variational principles of Ekeland and Borwein-Preiss and the application of these principles to proving theorems in analysis. There are several chapters on convexity, including basic concepts, separation theorems, and the theory of polyhedra and convex cones.

Building on this mathematical foundation, the author develops necessary and sufficient conditions for the optimality of various classes of optimization problems. Optimality conditions for unconstrained problems are the simplest case, and the author discusses these immediately after the background material on differentiability. After introducing variational principles and convexity, the author gives optimality conditions for linear programming problems. For general nonlinear programming problems, the author presents the Fritz John and Karush-Kuhn-Tucker first order necessary conditions as well as second order necessary and sufficient conditions. An important technical hypothesis called a constraint qualification is required for the Karush-Kuhn-Tucker conditions; the book discusses the Mangasarian-Fromovitz and Slater constraint qualifications. A separate chapter treats optimality conditions for semi-infinite programming problems (problems with an infinite number of constraints).
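For a problem of minimizing f(x) subject to inequality constraints g_i(x) ≤ 0, i = 1, …, m, the Karush-Kuhn-Tucker conditions take the following standard first order form (the notation here is generic and not necessarily the book's):

```latex
\nabla f(x^{*}) + \sum_{i=1}^{m} \lambda_i \nabla g_i(x^{*}) = 0, \qquad
g_i(x^{*}) \le 0, \qquad \lambda_i \ge 0, \qquad \lambda_i \, g_i(x^{*}) = 0,
\quad i = 1, \dots, m.
```

A constraint qualification such as Mangasarian-Fromovitz ensures that multipliers λ_i satisfying these conditions exist at a local minimizer.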

Duality is an important topic in the theory of optimization. The author has chosen to present duality theory in terms of the Lagrangian dual function rather than using Fenchel duality. The author proves the standard results in duality theory. The weak duality theorem asserts that the optimal value of the dual is a lower bound on the optimal value of the primal problem. For convex optimization problems that satisfy Slater’s condition, the strong duality theorem asserts that the primal and dual optimal values are equal. Examples of duality in linear programming, nonlinear programming, and conic optimization are presented.
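As a reminder of how the Lagrangian dual works in this setting (again in standard notation, not necessarily the book's), for the primal problem of minimizing f(x) subject to g_i(x) ≤ 0 the dual function is

```latex
q(\lambda) = \inf_{x} \Bigl( f(x) + \sum_{i=1}^{m} \lambda_i \, g_i(x) \Bigr),
\qquad \lambda \ge 0.
```

Weak duality follows immediately: for any feasible point \bar{x} and any λ ≥ 0, each term λ_i g_i(\bar{x}) is nonpositive, so q(λ) ≤ f(\bar{x}), and hence the dual optimal value is a lower bound on the primal optimal value.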

The author concludes the book with a very brief chapter on algorithms, including gradient projection methods, Newton’s method, and the conjugate gradient method. The focus of this chapter is on the theory of convergence and convergence rates for these methods rather than the practical application of the methods.
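As a small illustration of the gradient projection idea treated in that chapter, here is a minimal sketch in Python. The objective, step size, and box constraint are illustrative choices and are not taken from the book:

```python
# Minimal sketch of projected gradient descent for minimizing a smooth
# convex function of one variable over an interval constraint. The
# function, step size, and bounds below are illustrative assumptions.

def project(x, lo, hi):
    """Project x onto the interval [lo, hi]."""
    return max(lo, min(hi, x))

def projected_gradient(grad, x0, lo, hi, step=0.1, tol=1e-8, max_iter=1000):
    """Iterate x_{k+1} = P(x_k - step * grad(x_k)) until the step is tiny."""
    x = project(x0, lo, hi)
    for _ in range(max_iter):
        x_new = project(x - step * grad(x), lo, hi)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Example: minimize f(x) = (x - 3)^2 subject to 0 <= x <= 2.
# The unconstrained minimizer x = 3 is infeasible, so the constraint
# binds and the iterates converge to the boundary point x = 2.
x_star = projected_gradient(lambda x: 2.0 * (x - 3.0), x0=0.0, lo=0.0, hi=2.0)
```

The convergence theory discussed in the chapter addresses exactly the questions this sketch glosses over: how to choose the step size and at what rate such iterations approach the constrained minimizer.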

The book is filled with many useful examples and counterexamples that provide intuition to support the more formal theorems and proofs. There are also numerous exercises for the student. This textbook would be suitable for use in an advanced graduate course on the theory of optimization for students with sufficient background in analysis and linear algebra. The book will also be useful as a reference for researchers working in various areas of optimization.

Brian Borchers is a professor of mathematics at the New Mexico Institute of Mining and Technology. His research interests include optimization and applications of optimization in parameter estimation and inverse problems.