Optimal Control Theory with Applications in Economics

MIT Press
This is an economics book with a substantial amount of rigorous mathematics, slanted toward economic problems that can be modeled by differential equations. The core narrative occupies only about 150 of the book's 350 pages; another 100 pages develop the necessary mathematical background, and 100 more give detailed solutions to the exercises. The book is loaded with realistic examples, and the mathematical models behind them are explained carefully.

After developing the mathematical background, the book takes up optimization problems for processes governed by differential equations. Each process is modeled by a differential equation depending on non-constant parameters that the decision maker can control, and one seeks the controls that maximize an objective function. In general these equations cannot be solved explicitly, so the emphasis is on characterizing the optimal solutions. The main tools are the Pontryagin Maximum Principle and the dynamic programming ideas developed by Richard Bellman.
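In symbols, the generic problem just described can be sketched as follows (a standard finite-horizon formulation in common notation; the symbols are illustrative and not necessarily the book's):

```latex
\[
\max_{u(\cdot)}\; \int_0^T h(t, x(t), u(t))\,dt
\quad\text{s.t.}\quad
\dot{x}(t) = f(t, x(t), u(t)),\quad x(0) = x_0,\quad u(t) \in U.
\]
With Hamiltonian $H(t,x,u,\psi) = h(t,x,u) + \psi\, f(t,x,u)$, the
Pontryagin Maximum Principle gives necessary conditions: an optimal
control $u^*$ maximizes $H$ pointwise along the optimal trajectory,
\[
u^*(t) \in \arg\max_{u \in U} H\bigl(t, x^*(t), u, \psi(t)\bigr),
\qquad
\dot{\psi}(t) = -\frac{\partial H}{\partial x}\bigl(t, x^*(t), u^*(t), \psi(t)\bigr),
\qquad
\psi(T) = 0,
\]
the last being the transversality condition for a free endpoint.
```

Characterizing optimal solutions then means analyzing these conditions, since the state and adjoint equations rarely admit closed-form solutions.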

The next major topic is game theory, with an emphasis on Nash equilibrium, probabilistic problems, and differential games; this builds on the theory developed in the first part. The final topic, treated briefly, is mechanism design, which includes such things as the design of markets; it is a form of game theory in which one player gets to make the rules but still must influence the other players in order to reach his goals.

Allen Stenger is a math hobbyist and retired software developer. He is webmaster and newsletter editor for the MAA Southwestern Section and an editor of the Missouri Journal of Mathematical Sciences. His mathematical interests are number theory and classical analysis. He volunteers in his spare time at a math help site that fosters inquiry learning.

Author: Thomas A. Weber
Reviewed by: Allen Stenger
Date Received: Thursday, November 17, 2011
  • Foreword by A. V. Kryazhimskiy
  • Acknowledgments
  1. Introduction
    • 1.1 Outline
    • 1.2 Prerequisites
    • 1.3 A Brief History of Optimal Control
    • 1.4 Notes
  2. Ordinary Differential Equations
    • 2.1 Overview
    • 2.2 First-Order ODEs
    • 2.3 Higher-Order ODEs and Solution Techniques
    • 2.4 Notes
    • 2.5 Exercises
  3. Optimal Control Theory
    • 3.1 Overview
    • 3.2 Control Systems
    • 3.3 Optimal Control—A Motivating Example
    • 3.4 Finite-Horizon Optimal Control
    • 3.5 Infinite-Horizon Optimal Control
    • 3.6 Supplement 1: A Proof of the Pontryagin Maximum Principle
    • 3.7 Supplement 2: The Filippov Existence Theorem
    • 3.8 Notes
    • 3.9 Exercises
  4. Game Theory
    • 4.1 Overview
    • 4.2 Fundamental Concepts
    • 4.3 Differential Games
    • 4.4 Notes
    • 4.5 Exercises
  5. Mechanism Design
    • 5.1 Motivation
    • 5.2 A Model with Two Types
    • 5.3 The Screening Problem
    • 5.4 Nonlinear Pricing
    • 5.5 Notes
    • 5.6 Exercises
  • Appendix A: Mathematical Review
    • A.1 Algebra
    • A.2 Normed Vector Spaces
    • A.3 Analysis
    • A.4 Optimization
    • A.5 Notes
  • Appendix B: Solutions to Exercises
    • B.1 Numerical Methods
    • B.2 Ordinary Differential Equations
    • B.3 Optimal Control Theory
    • B.4 Game Theory
    • B.5 Mechanism Design
  • Appendix C: Intellectual Heritage
  • References
  • Index
Modify Date: Thursday, September 6, 2012