Mathematical Control Theory: An Introduction

Jerzy Zabczyk
Modern Birkhäuser Classics
Reviewed by Fernando Q. Gouvêa

This introduction to mathematical control theory was first published by Birkhäuser in 1992, then reprinted with corrections in 1995. It has now been reprinted in the Modern Birkhäuser Classics series. A Telegraphic Review appeared in the Monthly in 1993:

Relatively self-contained introduction to control theory: finite dimensional systems, nonlinear systems, optimal control (dynamic programming, maximal principle approaches), infinite dimensional systems. Pervasive issues include controllability, observability, stability, realization.

One of the quotes on the back cover comes from the inimitable Gian-Carlo Rota: "At last! We did need an introductory textbook on control which can be read, understood, and enjoyed by everyone." The review in the Bulletin of the AMS (31 (1994), 330–332), by Raymond Rishel, concluded thus:

To get so much material in such a short space, the pace of the presentation is brisk. However, the exposition is excellent, and the book is a joy to read. A novel one-semester course covering both linear and nonlinear systems could be given using Chapters 0, 1, and 2 of Part I and Chapters 1 and 2 of Part II. If time permitted, Chapter 3 of Part I and Chapter 3 of Part II on realizations of respectively linear and nonlinear systems could be added. The book is an excellent one for introducing a mathematician to control theory. The book presents a large amount of material very well, and its use is highly recommended.

This is a worthy reprint of a worthy book.


Table of Contents:

Preface
Introduction
Part I. Elements of classical control theory
  Controllability and observability
  Stability and stabilizability
  Realization theory
  Systems with constraints
Part II. Nonlinear control systems
  Controllability and observability of nonlinear systems
  Stability and stabilizability
  Realization theory
Part III. Optimal control
  Dynamic programming
  Dynamic programming for impulse control
  The maximum principle
  The existence of optimal strategies
Part IV. Infinite dimensional linear systems
  Linear control systems
  Controllability
  Stability and stabilizability
  Linear regulators in Hilbert spaces
Appendix
  Metric spaces
  Banach spaces
  Hilbert spaces
  Bochner's integral
  Spaces of continuous functions
  Spaces of measurable functions
References
Notations
Index