Imagine a time in the distant future when humanity has slid into an extended dark age. Knowledge of mathematics, physics, and all sciences has been lost. Wandering scholars search for shreds of documents from the distant past that might help them reconstruct this knowledge. Then, miraculously, one of them stumbles upon an intact copy of A Complete Guide to the Laws of the Universe. Will this book reveal all of the physical laws of the universe that have been lost? No. But it will provide scholars for centuries to come with intriguing and often frustrating hints of what was known at the start of the 21st century.
This book begins by assuming no more than a basic familiarity with arithmetic. Even the existence of irrational numbers is treated in some detail. Its goal is to bring the reader up to the cutting edge of research in mathematical physics, able to understand the complexities of the search for a grand unified theory that would reconcile general relativity and quantum mechanics. Eleven hundred pages isn't enough space in which to accomplish this. Calculus gets eighteen pages; complex analysis, twelve, though, to be fair, Riemann surfaces, conformal mappings, and the Riemann mapping theorem get a further sixteen. Despite the presence of footnoted exercises, this is not a textbook.
There are four distinct parts to this book. The first consists of metaphysical speculation on the nature of truth and the relationship of mathematics to reality. The second develops all of the mathematics needed to understand modern physics. The third brings the reader up to an explanation of general relativity and quantum field theory. The fourth and most interesting part of the book is an extended discussion of the state of current attempts to find a grand unified theory.
The metaphysical speculation is covered in mercifully brief chapters at the beginning and end of the book. It is the prerogative of a reviewer to focus on whatever nits he wishes to pick. The one that really bothered me is on the first page of the first chapter in which Penrose sets the stage by explaining how science arose from observations of the regularities of the physical universe. He uses as his illustrating example the connection between tides and phases of the moon. In fact, this connection was not made until Newton's Principia of 1687, and it was one of the most controversial parts of the Principia. As I. B. Cohen explains [1, pp. 238-242], tidal forces operate in three classes: diurnal (≈24 hours), semidiurnal (≈12 hours), and those with a period of half a month or more. In addition, the effects of these forces are shaped by the geophysical structure of each basin. Even Newton never succeeded in establishing the exact connection between the moon and tides. The problem was simply too complicated.
Penrose soon shifts to the solid ground of the development of mathematics and physics. The pace is breathless. In less than four hundred pages he has covered non-Euclidean geometries, single- and multi-variable as well as vector calculus, Fourier series, conformal mappings, the Cauchy-Riemann equations, Clifford, Grassmann, and Lie algebras together with representation theory, the classical groups, calculus on manifolds, Clifford bundles and projective spaces, the Axiom of Choice, orders of infinity, Turing machines, and Gödel's Incompleteness Theorem. Without pausing, he then begins using these tools to develop modern physics. It is all here, from Minkowskian geometry to quantum field theory.
You can't possibly learn these topics from this book, but Penrose does give very effective encapsulations of the key ideas. The book is particularly effective for someone who already knows something, or a great deal, about these subjects. It offers a glimpse into the mind of Roger Penrose: how does he think about this mathematics and this physics? What are the concepts and understandings that he draws on as he uses these tools to attack the problems of modern physics? Before teaching any of these subjects, it would be worth reading the relevant chapter and reflecting on what Penrose sees as important and where he thinks the emphasis should be placed.
The most interesting, in fact delightful, part of the book starts with chapter 28, page 735, when he begins his assessment of the state of modern physics. Everything else has been prologue to this moment. Penrose has never been shy of controversy, and he launches right in, beginning with his reservations about the inflationary model of the Big Bang. He explains the numerous ontological foundations that have been proposed for quantum field theory and lays out his dissatisfactions with each. He is no convert to supersymmetry and string theory. He describes why he is still unconvinced that this is where we will find answers. In hopes of encouraging more research in unfashionable directions, he goes to considerable lengths to explain the basic ideas behind Ashtekar and loop variables as well as twistor theory.
These last chapters, though swirled with controversy, are his strongest. He has thought long and hard about these issues, he understands their complexities, and he is incredibly effective at communicating their essence, though the layperson must be forewarned that he freely draws upon all of the mathematics and physics that he has developed in the preceding chapters. As he himself frequently admits in these pages, he may be completely wrong in his reservations and concerns. What he is demanding is that we keep an open mind and continually seek to broaden our perspectives. It is worth buying this book just for these chapters.
[1] I. Bernard Cohen, "A Guide to Newton's Principia," in The Principia: Mathematical Principles of Natural Philosophy: A New Translation, by Isaac Newton, translated by I. Bernard Cohen and Anne Whitman, University of California Press, 1999.
David M. Bressoud is DeWitt Wallace Professor of Mathematics at Macalester College in St. Paul, Minnesota.