This book is different.
Most of the time, when chapter one of a five-chapter book begins with a problem as simple as the “Four Numbers Game,” the experienced reader expects a pretty easy road ahead.
What, you ask, is the “Four Numbers Game”? Draw a square. (It helps if your square is fairly large.) Label each corner with a non-negative integer. Now you have a “start square.” Get a new square by joining the midpoints of its sides. Now, get a new numbered square by labeling each of these midpoints with the absolute difference between the values at the endpoints of its side.
Repeat the process on your new square, and continue until all your labels are zero. The “game” is to find a start square that takes a long time to reach the all-zero state. For example, if you choose initial labels (clockwise from the upper left) of 9, 7, 5 and 1, then after one iteration your labels will be 2, 2, 4 and 8, and after seven iterations all your labels will be zero.
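The rules are simple enough to try on a computer. Here is a minimal sketch of my own (the function name and the tuple representation are my choices, not the book's) that counts the steps until a start square reaches all zeros:

```python
def ducci_steps(square):
    """Count iterations of the Four Numbers Game until all labels are zero.

    Each step replaces (a, b, c, d) with the absolute differences of
    adjacent corners: (|a-b|, |b-c|, |c-d|, |d-a|).
    """
    a, b, c, d = square
    steps = 0
    while (a, b, c, d) != (0, 0, 0, 0):
        a, b, c, d = abs(a - b), abs(b - c), abs(c - d), abs(d - a)
        steps += 1
    return steps

print(ducci_steps((9, 7, 5, 1)))  # the review's example: prints 7
```

Running it on (9, 7, 5, 1) confirms the seven-step count above.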
To begin to analyze this problem, we note that each start square is equivalent to several others. First, we have the start squares that are equivalent by virtue of the symmetries of a square. Our square (9, 7, 5, 1), for example, is equivalent to (9, 1, 5, 7) and to (7, 5, 1, 9), among others. This allows us to put a start square into a standard form (a, b, c, d), where the entries satisfy a ≥ b ≥ c ≥ d, a ≥ c ≥ b ≥ d or a ≥ b ≥ d ≥ c. We explain this in terms of cosets of the dihedral group of the square as a subgroup of S4. The mathematics is becoming more sophisticated.
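One concrete way to pick a standard form is to take a fixed representative of each equivalence class, say the lexicographically largest of the eight dihedral images. This sketch is my own convention for illustration, not necessarily the book's:

```python
def canonical(square):
    """A representative of a start square's class under the 8 symmetries
    of the square (4 rotations, 4 reflections): the lexicographically
    largest equivalent labeling, read clockwise."""
    a, b, c, d = square
    rotations = [(a, b, c, d), (b, c, d, a), (c, d, a, b), (d, a, b, c)]
    reflections = [(d, c, b, a), (c, b, a, d), (b, a, d, c), (a, d, c, b)]
    return max(rotations + reflections)

print(canonical((7, 5, 1, 9)))  # prints (9, 7, 5, 1)
```

Since symmetric start squares play out the same game, counting steps for the representatives alone covers every start square.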
A few more generalizations and unexpected applications of new tools lead to the construction of arbitrarily long games, though a probability calculation shows that long games are unlikely. Only one game in ten thousand takes sixteen steps.
Finally, we rewrite the rules of the game as multiplication by a matrix, extend the legal labels from non-negative integers to real numbers, and use eigenvectors of that matrix to construct games of infinite length.
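The eigenvector construction can be checked numerically. In this sketch, which is my own illustration and may differ from the book's exact matrix setup, q is the real root of x³ = x² + x + 1 (about 1.8393), and the start square (1, q, q², q³) merely shrinks by a factor of q − 1 at every step, so it never reaches all zeros:

```python
import numpy as np

# Real root of x^3 = x^2 + x + 1, approximately 1.8393. This is the
# classic example of a start square whose game never terminates;
# the numeric check below is my illustration of the idea.
q = np.roots([1, -1, -1, -1]).real.max()

square = np.array([1.0, q, q**2, q**3])
for _ in range(40):
    nxt = np.abs(square - np.roll(square, -1))
    # each step just rescales the sorted labels by q - 1 < 1
    assert np.allclose(np.sort(nxt), (q - 1) * np.sort(square))
    square = nxt
print(square.min() > 0)  # prints True: never the all-zero square
```

With integer labels the game always ends; only after admitting real labels does a geometrically shrinking, never-zero game like this become possible.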
That was the easy chapter, and it was quite an uphill hike from the easy subtraction game we started with.
Chapter 2 begins with the Pythagorean theorem and some remarks about areas of triangles, takes us through the representation of integers as the sum of two squares, congruent numbers, elliptic curves, and what the consequences would be if the Birch and Swinnerton-Dyer conjecture were shown to be true.
Another chapter starts with Pick's theorem, the familiar theorem in elementary geometry that relates the area of a polygon with vertices on lattice points of the plane to the number of lattice points on its boundary and in its interior. From there we take an excursion through Farey sequences, lattice hypercubes, one of Minkowski's theorems, and an elegant object called an Ehrhart polynomial.
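Pick's relation A = I + B/2 − 1 is easy to see in action. This short sketch (again my own, not taken from the book) computes the area by the shoelace formula, counts boundary lattice points with a gcd per edge, and recovers the interior count from Pick's theorem:

```python
from math import gcd

def shoelace_area(poly):
    """Polygon area by the shoelace formula (vertices in order)."""
    return abs(sum(x1 * y2 - x2 * y1
                   for (x1, y1), (x2, y2) in zip(poly, poly[1:] + poly[:1]))) / 2

def boundary_points(poly):
    """Lattice points on the boundary: gcd(|dx|, |dy|) per edge."""
    return sum(gcd(abs(x2 - x1), abs(y2 - y1))
               for (x1, y1), (x2, y2) in zip(poly, poly[1:] + poly[:1]))

def interior_points(poly):
    """Interior lattice-point count, solved from Pick: A = I + B/2 - 1."""
    return int(shoelace_area(poly) - boundary_points(poly) / 2 + 1)

square = [(0, 0), (2, 0), (2, 2), (0, 2)]
print(shoelace_area(square), boundary_points(square), interior_points(square))
# prints 4.0 8 1: the 2-by-2 square has one interior lattice point, (1, 1)
```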
The two remaining chapters start with similarly elementary ideas, rational approximation and geometric dissection, and again take us on the fast track to modern research issues.
So, what is different about this book? It starts easy, and then it gets harder fast. In most books, this is a flaw, but our authors lead us up the steeper parts of the path honestly. They tell us what they are skipping and where to fill in the details, and they don't apologize when it gets hard.
I learned a great deal by reading this book, much more than I'd expected to learn when I agreed to review it. (Also, it took a lot longer than I'd expected: four months, when I'd expected more like six weeks.) The extra effort was well worth it. I learned the solution to Hilbert's third problem; I'd known the statement, but not the solution. I learned how the Birch and Swinnerton-Dyer conjecture would solve easy-to-state yet hard-to-solve problems. I learned a straightforward explanation of the Banach-Tarski paradox.
Overall, Roots to Research delivers what it promises. It takes elementary problems, easy to state and easy to begin to solve, and shows how their extensions lead to serious research problems in modern mathematics. Be warned, though: it starts easy, but it gets a lot harder. It's worth the effort.
Ed Sandifer (firstname.lastname@example.org) writes the column “How Euler Did It” for MAA Online. He is professor of mathematics at Western Connecticut State University and has run the Boston Marathon 35 times.