Devlin's Angle

November 2007

The Smallest Computer in the World

Long before there were iPod Nanos and laptops, in fact, long before Bill Gates was even born, a British mathematician proved that it was possible to build computing machines that could be programmed to carry out any calculation that could ever arise. The mathematician was Alan Turing, and the theoretical device he invented in the 1930s is nowadays called a Turing machine. (There is not just one Turing machine; rather, Turing machines form an entire class of hypothetical computing devices. The Wikipedia entry at http://en.wikipedia.org/wiki/Turing_machine provides a good introduction for anyone not familiar with the concept.) The second world war gave Turing an opportunity to put his theory into practice: he spent the war years at the British secret code-breaking facility at Bletchley Park, where he helped design the electromechanical machines used to break the German Enigma code, a major turning point in the war.
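
To make the idea concrete, here is a minimal sketch, in Python, of how a Turing machine can be simulated: a tape of symbols, a read/write head, and a finite table of rules. The machine shown is a simple illustrative example (not one from Turing's paper) that adds 1 to a binary number written on the tape.

    # A Turing machine is a finite rule table driving a read/write head
    # over an unbounded tape. This sketch simulates one; the rule table
    # below adds 1 to a binary number.

    def run(rules, tape, state="right", head=0, blank="_"):
        tape = dict(enumerate(tape))  # sparse tape: cell position -> symbol
        while state != "done":
            symbol = tape.get(head, blank)
            state, write, move = rules[(state, symbol)]
            tape[head] = write
            head += 1 if move == "R" else -1
        cells = range(min(tape), max(tape) + 1)
        return "".join(tape.get(i, blank) for i in cells).strip(blank)

    # Binary increment: scan right to the end of the number, then move
    # left turning trailing 1s into 0s until a 0 (or a blank) can absorb
    # the final carry.
    increment = {
        ("right", "0"): ("right", "0", "R"),
        ("right", "1"): ("right", "1", "R"),
        ("right", "_"): ("carry", "_", "L"),
        ("carry", "1"): ("carry", "0", "L"),
        ("carry", "0"): ("done",  "1", "R"),
        ("carry", "_"): ("done",  "1", "R"),
    }

    print(run(increment, "1011"))  # prints 1100, i.e. 11 + 1 = 12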

The machines built at Bletchley Park, like all real computers since, were much more complicated than Turing's theoretical device. In principle you could build a physical Turing machine, and in fact several people have done so, but such machines are not useful: programming one to carry out a useful calculation could take tens or hundreds of years, and for some problems it would take even longer to produce the answer. They may be simple, but they are not practical. Their importance is that they help us understand what computation is and how it works. (For instance, the security of your credit card number when you order something online depends crucially on mathematics involving Turing machines.)

The idea of building machines to do arithmetic (in particular) is almost as old as mathematics itself. The earliest aids, such as piles of pebbles or the abacus, did not really carry out the mathematical computation; they simply provided a convenient way to store numbers for the human operator who actually did the math. Blaise Pascal (1623-1662) was one of the first people to design and build a machine, called the Pascaline, that really did carry out calculations. Charles Babbage (1791-1871) is another famous name in the history of calculating machines. But all of the early attempts focused on designing machines to perform particular calculations or particular kinds of calculations. No one thought of building a machine that could carry out any computation.

Part of the reason is that, until the 1930s, no one had tried to specify precisely what is meant by "a computation." Turing invented his "Turing machine" concept in order to provide such a definition: a function from numbers to numbers (say) is declared to be "computable" if and only if it can be calculated by some Turing machine.

At first encounter, this definition seems excessively narrow. Turing machines are extremely simple computing devices. What about functions that can be computed, but only by a machine more complicated than a Turing machine? Well, there's a funny thing that happens at this point. Neither in Turing's time nor at any time since has anyone produced a single example of a function that is obviously (or provably) computable by some device or other, yet cannot be computed by a Turing machine. Moreover, around the same time Turing was doing his work, a number of other mathematicians (among them Kurt Gödel, Stephen Kleene, and Alonzo Church) formulated alternative formal definitions of "computable function," and all of them turned out to be completely equivalent to Turing's. Thus a consensus quickly formed in the mathematical community that Turing's definition of computable really does capture our more intuitive notion of what it means to be "computable." This consensus, which remains to this day, is called the Church-Turing Thesis. It replaces an intuitive notion with a precisely defined mathematical one.

One important result Turing proved about his new concept was that it is not necessary to design a separate Turing machine for each given computation. It is possible to build what he called a "universal Turing machine": one that interprets a "program" fed into it as data, and can thereby calculate any computable function it is presented with. Today we are of course familiar with the idea that computers are programmable, doing word processing one minute, mathematics the next, and stealing songs the minute after that. But back at the time of Turing's work this was a novel idea, albeit a purely theoretical one.
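
The simulator sketched earlier captures this idea in miniature: the rule table is just data, so handing the same run function a different table makes the same piece of code compute a different function. For example, this table (again a toy example, not a historical machine) flips every bit of its input:

    # Same simulator, different "program": the rule table is data,
    # which is the essence of a universal machine.
    complement = {
        ("right", "0"): ("right", "1", "R"),
        ("right", "1"): ("right", "0", "R"),
        ("right", "_"): ("done",  "_", "L"),
    }

    print(run(complement, "1011"))  # prints 0100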

One fascinating question about universal Turing machines is just how simple such a machine can be while still being capable of carrying out any possible computation. Some work was done on this in the 1950s and 60s, and it was shown that a Turing machine with just two internal states computing on just two symbols cannot be universal. (Real computers today work in binary, with just two symbols, but have a large number of possible internal states.) The computer science pioneer Marvin Minsky of MIT constructed a universal Turing machine with 7 states computing on 4 symbols.

Then, in the 1990s, the mathematician Stephen Wolfram, perhaps best known as the developer of the software program Mathematica, managed to construct a universal Turing machine with two states that computes on five symbols. He also proposed a candidate for something even simpler: a machine that, if universal, would be the simplest possible universal Turing machine. Wolfram's machine has just two internal states and carries out all its computations on just three symbols. It is known as the Wolfram 2,3 Turing machine.
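
Part of what makes the question so appealing is how little data it takes to specify such a machine: a 2-state, 3-symbol machine is completely described by six rules, one for each (state, symbol) pair. The table below shows the format only, with placeholder entries; Wolfram's actual 2,3 rule set is given on the prize website cited at the end of this column.

    # Any 2-state, 3-symbol Turing machine is pinned down by six rules.
    # These entries are placeholders illustrating the format, not
    # Wolfram's actual 2,3 rules (see the prize website for those).
    two_three = {
        ("A", "0"): ("B", "1", "R"),
        ("A", "1"): ("A", "2", "L"),
        ("A", "2"): ("B", "1", "L"),
        ("B", "0"): ("A", "2", "R"),
        ("B", "1"): ("B", "0", "R"),
        ("B", "2"): ("A", "0", "L"),
    }
    assert len(two_three) == 2 * 3  # six rules in all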

Wolfram had reason to believe that his machine was indeed able to perform any possible computation, but he was not able to prove it. He described the problem in his 2002 book A New Kind of Science, which led to various attempts to find a proof, but no one succeeded. Earlier this year, Wolfram offered a prize of $25,000 to the first person to settle the question.

Now, just a few months later, Alex Smith, a 20-year-old university student in Birmingham, England, has found a solution. His proof, which is available on the Web, occupies over 50 pages of mathematical reasoning.

Alex is the oldest of three children. Both his parents are lecturers at the University of Birmingham, where he is an undergraduate in Electronic, Electrical and Computer Engineering. He started using computers when he was six years old and knows around 20 programming languages.

You can find a PDF file of Smith's proof, plus a lot more detail on Wolfram's problem, on the Wolfram website at www.wolframscience.com/prizes/tm23/.


Devlin's Angle is updated at the beginning of each month.
Mathematician Keith Devlin (email: devlin@csli.stanford.edu) is the Executive Director of the Center for the Study of Language and Information at Stanford University and The Math Guy on NPR's Weekend Edition. Devlin's most recent book, Solving Crimes with Mathematics: THE NUMBERS BEHIND NUMB3RS, is the companion book to the hit television crime series NUMB3RS, and is co-written with Professor Gary Lorden of Caltech, the lead mathematics adviser on the series. It was published in September by Plume.