Alan Turing is arguably the most widely recognized name in computer science, long celebrated for his pioneering contributions to theoretical computer science and artificial intelligence. Now, with Jack Copeland's help, his stature in history is moving beyond the legendary to the mythic; he is becoming, in many ways, the William James of computing.
Copeland's latest edited volume, Alan Turing's Automatic Computing Engine: The Master Codebreaker's Struggle to Build the Modern Computer, following quickly on the heels of his The Essential Turing, offers a comprehensive history of the ACE computer, the National Physical Laboratory (NPL), and Turing's genius. Collecting personal essays from some of the ACE's contributors, scholarly historical syntheses, and technical reports (some of them previously unpublished works by Turing), Copeland has done a masterful job of editing pieces by 19 different authors into a logical structure that tells a cohesive story.
Organized into five sections, the book begins with four chapters on the origins of the ACE project and the Mathematics Division of the NPL. Exploring the political and technical landscape of post-war Britain, this first section introduces the people responsible for the development of the ACE (several of whom contributed chapters to this book), traces the timelines and paths of influence of the various early efforts, both English and American, to build computing machines, and lays out the challenges and obstacles that prevented the ACE from being the world's first stored-program electronic digital computer.
Part II, Turing and the History of Computing, makes up the next four chapters (chapter 8, Computer architecture and the ACE computers, by Robert Doran is oddly missing from the table of contents). It is here that we see most clearly why Alan Turing's name appears in the book's title. This is best illustrated, perhaps, by elaborating on my reference to William James above. In many circles James is regarded as the father of modern cognitive psychology, having lived and thought in the days before psychology was formalized academically. Nevertheless his writings, when viewed from a modern perspective, anticipate in both scope and depth much of what is currently understood about the mind; so much so that he is now one of the most widely quoted historical figures in support of current theories in cognitive science. In a similar manner, this volume credits Turing with prefiguring RISC architecture, microcode, virtual machines, artificial life, evolutionary algorithms, and more. It is also suggested that he, rather than John von Neumann, deserves the credit for the stored-program concept. While all of this is done with scholarly sincerity, and even some polite debate amongst the contributors, it nonetheless paints Turing as the giant upon whose shoulders all who followed have stood.
The next two sections are devoted to the more technical aspects of the ACE and its siblings, the Pilot Model ACE and the DEUCE. From instruction formats, to sample program source code, to punched card manipulation, to circuit diagrams, we are treated to a grand tour of the inner workings of the ACE. Perhaps the most fascinating and intricate discussions focus on the delay-line storage facility, in which data is represented as sonic pulses circulating through a tube of mercury. Programming the ACE thus involved tracking data through delay cycles (within cycles) and timing the sequence of instructions precisely to coincide with the availability of data.
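The timing discipline this imposed can be sketched with a toy model: words in a delay line circulate end-to-end and are readable only at the instant they emerge, so a program must count ticks until its operand arrives. This is purely illustrative; the word lengths, names, and methods below are invented and bear no relation to the ACE's actual instruction set.

```python
from collections import deque

class DelayLine:
    """Toy mercury delay line: words circulate end-to-end, and a word
    is accessible only at the moment it emerges (illustrative only)."""
    def __init__(self, words):
        self.line = deque(words)  # pulses travelling through the mercury

    def tick(self):
        """Advance one word time; the emerging word is recirculated."""
        word = self.line[0]
        self.line.rotate(-1)
        return word

    def ticks_until(self, target):
        """Count ticks until `target` emerges. A well-timed program
        issues its instruction at exactly this tick, the kind of
        scheduling the DEUCE community called 'optimum coding'."""
        for waited in range(len(self.line)):
            if self.tick() == target:
                return waited
        raise RuntimeError("word never emerged")

line = DelayLine(["w0", "w1", "w2", "w3"])
print(line.ticks_until("w2"))  # w2 emerges after 2 ticks -> prints 2
```

The point of the sketch is that access time depends on where a word happens to be in its circulation, which is why the book's discussion of laying out data and instructions to minimize waiting is so intricate.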
Part V consists of Turing's 1945 technical report proposing the ACE, the lecture series delivered by Turing and Wilkinson in 1946-7, and a 1947 report by Harry Huskey on the state of the art in computing. Each of these provides substantial historical insight into the earliest days of digital computing.
While this book is well-written and enjoyable to read, I don't expect that many of its users will read its 540 pages sequentially. In fact, with a price tag over $140 (and a foul-smelling ink), the biggest market for the book is certain to be university libraries. Programs that include a history of computing course, or that weave history into computer architecture, machine language, programming language, or digital electronics courses, will find this an invaluable resource. Historians will also find the comprehensive cross-references to digital archival materials at http://www.alanturing.net to be of enormous value.
David J. Stucki teaches computer science and mathematics at Otterbein College, in Westerville, Ohio. His most recent interests are in the history and philosophy of mathematics, computer science education, and algorithmic number theory, although he also maintains an interest in artificial intelligence, theory of programming languages, and foundations/theory of computation. He has participated in Otterbein's Mathematical Problem Solving seminar and has helped to coach the Otterbein teams participating in the annual ECC Undergraduate Mathematics Competition.