This is a survey of some of the major trends in computer science from roughly 1970 to 1990. As such it covers mostly computer language design, software design, computer architecture, and artificial intelligence. This is not a mathematical book, although there are some allusions to mathematical thinking, particularly in the discussions of program verification (proving programs correct). The “first age” of computer science, from 1819 to 1970, is covered in the author’s earlier book It Began with Babbage: The Genesis of Computer Science (Oxford, 2014).
The coverage is slanted heavily toward the academic, with little attention to the commercial growth of hardware and software during this period, although it does cover some of the more important commercial computers. The book is a sort of memoir of computer science that covers many of the interesting parts, often in considerable detail, but it is not a complete history. It is a very erudite book, but as I was reading I was often puzzled about who the audience would be.
There is a considerable philosophical flavor, particularly in the later chapters dealing with artificial intelligence and massive parallelism. At the same time, the book covers a number of esoteric subjects, such as horizontal microprogramming and systolic processors, that even most practitioners of that era would not have been familiar with.
The book is heavily footnoted and seems to be very accurate. The most glaring problem is the consistent misspelling of the names of two of the era’s pioneers: Heinz Rutishauser’s last name appears throughout as Rutihauser, and Niklaus Wirth’s first name throughout as Nicklaus.
Allen Stenger is a math hobbyist and retired software developer. He is an editor of the Missouri Journal of Mathematical Sciences. His personal web page is allenstenger.com. His mathematical interests are number theory and classical analysis.