 January 2004

# The mathematics of human thought

This year marks the 150th anniversary of the publication of the book that set the scene for the introduction of the computer a century later: George Boole's The Laws of Thought, first published in 1854. The dramatic breakthrough that the book represented is reflected today in our use of the terms "boolean logic" or "boolean algebra" to mean the combination of ideas using the operations AND, OR, and NOT, and our use of the term "boolean search" to mean a database or Web search involving combinations of key words using AND, OR, and NOT. (The fact that we generally do not capitalize "boolean" in those contexts indicates just how pervasive Boole's influence has been.)

Boole's book begins with these words:

> The design of the following treatise is to investigate the fundamental laws of those operations of the mind by which reasoning is performed; to give expression to them in the symbolic language of a Calculus, and upon this foundation to establish the science of Logic and construct its method.

By the phrase "the symbolic language of a Calculus" Boole meant algebra. Not just the use of algebraic symbols like x, y, z, p, q, r to denote unknown words, phrases, or propositions. That much had been done by the logicians of ancient Greece. What Boole was talking about was using the entire apparatus of the high school algebra class, with operations such as addition and multiplication and the employment of methods to solve equations. Boole's algebra required the formulation of a symbolic language of thought. Solving an equation in that language would not lead to a numerical answer; it would give the conclusion of a logical argument. His algebra was to be an algebra of thought.

Even today, in the twenty-first century, when we are familiar with computers -- the "thinking machines" that are direct descendants of Boole's logical algebra -- it seems an audacious idea to write down algebraic equations that describe the way we think. What led Boole to propose such a thing, and why did he think it might be successful?

George Boole was born in England in 1815. Though the world was to regard him as a mathematician -- indeed, as one of the most influential mathematicians of all time -- he shared his interests between mathematics and psychology. Were he alive today, he would undoubtedly refer to himself as a cognitive scientist, a term that was first used in the early 1950s. He was largely self-taught, and it may have been the absence of a teacher to lead him away from such a seemingly nonsensical idea that enabled him to seek to capture the patterns of thought by means of algebra. The mark of his genius is that he succeeded to such an extent.

Boole first published his algebra of thought in 1847 in a small pamphlet entitled The Mathematical Analysis of Logic. The simplest way to describe the contents of this pamphlet is to quote from the opening section.

> They who are acquainted with the present state of the theory of Symbolic Algebra, are aware that the validity of the processes of analysis does not depend upon the interpretation of the symbols which are employed, but solely upon the laws of their combination. Every system of interpretation which does not affect the truth of the relations supposed, is equally admissible, and it is thus that the same processes may, under one scheme of interpretation, represent the solution of a question on the properties of number, under another, that of a geometrical problem, and under a third, that of a problem of dynamics or optics. ... It is upon the foundation of this general principle, that I purpose to establish the Calculus of Logic ...

It is worth reading through the above passage a second time. Boole made every word count.

As a result of his new algebra of logic, in 1849 Boole was appointed to the chair of mathematics at the newly founded Queen's College, Cork. As soon as he had established residence in Ireland, he began work on a larger book about his new theory. He was particularly keen to ensure that his mathematics really did capture laws of mental activity, and to this end he spent a great deal of time reading psychological literature and familiarizing himself with what the philosophers had to say about mind and logic.

He used his own money and that of a friend to publish his second, more substantial book on his ideas in 1854. Its full title was An Investigation of the Laws of Thought on Which are Founded the Mathematical Theories of Logic and Probabilities, but it is generally referred to more simply as The Laws of Thought. By and large, the only substantial difference between the 1854 book and the earlier pamphlet of 1847 was the addition of his treatment of probability, using his new algebraic framework. The logic itself was largely unchanged.

Boole's idea was to try to reduce logical thought to the solution of equations -- a logical holy grail ever since the German mathematician Gottfried Leibniz had tried to do it in the 17th century. Leibniz attempted to develop an "algebra of concepts", in which algebraic symbols denoted concepts such as big, red, man, woman, unicorn, but he met with only limited success.

Boole wanted his algebra to encompass all of Aristotle's insights into human reasoning (the famous Greek "All men are mortal" syllogisms) as well as the Stoics' logic of propositions (what we now refer to as propositional calculus). He took his symbols x, y, z, etc. to denote arbitrary collections of objects. For example, the collection of all men, the collection of all mortals, the collection of all bankers, or the collection of all natural numbers. He then showed how to do algebra with symbols that denote collections -- to write down and solve equations -- in a way that corresponds to performing logical deductions.

In order to be able to write down and solve algebraic equations involving collections, Boole had to define what it meant to add and to multiply two collections. Since his algebra was intended to capture some of the patterns of logical thought, his definitions of addition and multiplication had to correspond to some basic thought processes. Moreover, it would be easier to do algebra if he could define addition and multiplication in such a way that they had many of the familiar properties of addition and multiplication of numbers, making his new algebra of thought similar to the algebra everyone was used to.

Here is what he did. Given collections x and y, Boole denoted the collection of objects common to both x and y by xy. For example, if x is the collection of all Germans and y is the collection of all sailors, then xy is the collection of all German sailors.

Boole's definition of addition was more complicated than it needed to be, so other mathematicians of the time modified it to the following simple idea: x + y is the collection of objects that are in either x or y or both. For example, if x is the collection of all red pens and y is the collection of all blue pens, then x + y is the collection of all pens that are either red or blue.
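These two operations correspond exactly to the intersection and union of modern set theory. Here is a minimal sketch in Python, whose built-in `set` type supports both; the example collections are invented for illustration:

```python
# Boole's multiplication xy: the objects common to both collections
# (set intersection). Boole's addition x + y, in its simplified form:
# the objects in either collection or both (set union).
# The example collections below are invented for illustration.

germans = {"Hans", "Greta", "Karl"}
sailors = {"Karl", "Jack", "Greta"}
german_sailors = germans & sailors    # xy: the German sailors
print(sorted(german_sailors))         # ['Greta', 'Karl']

red_pens = {"pen1", "pen2"}
blue_pens = {"pen2", "pen3"}
red_or_blue = red_pens | blue_pens    # x + y: pens that are red or blue
print(sorted(red_or_blue))            # ['pen1', 'pen2', 'pen3']
```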

With these definitions of multiplication and addition, Boole's system had the following properties:

x + y = y + x

xy = yx

x + (y + z) = (x + y) + z

x(yz) = (xy)z

x(y + z) = xy + xz

These equations should look familiar from ordinary arithmetic, where the letters denote numbers. They are the two commutative laws, the two associative laws, and the distributive law. Because of the similarities between Boole's algebra of collections and ordinary arithmetic, Boole was able to perform calculations in his system, i.e., algebraic manipulations such as solving equations. However, solving an equation in Boole's system corresponds not to arithmetic but to logical reasoning about ... well, about whatever the symbols are taken to mean -- men, women, unicorns, what to prepare for dinner, etc. True, solving Boolean equations is not necessarily the best way to make a human decision. But the point was that patterns of logical thought could be represented by means of algebra. How far that would get you in real life was a question for later generations to take up.
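The five laws can be spot-checked mechanically. A small Python sketch, using sets for Boole's collections (`|` for his addition, `&` for his multiplication) and a sampling scheme of my own devising, verifies them on randomly chosen collections:

```python
import random

# Spot-check Boole's five laws on random collections drawn from a
# small universe {0, ..., 7}. The sampling scheme is arbitrary.
random.seed(0)
universe = range(8)

def random_collection():
    return {e for e in universe if random.random() < 0.5}

for _ in range(1000):
    x, y, z = (random_collection() for _ in range(3))
    assert x | y == y | x                    # x + y = y + x
    assert x & y == y & x                    # xy = yx
    assert x | (y | z) == (x | y) | z        # x + (y + z) = (x + y) + z
    assert x & (y & z) == (x & y) & z        # x(yz) = (xy)z
    assert x & (y | z) == (x & y) | (x & z)  # x(y + z) = xy + xz
print("all five laws hold on 1000 random triples")
```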

There are further similarities between Boole's system and ordinary algebra. For instance, in ordinary arithmetic the number 0 is special: adding 0 to any number leaves the number unchanged. In order for his algebra to work, Boole also needed a zero. He obtained it by taking 0 to be the empty collection.

One advantage of having a 0 is that it provides a way to write down an algebraic equation saying that various things do not exist. For example, in Boole's algebra we can express the fact that unicorns do not exist by letting x be the collection of all unicorns and writing down the equation x = 0.

With 0 defined as the empty collection, the symbol 0 has the same special properties in Boole's algebra of collections as it does in ordinary algebra:

x + 0 = x

x0 = 0

for any collection x.
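In set terms, both identities follow immediately from taking 0 to be the empty set. A quick Python check, with an arbitrary example collection:

```python
zero = set()                       # Boole's 0: the empty collection
x = {"alpha", "beta", "gamma"}     # any collection will do

assert x | zero == x               # x + 0 = x
assert x & zero == zero            # x0 = 0
print("zero laws verified")
```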

Although Boole's algebra had many of the properties of ordinary algebra, it was not exactly the same. Boole really did have to work with a strange, new kind of algebra. For instance, in Boole's algebra, the following two equations are true:

x + x = x

xx = x

These equations are certainly not true for ordinary arithmetic.
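With the set interpretation the idempotent laws are easy to see: combining a collection with itself yields nothing new. A one-line check in Python, with an arbitrary example set:

```python
x = {1, 2, 3}        # any collection

# Idempotent laws -- true for collections, false for numbers:
assert x | x == x    # x + x = x
assert x & x == x    # xx = x
print("idempotent laws verified")
```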

Incidentally, the axiomatic system that today's mathematicians refer to as a "boolean algebra" is not due to Boole. Rather, it was developed by other mathematicians who built on Boole's original work.

By reducing reasoning to doing algebra, Boole opened up the possibility of building a reasoning machine. Even today, it is hard to imagine any kind of mechanical or (these days) electronic machine being able to reason the way humans do about, say, local politics. What can a machine possibly know about local government? On the other hand, even in Boole's day it seemed perfectly possible to construct a machine that could manipulate algebraic symbols according to some general rules.

Indeed, the rules Boole presented for manipulating algebraic expressions and for solving equations in his system were sufficiently mechanical that the English logician W. S. Jevons was able to use them to build a mechanical reasoning machine which he demonstrated to the Royal Society in 1870. Not surprisingly, given the prevailing technology at the time, Jevons' device looked for all the world like an old style mechanical cash register. But for all its antiquated appearance, as an implementation of logic it was a stunning early ancestor of the modern electronic computer.

Today's electronic computer is, at heart, just an implementation in silicon of Boole's algebra of thought, with streams of electrons performing Boole's algebraic operations. The OR gates and AND gates you can read about in books that describe how computers work correspond directly to Boole's algebraic operations of addition and multiplication. In last month's column I described how the mathematician John von Neumann played a key role in the design of one of the first electronic computers in the early 1950s. It was the theoretical work of George Boole a century earlier that prepared the foundations upon which von Neumann and his colleagues helped usher in today's computer era.
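The gate correspondence can be made concrete by restricting attention to the two values 0 and 1 -- exactly the bits a gate operates on. In this tiny Python sketch (the function names are mine, chosen to echo hardware terminology), an OR gate is Boole's addition and an AND gate his multiplication, with 1 + 1 = 1 in accordance with the idempotent law:

```python
# On single bits, an OR gate performs Boole's addition and an AND
# gate his multiplication (note 1 OR 1 = 1, the idempotent law).
def or_gate(a, b):
    return a | b

def and_gate(a, b):
    return a & b

# Print the full truth tables for both gates.
for a in (0, 1):
    for b in (0, 1):
        print(f"{a} OR {b} = {or_gate(a, b)}   {a} AND {b} = {and_gate(a, b)}")
```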

This month's column is abridged from my book Goodbye Descartes: The End of Logic and the Search for a New Cosmology of the Mind, published by John Wiley in 1997. For more on George Boole, and the development of logic and its role in the invention of the modern computer, consult that book.

For more in-depth coverage of the use of language in mathematics, but still at an elementary level, see my book Sets, Functions, and Logic, the Third (completely revised) Edition of which has just been published by Chapman and Hall.

Devlin's Angle is updated at the beginning of each month.
Mathematician Keith Devlin ( devlin@csli.stanford.edu) is the Executive Director of the Center for the Study of Language and Information at Stanford University and "The Math Guy" on NPR's Weekend Edition. His most recent book is The Millennium Problems: The Seven Greatest Unsolved Mathematical Puzzles of Our Time, published by Basic Books (hardback 2002, paperback 2003).