Ivars Peterson's MathTrek
February 23, 2004
All this activity adds up to ingenious strategies for collectively working out the shortest path to a food source, combining forces to move a large, unwieldy object, and performing other functions crucial to an ant colony's well-being.
In effect, astonishing feats of teamwork emerge from a large number of unsupervised individuals following a few simple rules. It's an example of self-organizing cooperative behavior, and it's found among ants, bees, and other social insects.
A similar type of teamwork appears to occur in plants. Plants may use a form of distributed computation to figure out how wide to open the pores, or stomata, in their leaves (see Computation's New Leaf).
The behavior of leaf pores resembles that of mathematical systems known as cellular automata. A cellular automaton consists of a collection of units called cells, each of which can be in one of several states. Over time, the cells change their states according to rules that depend on their current states and those of their neighbors.
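To make the definition concrete, here is a minimal sketch (mine, not from the article) of a one-dimensional automaton with two states, in which each cell's next state is looked up from its own state and those of its two immediate neighbors. Rule 30, one of Stephen Wolfram's elementary cellular automata, serves as the example rule:

```python
def eca_step(cells, rule=30):
    """One synchronous update of an elementary cellular automaton.
    Each cell's (left, self, right) states form a 3-bit index into the
    8-bit rule number; boundaries wrap around."""
    n = len(cells)
    return [(rule >> (cells[(i - 1) % n] * 4
                      + cells[i] * 2
                      + cells[(i + 1) % n])) & 1
            for i in range(n)]
```

Applying `eca_step` repeatedly to a row containing a single live cell produces rule 30's famously irregular triangular pattern of growth.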
One cellular automaton that vividly demonstrates how a set of simple rules can generate a rich assortment of complex, interesting behavior is the Game of Life. It was invented by British mathematician John Conway, now at Princeton University.
Conway's aim was to create a cellular pastime based on the simplest possible set of rules that would still allow the game to be unpredictable. Moreover, he wanted the rules to be complete enough so that once started, the game could play itself. Growth and change would occur in jumps, one step inexorably leading to the next.
The game that Conway and his students at Cambridge University came up with is played on an infinite grid of square cells. Each cell is surrounded by eight neighbors, four along its sides and four at its corners.
A cell is initially marked as either occupied (alive) or vacant, creating some sort of starting configuration. Changes occur in jumps, with each cell responding according to the game's rules.
Any cell having two occupied cells as neighbors stays in its original state: a cell that is alive stays alive, and one that is empty stays empty. Three living neighbors adjacent to an empty cell lead to tricellular mating. A birth takes place, filling the empty cell. In such a neighborhood, a cell already alive continues to live.
However, an occupied cell surrounded by four or more living cells is emptied. Unhappily, death also occurs if none or only one of an isolated living cell's neighbors is alive. These simple rules engender a surprisingly complex world that displays a wide assortment of interesting events and patterns. Many different forms evolve on the Life grid. Some of the arrangements vegetate in a single, contented state. Others pulsate, switching back and forth between one configuration and another.
One variant of the game is to play it on a finite grid, with boundaries wrapped around in a periodic fashion. The idea is to mark the cells randomly, then let the system evolve according to the rules until it comes to a stop. Live individuals are occasionally added to the grid as the game proceeds. Under these conditions, the Game of Life appears to operate as a grossly simplified but reasonable model for biological evolution.
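These rules, on a finite grid whose boundaries wrap around as in the variant just described, can be sketched in a few lines of Python (the function name and grid representation are my own, not from the article):

```python
def life_step(grid):
    """One generation of Conway's Game of Life on a grid whose
    boundaries wrap around periodically (a torus)."""
    rows, cols = len(grid), len(grid[0])
    new = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # Count the eight neighbors, wrapping around the edges.
            n = sum(grid[(r + dr) % rows][(c + dc) % cols]
                    for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                    if (dr, dc) != (0, 0))
            if grid[r][c] == 1:
                new[r][c] = 1 if n in (2, 3) else 0  # lives on with 2 or 3 neighbors
            else:
                new[r][c] = 1 if n == 3 else 0       # birth needs exactly 3 neighbors
    return new
```

A row of three live cells (the "blinker") pulsates under these rules, flipping between horizontal and vertical every generation, one of the simplest of the back-and-forth patterns mentioned above.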
In studying the behavior of leaf pores, physicist David Peak, biologist Keith A. Mott, and their coworkers at Utah State University in Logan found that stomata activity resembles that of cellular automata able to perform specific computational tasks. In effect, the local rules lead to a particular outcome of interest: the answer to a specific problem.
One example of such emergent, distributed computation is the density classification task. A cell can be in one of two states: 0 (black) or 1 (white). According to one possible definition of the task, the "correct answer" corresponds to the states of all N cells of the system becoming 0 whenever 0 is initially in the majority, or all 1 otherwise.
What to do is easy to determine if you can just look down on the grid and count how many cells there are in a particular state. If you can see only your neighbors, the task is trickier to accomplish.
However, there are rules governing interactions between neighbors that allow the correct answer to emerge. These local interactions quickly organize regions of mixed states into regular patches: regions of all black, all white, or black-white checkerboard patterns that grow and shrink and compete with each other for dominance.
The best sets of rules available for density classification lead quickly to the correct answer for most initial configurations. However, when the number of white cells is nearly the same as the number of black cells to start with, the system can take a long time to settle into a particular pattern. Indeed, very small changes in the initial state can produce very different types of behavior in the system.
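One well-known hand-designed local rule for the one-dimensional version of this task is the Gacs-Kurdyumov-Levin (GKL) rule. The sketch below is illustrative only; it is not the automaton Peak and Mott studied, and the function names are mine. Each cell polls itself and two cells on one side, with the side chosen by the cell's own current state:

```python
def gkl_step(cells):
    """One synchronous update of the Gacs-Kurdyumov-Levin rule on a
    ring of 0/1 cells.  A 0-cell copies the majority of itself and the
    cells 1 and 3 sites to its left; a 1-cell uses the sites to its right."""
    n = len(cells)
    new = []
    for i, s in enumerate(cells):
        if s == 0:
            votes = s + cells[(i - 1) % n] + cells[(i - 3) % n]
        else:
            votes = s + cells[(i + 1) % n] + cells[(i + 3) % n]
        new.append(1 if votes >= 2 else 0)
    return new

def classify_density(cells, steps=None):
    """Iterate the rule; on most initial conditions the ring settles to
    all 0s or all 1s, matching whichever state began in the majority."""
    for _ in range(steps if steps is not None else 2 * len(cells)):
        cells = gkl_step(cells)
    return cells
```

On most lopsided starting configurations the ring settles quickly into the majority state; when black and white start out nearly balanced, competing patches can persist for a long time, mirroring the slow-to-settle behavior described above.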
The same sort of behavior can occur among leaf stomata.
In bright light, a leaf's pores tend to open. This allows the plant to take in carbon dioxide for photosynthesis. At the same time, the plant also experiences increased water evaporation and runs the risk of dehydration. It has to balance carbon dioxide input against water loss by adjusting pore size.
A change in humidity prompts the plant to respond by readjusting its stomata. Experiments show that, under certain conditions, a small change in humidity can have a significant effect on how a leaf's stomata behave. In such cases, a leaf may take a long time to settle into a new aperture pattern, sometimes even oscillating between one pattern and another for a long time.
"If the analogy between leaf and classifier is apt," Peak and his colleagues note, "then the rare long-transient episodes of dynamic patchiness correspond to particularly hard problems that the leaves struggle to solve."
Copyright © 2004 by Ivars Peterson
Klarreich, E. 2004. Computation's new leaf. Science News 165(Feb. 21):123-124. Available at http://www.sciencenews.org/20040221/bob10.asp.
Peak, D., J.D. West, S.M. Messinger, and K.A. Mott. 2004. Evidence for complex, collective dynamics and emergent, distributed computation in plants. Proceedings of the National Academy of Sciences 101(Jan. 27):918-922. Abstract available at http://www.pnas.org/cgi/content/abstract/101/4/918.
Peterson, I. 2000. Calculating swarms. Science News 158(Nov. 11):314-316. Available at http://www.sciencenews.org/20001111/bob1.asp.
______. 1998. The Mathematical Tourist: New and Updated Snapshots of Modern Mathematics. New York: W.H. Freeman.
Wolfram, S. 2002. A New Kind of Science. Champaign, Ill.: Wolfram Media. Available at http://www.wolframscience.com/nksonline/.
You can find out more about complexity and stomatal behavior at http://bioweb.usu.edu/kmott/Complexity_Web_Page/complexity_homepage.htm.
Additional information about the Game of Life is available at http://mathworld.wolfram.com/Life.html. Information about cellular automata can be found at http://mathworld.wolfram.com/CellularAutomaton.html.
Comments are welcome. Please send messages to Ivars Peterson at firstname.lastname@example.org.
A collection of Ivars Peterson's early MathTrek articles, updated and illustrated, is now available as the Mathematical Association of America (MAA) book Mathematical Treks: From Surreal Numbers to Magic Circles. See http://www.maa.org/pubs/books/mtr.html.