Note: This special Millennium edition of Devlin's Angle is much longer than normal. You might want to print it out to read. That will also give you a version absolutely impervious to any Y2K bug.
The dawn of a new year provides us with a reminder that we live much of our lives by the clock and the calendar. The message is even stronger when the new year is the last of a millennium, carrying the number 2000. But what exactly is time? There are three answers: one in physics and philosophy (time as a physical phenomenon), another in psychology (our sense of passing time), the third in mathematics and engineering (the time that we measure and use to regulate our lives). Devlin's Angle will, of course, concentrate on the last of these three notions. How did we come to measure time in the first place? What exactly is it that our timepieces measure? (This is where mathematics comes in.) And what scientific principles do we use to construct ever more accurate clocks? (More mathematics here.)
The measurement of time began with the invention of sundials in ancient Egypt some time prior to 1500 B.C. However, the time the Egyptians measured was not the same as the time today's clocks measure. For the Egyptians, and indeed for a further three millennia, the basic unit of time was the period of daylight. The Egyptians broke the period from sunrise to sunset into twelve equal parts, giving us the forerunner of today's hours. As a result, the Egyptian hour was not a constant length of time, as is the case today; rather, as one-twelfth of the daylight period, it varied with length of the day, and hence with the seasons. It also varied from place to place on the surface of the Earth. And of course, time as a measurable concept effectively ceased during the hours of darkness.
The need for a way to measure time independently of the sun eventually gave rise to various devices, most notably sandglasses, waterclocks, and candles. The first two of these utilized the flow of some substance to measure time, the latter the steady fall in the height of the candle. All three provided a metaphor for time as something that flows continuously, and thus began to shape the way we think of time.
Though their accuracy was never great, these devices not only provided a way to measure time without the need for the sun to be visible in the sky, they also provided the basis for a concept of time that did not depend upon the length of the day. But it was to be many centuries before advantage was taken of that possibility. Instead, each of these time-measuring devices carried elaborate systems of markings designed to give the time based on the sundial. Fragments of one thirteenth-century waterclock found in France gave instructions on how to set the clock for every single day of the year! Because night and day are complementary -- the nights are long when the days are short -- the scale for the nighttime hours was simply the daytime scale for the day exactly half a year earlier. For example, the scale for the nighttime on July 1 was the daytime scale for January 1.
In addition to their lack of accuracy, sandglasses, waterclocks and candles were also limited in the total length of time they could measure before having to be reset. As a result, they were largely used for measuring the duration of some activity, such as a speech made by an orator, cooking time, or the length of a legal consultation.
For most of history, ordinary people did not have regular and easy access to any kind of time measuring device whatsoever, other than to glance at the sky on a sunny day and see where the sun was. For them, time as we understand it today did not really exist. The one group in medieval times whose day was ruled by time in a way not unlike people today were the Benedictine monks, with their ecclesiastically regulated prayer times, the eight Canonical Hours: lauds (just before daybreak), prime (just after daybreak), terce (third hour), sext (sixth hour), nones (ninth hour), vespers (eleventh hour), compline (after sunset), and matins (during the night). The signal that announced each canonical hour and regulated the monks' day was a ringing bell. This gives us our word "clock," which comes from the medieval Latin word for bell, clocca.
Regardless of whether they were regulated by a sundial, a waterclock, a candle, or the stars, the bells that were used to signal each new canonical hour were rung according to a schedule based ultimately on the period of sunlight at that location and at that time of year. Because they were not spaced equally apart, the canonical hours provided a concept of time that, in addition to changing throughout the year and from location to location, did not flow evenly as modern time does.
During the Middle Ages, the idea of a regulated time started to spread out from the monasteries along with the associated religious observances. At the end of the fourteenth century, the best-selling book in Europe was the Book of Hours, a collection of devotional readings that a well-to-do layperson could read or recite at home at the appropriate canonical hour.
Today, most people keep themselves and their families alive by selling their time -- explicitly in the case of workers, professionals, or consultants "paid by the hour," less explicitly but no less real for salaried employees. Moreover, present day economies are largely sustained by the lending and borrowing of money, for which the lender charges interest -- a charge for the time the borrower has use of the money. The situation was different in the Middle Ages. Though allowed by the Romans, usury -- the charging of interest on a loan -- was banned from early Christian society until well into the twelfth century, the argument being that time belongs to God and therefore cannot be bought or sold. (Officially, usury is still banned by Islam.)
The growing dependence on international trade from the thirteenth century onward required the support of a money market, and as a result usury gradually crept into the Christian societies of Europe. With the growing acceptance of time as a commodity that could be bought and sold, humanity started along the path of developing a sense of time as something separate from the familiar cycle of night and day and the changing of the seasons. As time became rationalized, it also grew more secular, part of the daily activities of commerce, industry, and daily life.
It was into a world of "natural time," based on the sun's march across the sky, and varying with the seasons, that the first mechanical timepieces -- time machines -- were introduced in thirteenth century Europe. At odds with the conception of time as something that flows, with the first clocks came the idea of measuring time by splitting it into equal, discrete chunks and counting those chunks.
Most of us think of the time produced by our clocks as time itself. Yet the only thing natural about clock time is that it was originally based on a complete revolution of the earth (or more precisely, the average of such revolutions). The division of that period into 24 equal hours (generally treated as two successive periods of 12 hours each, AM and PM), the division of each hour into 60 minutes, and the further division of each minute into 60 seconds are all conventions -- human inventions.
In fact, there's a fundamental circularity in the way we measure time. The time that is measured by a clock is itself produced by that clock. The clock's time is independent of the flow of the seasons or the cycle of day and night, and is independent of the clock's location on earth. Today we don't give this matter any thought -- time is what the clock tells us. But in the early days of clocks that was not the case. Indeed, so different was the time determined by the clock that the practice developed of indicating when a time given was produced by the clock by adding the phrase "of the clock" -- later abbreviated to the "o'clock" we use today.
With the invention of the clock, the basic unit of time ceased to be the day and was replaced by the hour. With clocks, people could correlate their activities to a far greater degree than ever before. And the ability to measure time in a mathematical fashion helped prepare the way for the scientific revolution that was to follow three hundred years later.
All clocks depend on the laws of physics, which provide potentially reliable timepieces in the form of oscillators. Any object that oscillates will have a preferred period of oscillation, and by finding a way to capitalize on that regular period, a reliable clock may be constructed.
Early oscillating mechanisms were called escapements. The first escapement, the "verge-and-foliot," comprised a freely swinging horizontal bar (the foliot) attached to a centrally located vertical shaft (the verge). The mechanism was driven by gravity. A heavy weight hung from a cord wrapped round a horizontal spindle. As the weight slowly descended, the cord turned the spindle. A toothed crown-wheel on the spindle made the escapement oscillate, the escapement regulated the rate at which the spindle turned, and the rotation of the spindle measured the passage of time by moving a hand around a marked clock face. (The rate of oscillation, and hence the speed of the clock, was adjusted by moving symmetrically-placed small weights along the foliot bar.)
Some time in the fifteenth century, clockmakers started to use tightly coiled blades of metal -- springs -- to power their timepieces, instead of gravity. Following Galileo's famous 1583 observation that the period of oscillation of a swinging pendulum seemed to depend only on the length of the pendulum, not on the size of the arc, the verge-and-foliot escapement was modified -- and improved -- so that the swing of a pendulum arm regulated the motion. The pendulum clock was itself improved when the verge-and-foliot mechanism for controlling the rate of rotation of the crown wheel was replaced by the anchor escapement, in which a caliper-like "anchor" performed the task previously carried out by the verge-and-foliot.
Despite the various improvements, most early clocks were highly unreliable. This was of little consequence, however, since they could be checked and adjusted regularly by reference to the sun. Thus, despite the technology and the mechanical nature of the time it produced, time was still ultimately dependent on the sun.
But by the middle of the seventeenth century, pendulum clocks with an anchor escapement were being manufactured that were accurate to within ten seconds per day. This was far more precise than reading the time from a sundial. Not only was a sundial hard to read accurately, the speed of the sun across the sky varied slightly from one day to the next. Indeed, with the availability of precise time machines, it became possible to measure that variation in the sun's speed. It was then, at the start of the scientific revolution, that people effectively started to live by mechanical time. Though, for the vast majority of the population, the sun would continue to provide the principal means of telling the (approximate) time, the definitive time was that provided by the clocks. From then on, clocks were used to set and calibrate sundials, rather than the other way round as had previously been the case.
The introduction of accurate clocks not only provided an accurate way to measure and tell the time, it also enabled sailors to make use of the variation of time with longitude to determine their position when at sea.
In the fifteenth century, when explorers such as Christopher Columbus and Amerigo Vespucci first started to sail into the great oceans, they faced a major hurdle: How could they keep track of their position? For earlier generations of sailors, such as the Mediterranean and Northern European traders, there was no such problem -- they always kept close to a shoreline. From early times, charts called portolans (harbor guides) were available to provide details required by the coastline-hugging sailor -- depth of the water, location of treacherous rocks, special landmarks, et cetera. But how do you keep track of your position when you have left the shoreline behind?
Part of the answer was provided by Greek geographers of the third century B.C., who used astronomical calculations to draw three reference lines on their world maps -- the three lines of latitude known nowadays as the equator and the tropics of Cancer and Capricorn. Eratosthenes subsequently added further east-west lines of latitude, positioned to run through familiar landmarks. A century later, Hipparchus made the system mathematically regular by making the lines equally spaced and truly parallel, not determined by the lay of the land or by places that people found important. He also added a system of north-south lines of longitude, running from pole to pole, and divided the degrees of both latitude and longitude into smaller segments, with each degree divided into 60 minutes and each minute into 60 seconds. (Both the 360 degrees of the circle and the 60-fold division of the degree and the minute come from the fourth century B.C. Babylonian sexagesimal system of counting, adopted because of the ease of subdividing the whole numbers 60 and 360.)
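Hipparchus's sexagesimal subdivision survives unchanged in the way we still write angles today. A minimal sketch of the degrees-minutes-seconds conversion (in Python, purely for illustration):

```python
# Convert decimal degrees into the sexagesimal degrees-minutes-seconds
# form that Hipparchus's scheme gives us: 60 minutes per degree,
# 60 seconds per minute.

def to_dms(degrees):
    """Return (degrees, minutes, seconds) for a decimal angle."""
    d = int(degrees)
    remainder_minutes = (degrees - d) * 60
    m = int(remainder_minutes)
    s = (remainder_minutes - m) * 60
    return d, m, s

# 23.5 degrees -- roughly the latitude of the tropic of Cancer -- is
# 23 degrees 30 minutes.
print(to_dms(23.5))  # (23, 30, 0.0)
```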
In the second century A.D., the Greek astronomer Claudius Ptolemy wrote eight books on geography, in which he described how to draw maps with lines of latitude and longitude for reference. Ptolemy's manuscript was accompanied by twenty-seven world maps, drawn according to his ideas. (It is not known if Ptolemy himself drew those maps.) Ptolemy made two major errors: his estimate for the circumference of the world was only three-quarters of the true figure, and he extended Asia and India much too far to the east. (It was the combination of these two errors that made Columbus -- who had a copy of one of Ptolemy's maps -- think he could sail west to the Indies, and thereby led to the discovery of America by the Europeans in the fifteenth century.)
To make use of the grid lines drawn on a map, a navigator had to have a way to determine the ship's latitude and longitude. Latitude was not much of a problem. All the sailor had to do was measure the altitude of the sun at noon. This varies with latitude (and with the time of the year), and a simple geometric calculation allows the latitude to be computed from the noon altitude on any given day of the year. Already in the Middle Ages, astronomical tables showed the altitude of the noon sun throughout the year at different latitudes, and a sighting instrument such as a quadrant could be used to measure the altitude. Even in those days, determination of latitude could be made to within half a degree.
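That "simple geometric calculation" can be spelled out: at local noon, for an observer north of the sun, the sun's altitude, the observer's latitude, and the sun's declination (in effect, the seasonal quantity the medieval tables supplied for each day of the year) satisfy altitude = 90 - latitude + declination. A minimal sketch, in Python for illustration:

```python
# Latitude from the noon sun: for an observer north of the subsolar
# point, altitude = 90 - latitude + declination, so latitude follows by
# rearrangement. Declination is the tabulated value for the date.

def latitude_from_noon_sun(altitude_deg, declination_deg):
    """Observer's latitude (degrees north) from the sun's noon altitude."""
    return 90.0 - altitude_deg + declination_deg

# At an equinox (declination 0), a noon altitude of 50 degrees puts the
# observer at 40 degrees north.
print(latitude_from_noon_sun(50.0, 0.0))   # 40.0
# At the summer solstice (declination 23.5), an overhead noon sun means
# the observer stands on the tropic of Cancer.
print(latitude_from_noon_sun(90.0, 23.5))  # 23.5
```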
But how do you determine longitude? The first practical answer was in terms of speed. If an explorer knew the speed he was traveling, he could compute the distance covered each day, and in that way keep track of his longitude. Columbus had no instrument to measure his speed, so he simply observed bubbles and debris floating past his ship and used those observations to make an estimate of the speed. A better solution was to use time. Even the Greeks had observed that longitude could be regarded as a function of time. Since the earth makes one complete revolution every twenty-four hours, in each single hour it rotates through fifteen degrees of longitude. This means that every degree of longitude corresponds to four minutes of time. If a navigator knew the time at his starting point, and also knew the local time, then by comparing the two times he could compute his current longitude relative to the initial longitude. By carrying a clock on board, all a sailor needed to do to determine his longitude was read the clock's time at (local) noon (i.e., when the sun was at its highest point) and convert the clock's discrepancy from noon to give the ship's longitude relative to the starting longitude. Every four minutes of difference would indicate 1 degree of longitude to the east or the west. To make this process work, of course, the sailor had to have a reliable clock; moreover, a clock that remained reliable when carried out to sea on a ship.
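The arithmetic of the shipboard-clock method is simple enough to write down. The sketch below (Python, illustrative only -- real navigation also had to correct for the varying speed of the sun mentioned earlier) converts the home-port clock reading at local noon into a longitude offset:

```python
# Longitude from a clock kept on home-port time: read the clock when the
# sun is at its highest point locally, and convert the discrepancy from
# noon at 4 minutes of time per degree of longitude.

def longitude_offset(clock_minutes_at_local_noon):
    """Degrees of longitude west (+) or east (-) of the home port.

    clock_minutes_at_local_noon: home-port clock reading, in minutes
    past midnight, taken at local noon.
    """
    noon = 12 * 60                      # 720 minutes past midnight
    delta = clock_minutes_at_local_noon - noon
    return delta / 4                    # 4 minutes of time per degree

# Local noon arrives when the home clock already reads 12:48: the sun
# peaked 48 minutes later than at home, so the ship is 12 degrees west.
print(longitude_offset(12 * 60 + 48))  # 12.0
# A reading of 11:40 means the sun peaked 20 minutes early: 5 degrees east.
print(longitude_offset(11 * 60 + 40))  # -5.0
```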
From the sixteenth century onwards, the need for an accurate clock to determine longitude became so important to growing world trade that a number of monetary rewards were offered for the first person to produce such a device. In 1714, England's Queen Anne offered £20,000 (several million pounds in today's currency) to the first person to find a way to determine longitude to within half a degree. Many attempts were made to solve the problem and win the various prizes. In 1759, a Yorkshireman called John Harrison tested a 5.2-inch-diameter clock on a trip from Britain to Jamaica and back. The clock lost only five seconds on the outward journey, corresponding to a longitude error of only one and a quarter nautical miles. Harrison won the prize, and the world finally had a way to determine longitude: by the accurate measurement of time.
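Harrison's figure can be checked directly: 4 minutes of time per degree of longitude means 4 seconds per arcminute, and at the equator one arcminute of longitude spans one nautical mile. In Python, for illustration:

```python
# Converting Harrison's five lost seconds into distance. One degree of
# longitude corresponds to 4 minutes (240 seconds) of time, so one
# arcminute corresponds to 4 seconds; at the equator one arcminute of
# longitude spans one nautical mile.
seconds_lost = 5
arcminutes_error = seconds_lost / 4      # 1.25 arcminutes of longitude
nautical_miles_error = arcminutes_error  # one nautical mile per arcminute
print(nautical_miles_error)  # 1.25
```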
In the case of ocean travel, the development of reliable timepieces brought time and space together, and enabled the traveler to make use of time in order to determine location. For land travel, however, the arrival of accurate clocks created a conflict between time and location.
The first inklings of the problem occurred in Europe in the eighteenth century, with the introduction of mail coach services. Designed to convey passengers as well as mail, the coaches kept to a strict schedule, and as with today's express delivery services, each company needed to maintain a good reputation for reliability and punctuality. The problem with keeping to a strict timetable was that the actual "time of day" varied from town to town. Even in a small country like England, towns to the west of London could be up to twenty minutes behind the capital. Much like today's jet-setting international business executives, coachmen were forever having to adjust their watches to give the correct local time.
The problem became much worse with the arrival of the railway network in the nineteenth century. The greater speeds, together with the need to change from one line to another -- possibly from one railway company to another -- in the course of a single journey made the plethora of different local times a confusing annoyance. In England, the railroads decided that they would run their operations according to London time, as determined by the Royal Observatory at Greenwich, and by 1848 practically all British railroad companies operated according to what would eventually become known as Greenwich Mean Time (GMT). For a while, many local towns continued to keep their own time, determined by local observations of the sun, but gradually the benefits of having a single time began to outweigh tradition and local pride. By 1855, almost all public clocks throughout Great Britain showed GMT.
The method used to synchronize all the clocks brought with it another acknowledgment that time could be a commodity to be sold. The Greenwich Observatory maintained an electrical Standard Clock that defined GMT. Each day, the Observatory took stellar readings to correct the Standard Clock. (Measuring the positions of certain stars at night was a much more accurate way to measure the earth's rotation than trying to identify the moment when the sun was at its midday highest point.) After the invention of the electrical telegraph in 1839, telegraph lines were laid alongside all the major railway tracks. In 1852, the Astronomer Royal, George Airy, instituted a system whereby time signals from the Standard Clock were transmitted along the telegraph lines to electrical clocks at railway stations, government offices, and post and telegraph offices throughout the country. For a fee, private subscribers could also be hooked up to receive the time signal. (Clockmakers and clock repairers were major customers of this service.) They were, quite literally, buying time.
In this way, the entire British Isles came to conform to a single system of time, determined by the stars, and distributed along telegraph wires.
The confusion caused by the differences in local times generated in Britain by the introduction of the railroad system was nothing compared to the United States, where, because of the much greater distances involved, the differences could be far greater than a few minutes. As the U.S. railway system grew between 1840 and 1850, most railroad companies operated according to the time of their home city. The result was that, at the height of the temporal confusion that developed, there were around eighty different railway timetables in use around the country.
To try to bring order to the chaos, regional time zones started to develop. For example, by the early 1850s, all New England railroads kept to the same time, determined by the Harvard College Observatory. Likewise, there were standardized time zones around New York, Philadelphia, and Chicago.
The next step toward uniformity occurred in 1869, when a far-sighted individual called Charles Dowd put forward a plan to divide the entire nation into four uniform time zones, each fifteen degrees of longitude wide, and hence each one an hour apart from its neighbor. The time zones were not designed to be adopted by local residents. Rather, they provided a systematic basis for coordinating railroad schedules, and Dowd published timetables that gave the conversions between each local time region and the zonal railroad time. Eventually, however, people started to suggest making railroad time the only time, with the entire nation having just the four zonal times.
The proposal to abolish all the city-based local times caused enormous controversy, with many civic leaders seeing it as a matter of local pride to maintain their own time system. It was not until 1883 that the majority of the country was prepared to make the move to adopt railroad time. The final step was orchestrated by a railwayman called William Allen, who lobbied long and hard for what he saw as an obviously advantageous move. At 12 noon on November 18, 1883, Allen saw his dream come to fruition. At that precise moment, the vast majority of the nation switched to the new time.
The switch was achieved by getting all the different observatories, which regulated the time in their regions, to agree to send the new time signal at precisely the same moment. By then, most towns and cities provided a standard time signal, which in turn was based on a time signal received from an observatory. (A common means of providing the local residents with a daily time signal was by means of a time-ball, a ball that slid down a vertical pole to provide a countdown to noon. Today, New York City uses such a device to provide a ritualistic time signal for the start of the new year at midnight on New Year's Eve.) By coordinating all the time signals, in one fell swoop the entire nation was switched from local time to one of the four zonal railroad times (apart from a small number of renegade regions that vainly held out for a year or so longer).
In 1918 the four-zone time system was legalized. After two thousand years, a completely abstract, man-made, uniform, mathematical notion of "time" was starting to work its way into -- and condition -- our view of the world. But there were still some further developments to take place. First, although, by the late nineteenth century, many countries had adopted uniform time systems, there was hardly any coordination between different nations. In particular, there was the fundamental issue of where to locate the base line for measuring longitude. Unlike latitude, where the earth's axis of rotation determines two poles and a corresponding equatorial base line, there is no preferred baseline for measuring longitude. England used the line of longitude through Greenwich -- where the Royal Observatory was located -- as the zero meridian, and by 1883, Sweden, the United States, and Canada had also adopted the Greenwich Meridian as the baseline.
With the growth of international commerce, discussions started to take place to try to establish a uniform worldwide system for measuring longitude. In 1884, an International Meridian Conference was held in Washington, D.C. to try to resolve the issue. On October 13, the twenty-five participating countries put the matter to the vote. Twenty-two of them voted for Greenwich (San Domingo voted against, France and Brazil abstained). Greenwich was chosen for two reasons. First, the meridian had to pass through a major observatory. Second, because so much international shipping at the time was British, Greenwich was already the most widely used meridian in sea transport, having been adopted by around two-thirds of the world's shipping companies.
The establishment of a worldwide system to measure longitude brought with it a notion of worldwide time. Since there are twenty-four hours in a day and 360 degrees in a circle, each fifteen degrees of longitude represents one hour. Thus, by wrapping a 360-degree longitudinal grid around the earth, humankind automatically divided the planet into twenty-four time zones, each one hour different from its two neighbors.
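The zone arithmetic is just division: 360 degrees of longitude spread over 24 hours gives 15 degrees per hour. A sketch in Python (illustrative only -- real zone boundaries follow political borders, not meridians):

```python
# Idealized time zones: the earth's 360 degrees divided by 24 hours
# gives 15-degree-wide zones, each one hour from its neighbors.
DEGREES_PER_HOUR = 360 / 24  # 15.0

def nominal_utc_offset(longitude_east_deg):
    """Whole-hour offset from Greenwich for a purely geometric zone."""
    return round(longitude_east_deg / DEGREES_PER_HOUR)

print(nominal_utc_offset(0))      # 0   (Greenwich)
print(nominal_utc_offset(-75))    # -5  (roughly New York's longitude)
print(nominal_utc_offset(139.7))  # 9   (roughly Tokyo's longitude)
```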
Just as the adoption of uniform time in Britain had been brought about by the development of coach travel and the railways, and the adoption of uniform time zones in the United States was a response to the growth of rail travel, so too the main impetus for a uniform worldwide system of measuring time was Marconi's invention of wireless telegraphy in 1899. With instantaneous communication between countries around the world, and between land and ships at sea, it became imperative to have a uniform system of world time.
Surprisingly, this momentous development in human civilization, a major step toward today's "global village," went almost unnoticed at the time. Admittedly, not all countries made the switch to using the Greenwich Meridian straight away. Many countries did not adopt Greenwich until well into the twentieth century, with the last one, Liberia, not making the change until 1972. National pride was the principal inhibiting factor. But ultimately, there was nothing that could stand in the way of what was, quite literally, the march of time.
Today, we live much of our lives "by the clock." We are awakened by an alarm clock, we listen to the radio at a particular time, we travel to and from work at a certain time of day, we attend meetings that start and finish at predetermined times, we eat our meals according to the clock, not simply when we feel hungry, and the clock tells us when to go to a movie, to a concert, to the theater, or to watch our favorite television program. Indeed, not only are most of our daily activities regulated by the clock, they are often ruled down to the precise minute. This way of living is very recent. Not only does it depend on the uniform system of worldwide time measurement, it also requires that each one of us carries on our person a reliable means to keep track of time. The development of first the pocket watch and then the wrist watch also changed the way we view, and live, our lives. Completion of the revolution in human life brought about by the evolution of our concept of time was as much a technological step as an intellectual one. To live according to the regular beat of man-made time, we have to carry time around with us. More accurately, since our present-day watches do not (yet) communicate with each other or with any centralized "time station," we each carry around a device that manufactures a personal time, built to stay in synchronization with official time to within a few seconds.
The accuracy (and cheapness) of today's watches and clocks comes from an observation made by the Frenchman Pierre Curie in 1880. Curie noticed that when pressure is applied to certain crystals -- quartz crystals, for example -- they vibrate at a certain, highly constant frequency. Subsequent investigations showed that subjecting crystals to an alternating electric current also caused them to vibrate. The first use of this phenomenon was in the design of radios, to provide a broadcast wave of constant frequency. Then, in 1928, W. A. Marrison of Bell Laboratories built the first quartz-crystal clock, replacing the pendulum and the various other mechanical oscillating devices of previous timepieces by the constant vibrations of the quartz crystal. The quartz clock was so accurate and reliable that, by 1939, it had replaced the mechanically regulated clocks at the Observatory in Greenwich.
Though the resulting accuracy was not discernible to human consciousness, the arrival of the quartz clock changed the nature of time yet again. Since quartz crystals can vibrate millions of times a second, the basic unit of time provided by our timepieces shrank from the second -- the unit provided by mechanical oscillating devices -- to units up to a million times smaller. This meant that our timepieces had developed to the point where time finally broke free of the natural phenomenon with which our very notion of time had originated: the earth's daily rotation. With devices capable of resolving a millionth of a second, it became possible to measure the small discrepancy in the earth's rotation from day to day. It no longer made sense to define the second as 1/86,400th of a mean solar day (86,400 = 24 x 60 x 60). Instead, we now calibrate our time against the apparent daily movement of distant quasars, the most stable available measure of the earth's rotation. Administered from Paris, this international, astronomical time is called Coordinated Universal Time (UTC).
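The old solar definition of the second, and why microsecond resolution breaks it, amounts to two lines of arithmetic (Python, for illustration):

```python
# The solar second: 24 hours x 60 minutes x 60 seconds per mean solar day.
seconds_per_day = 24 * 60 * 60
print(seconds_per_day)  # 86400

# A day that runs, say, 2 milliseconds long changes its length by only
# about 2 parts in 100 million -- far below anything a pendulum can
# detect, but comfortably within reach of a clock resolving millionths
# of a second.
fractional_change = 0.002 / seconds_per_day
print(fractional_change)  # about 2.3e-08
```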
Astronomical observations provide a stable basis for our modern time. But even quartz-crystal clocks are not sufficiently accurate to provide the precision of measurement required for many present-day technologies. For one thing, no two crystals are exactly alike, and differences in size and shape affect the frequencies at which the crystals oscillate. Also, over time, the oscillating frequency of a given crystal tends to drift, as its internal structure changes slightly. Far greater accuracy is provided by the atomic clock, the first of which was constructed by the English physicists L. Essen and J. Parry in 1955. It makes use of the fact that, when suitably energized, the outer electron of a caesium atom flips its magnetic direction relative to the nucleus, in the process emitting or absorbing a quantum of energy in the form of radiation with a constant frequency of 9,192,631,770 cycles per second. The idea behind the atomic clock (or the caesium clock) is to bombard caesium with microwaves of close to 9,192,631,770 cycles per second. The microwaves cause an energy oscillation of exactly 9,192,631,770 cycles per second in the caesium atoms, and that in turn regulates the microwaves, holding them to exactly that frequency -- a simple feedback loop that provides the ultimate, perfect timekeeper. By using the very basis of matter, we can define the second to be 9,192,631,770 ticks of the caesium clock. The official definition, adopted in 1967, is that the second is "9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium-133 atom."
Although units of time less than about a tenth of a second are not discernible to human consciousness, present-day life depends heavily on the extremely accurate measurement of time provided by quartz clocks and atomic clocks. For example, consider how dependent we are today on broadcast electromagnetic waves for various kinds of communication. Suppose an FM radio station is assigned the broadcast frequency 100 MHz. (That's 100 million cycles per second.) If the station's second is just 1/1,000th different from the true second, its broadcast signal will be off by 100,000 Hz (i.e., 100,000 cycles per second). Without highly accurate timekeeping, our communications network would be chaos. Another example is provided by our current desktop computers, which derive their speed from a highly accurate internal clock capable of measuring (or, if you prefer, creating) extremely short periods of time, currently approaching the 500 MHz range.
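The FM example is a one-line calculation: a transmitter derives its carrier from its own timebase, so a fractional timing error becomes the same fractional frequency error. In Python, for illustration:

```python
# A station whose "second" is wrong by one part in a thousand broadcasts
# a carrier that is off by the same fraction.
carrier_hz = 100_000_000           # the assigned 100 MHz carrier
clock_error_fraction = 1 / 1000    # the station's second off by 1/1,000th
frequency_error_hz = carrier_hz * clock_error_fraction
print(frequency_error_hz)  # 100000.0 -- i.e., 100,000 Hz off frequency
```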
A third example of the use of the high accuracy of atomic clocks is provided by the ground-based LORAN-C navigation system and the satellite-based Global Positioning System (GPS), both modern-day versions of the use of time to determine position. GPS, for instance, depends on a network of twenty-four satellites that orbit the earth at an altitude of 11,000 miles. Each satellite continually beams down a signal giving its position and the mean time determined by the four atomic clocks it carries on board. By picking up and comparing the time signals from four satellites, a ground receiver -- which may be small enough to be handheld -- can compute its latitude and longitude to within about 60 feet, and its altitude to within about 100 feet. The clocks on the satellites have to be extremely accurate for two reasons. First, the satellite uses its clocks to determine its own position at any instant. Second, the determination of the position of the ground receiver depends on the tiny intervals of time it takes an electromagnetic signal to travel from each of the satellites to the receiver. Since the signal travels at 186,000 miles per second, a timing error of one-billionth of a second will produce a position error of about one foot. The on-board clocks are accurate to one second in 30,000 years. (Ground-based atomic clocks can be accurate to one second in 1,400,000 years.)
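The one-nanosecond-equals-one-foot rule of thumb follows directly from multiplying the signal speed by the timing error. A small sketch of that arithmetic (again, the function name is ours, not part of any GPS specification):

```python
SPEED_OF_LIGHT_MI_PER_S = 186_000  # signal speed, miles per second
FEET_PER_MILE = 5_280

def position_error_feet(timing_error_s):
    """Ranging error, in feet, caused by a given timing error in seconds."""
    return SPEED_OF_LIGHT_MI_PER_S * timing_error_s * FEET_PER_MILE

# A one-billionth-of-a-second timing error:
print(position_error_feet(1e-9))  # → roughly 0.98 feet, i.e. about a foot
```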
In the United States, the U.S. Naval Observatory (USNO) in Washington, D.C. is charged with the responsibility for measuring and disseminating time. American time is determined by the USNO Master Clock, which is based on a system of many independently operating caesium atomic clocks and a dozen hydrogen maser clocks. Their web site at http://tycho.usno.navy.mil/ provides a rich source of information on modern timekeeping.
Though largely hidden from our view, the fine-grained notion of time in use today, based on the movement of massive objects far away in the universe and measured by the tiny quantum energy states of the atom, is quite literally the time of our lives. It affects the very fabric of our daily lives and the way we view ourselves and the world we live in. We live by the clock, and in many ways we are slaves to the clock. Yet in terms of utility, time is something that only exists because we have the means -- both the conceptual framework and the associated technology -- to measure it with reasonable accuracy.
Anyone who has visited Stonehenge -- which among possible other purposes was undoubtedly a timekeeping device -- will have felt the sense of awe at the technological skills of our ancestors many thousands of years ago. With many of today's timepieces barely lasting from one Christmas to the next, will the present generation leave a similar legacy to the future?
Several years ago, the computer scientist Danny Hillis (the designer of the massively parallel computer called the Connection Machine) asked whether modern technology would allow us to build a mechanical clock that would keep running and yield accurate time for at least 10,000 years. Such a device would be a twentieth-century legacy to the future, a present-day analogue of Stonehenge.
Hillis first wrote about his idea in 1993: "When I was a child, people used to talk about what would happen by the year 2000. Now, thirty years later, they still talk about what will happen by the year 2000. The future has been shrinking by one year per year for my entire life. I think it is time for us to start a long-term project that gets people thinking past the mental barrier of the Millennium. I would like to propose a large (think Stonehenge) mechanical clock, powered by seasonal temperature changes. It ticks once a year, bongs once a century, and the cuckoo comes out every millennium."
Such a device could do for time what photographs of Earth from space have done for thinking about the environment, Hillis suggests. It could become an icon that reframes the way people think.
With assistance from Whole Earth founder Stewart Brand, the musician and technophile Brian Eno, and others, Hillis established a foundation to support the design, construction, and maintenance of his clock. Eno called it "the clock of the long now," which Brand took as the title for a book on the project, published last year. The Long Now Foundation was officially established in 1996 -- or rather 01996 in Long Now Time, since a 10,000-year clock will have to count years with five digits to avoid a future Y10K problem.
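The five-digit convention is just zero-padding: write every year with a leading zero until the year 10000 arrives, so that dates continue to sort and line up correctly. A one-line illustration in Python (our sketch of the convention, not the Foundation's software):

```python
# Long Now convention: pad years to five digits so dates remain
# well-formed past the year 9999 -- no "Y10K" problem.
def long_now_year(year):
    """Format a year in five-digit Long Now style, e.g. 1996 -> '01996'."""
    return f"{year:05d}"

print(long_now_year(1996))   # → "01996"
print(long_now_year(10000))  # → "10000"
```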
The prototype of the 10,000-year clock Hillis is working on stands eight feet tall, and is constructed of Monel alloy, Invar alloy, tungsten carbide, metallic glass, and synthetic sapphire. (The eventual one may be larger.) The prototype is due to debut on January 1, 2000. It measures time by what Hillis calls a serial-bit adder, a highly accurate binary digital-mechanical system he invented. Its 32 bits of accuracy give it a precision equal to one day in 20,000 years. It self-corrects by intermittently locking on to the sun. The mechanical power to "wind it up" could be provided by the alternating daytime heating and nighttime cooling of the daily solar cycle, or by the annual warming and cooling of the seasons. An intriguing supplementary -- or even alternative -- source of power Hillis has suggested is to establish an annual "winding of the clock" as a worldwide cultural event.
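The underlying principle of a binary digital-mechanical counter can be sketched in software. The following is our own illustrative analogue of a 32-bit binary ripple counter, not a description of Hillis's actual mechanism: each tick adds one, with the carry propagating from the lowest bit upward, just as mechanical elements would flip in turn.

```python
# A software analogue (our illustration only) of a binary ripple counter:
# each tick adds 1, rippling the carry from the low bit upward.
def tick(bits):
    """Increment a little-endian list of bits in place, propagating carries."""
    for i in range(len(bits)):
        if bits[i] == 0:
            bits[i] = 1      # absorb the carry here; done
            return bits
        bits[i] = 0          # this bit rolls over; carry to the next
    return bits              # all bits rolled over: counter wrapped

counter = [0] * 32           # a 32-bit counter, as in the prototype
for _ in range(5):
    tick(counter)
print(counter[:4])           # low four bits after 5 ticks → [1, 0, 1, 0]
```

Because each bit interacts only with its neighbor, the scheme tolerates imprecision in any single component, which is one reason a digital-mechanical design can hold its accuracy over very long periods.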
For a society that lives its life by the clock and by technology, Hillis's Clock of the Long Now would surely be a fitting memorial to mark the close of the twentieth century. Without doubt, it's about time.
For further information about Hillis's clock, check out the web site of the Long Now Foundation at http://www.longnow.org/.
Devlin's Angle is updated at the beginning of each month.