Why We Need Calendars and Clocks

All life forms have some innate method for keeping track of time, but humans keep time with greater precision and in more diverse ways than any other species.

Why Bother to Keep Time?

Why do we need clocks and calendars? Looking at our lives today, some of the answers may seem obvious. To survive in this complex society, you need to track what others are doing and when they’re doing it. You also need to know what’s happening in the natural world (what season it is, for example). If you didn’t know the time or date you’d be seriously out of sync with your world. You’d miss a train or walk in late to your big history class.
But it’s not just modern humans who need to keep track of time. All living things have ways of tracking time so they can keep adjusting to their environment as it changes. Bears know when to hibernate, and when to wake up. Plants know when to blossom and fruit, making seeds for the next generation. Many birds know when it’s time to head south for the winter!
In fact, keeping track of time is so important that evolution has built body clocks into all living organisms. The best known are “circadian rhythms,” which follow a roughly 24-hour cycle of light and dark; other internal clocks track the longer seasonal changes in daylight hours. Though not perfectly aligned with our man-made clocks and calendars, these rhythms work well in nature. Your body clock will tell you that it’s not a good idea to get up at 2 a.m., when it’s pitch dark, unless you have to!

What’s Different About Human Time?

As with many other things, we humans track time differently than other creatures. We’ve developed many intricate ways to measure time, often with incredible precision. And as human societies have become larger and more complex, we have gotten better and more precise about marking the time at varying scales, from the stopwatch precision of the Olympic games to our daily schedules of work or travel to the dates of historical events and even those of geological events that may have happened millions or billions of years ago. To do this, modern humans have had to devise increasingly sophisticated clocks, calendars, and timetables. It wasn’t always this way.

Keeping Time in the Paleolithic Era

If you were a Paleolithic forager living 100,000 years ago, how would you have kept track of time? We have little direct evidence about Paleolithic time-tracking, but we do have some indirect evidence based mainly on studies of modern foraging societies.
In a foraging society, the rhythms of the natural world are critical. You need a pretty good sense of the changing seasons and of the schedules that other species keep so that you can decide when to move to a new campground, what plants to collect, and what animals to hunt. Modern foragers sense such changes with a precision and subtlety no contemporary urban dweller can match.
Keeping track of the time of day and the time of year was not difficult in societies whose members spent most of their time outdoors, as the positions of the Sun and the stars told you all you needed to know. And aligning your activities with those of your family and friends was much less complicated than it is today because people lived in small groups and met face to face.
Meetings with other communities were often seasonal and didn’t require great scheduling precision. If a group normally met with a neighboring tribe “when the reindeer returned” or “after the burdock went to seed,” it didn’t really matter if their schedules were a few days off. Foraging societies were much more forgiving about appointments than most modern city dwellers.
So no special instruments were required for timekeeping. But there are clues that even Paleolithic foragers didn’t rely entirely on their memories and their senses to keep track of time. In South Africa’s Blombos Cave, which was occupied perhaps as early as 100,000 years ago, archaeologists have found chunks of ochre with strange marks on them dating to about 70,000 years ago. These are the oldest known “artworks,” and, though most archaeologists are cautious about interpreting them, it’s tempting to think that the engravings were used to mark the passing of time. Perhaps they were lists of lunar cycles or dates of important rituals. More serious — if not universally accepted — evidence of calendars of some kind comes from about 40,000 years later. The American archaeologist Alexander Marshack (1918–2004) became fascinated by marks on Paleolithic objects, and argued that some of them should be regarded as calendars because they seem to have been tracking the movements of the Moon. In a 1984 lecture at New York’s American Museum of Natural History, Marshack talked of his first visit (c. July 1964) to Les Eyzies, a prehistoric site in southwest France:
Professor Movius and I stood on the shelf looking across the valley as the sun went slowly down behind the hills far to the right, sinking as a great red disc. As it was going down, the first crescent of the new moon appeared in the sky as a thin silver arc, facing the sinking sun. It was instantly apparent that the Les Eyzies horizon formed a perfect natural “calendar” and that the first crescent would appear over those hills at sunset every 29 or 30 days...that the sun was sinking at its farthest point north on that horizon, its position at summer solstice, and that it would now begin to move south.... The visual effect of the silver first crescent, aiming its arc at the setting sun and following the summer sun down, was stark and dramatic. There was no way that generations of hunters living on that shelf over a period of 18,000 years or more could fail to notice these periodic changes and movements of the sun and moon.... It took the next 18 years, however, before I could properly put together the seasonal and ecological dynamics of that valley and work out its relations to the art, images, and paintings in the caves.
(“Hierarchical Evolution of the Human Capacity,” pp. 14-16)

Keeping Time in Agrarian Societies

Agricultural societies began to appear from about 11,000 years ago. As they expanded and linked up with their neighbors they needed new and more reliable methods of keeping time. If you wanted to sell some produce in a nearby town or worship at a nearby temple you had to know exactly when the markets and religious rituals were held — and you needed to know in advance. Drifting in a week or two later no longer cut it, so you needed calendars that everyone agreed on and shared. If your village depended on irrigation, everyone needed to know exactly when the irrigation gates would be opened. Similarly, seeds were sown at particular times, and the harvest collected according to seasonal calendars based on Earth’s orbit around the Sun and associated climate patterns. And if you were sowing or harvesting alongside your neighbors, you all needed to agree exactly when to start.
This is why new devices began to appear that could track time more precisely. One method of timekeeping was to watch the Sun’s shadow using sundials. A stick in the ground would often do the job (as long as the Sun was shining!), but some sundials were extremely precise. Time was also measured by how long it took sand to move through a narrow hole in a glass container or by the rate at which water dripped from an urn.
More elaborate instruments were used to track the movements of the stars and planets. It is possible that Stonehenge in England, which was constructed between 4,000 and 5,000 years ago, was designed partly to determine the exact dates of the summer and winter solstices (the days when the Sun reached its highest and lowest points in the sky). The most elaborate and precise of all agrarian-era calendars were probably those of Mesoamerica, which appeared in the first millennium BCE. The Maya calendars, for example, included a 260-day ritual cycle built from interlocking 13-day and 20-day counts, and a 365-day cycle organized around the agricultural and solar year. They also had a “long-count” calendar measuring the days elapsed since a traditional creation date. Meanwhile, the Romans developed a calendar with 10 months, and the names they used are mostly familiar (for example, Martius is our March). Eventually, they refined their calendar, adding two more months and, with Julius Caesar’s reforms, even the concept of a leap day.
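A small aside on the two Maya cycles just mentioned: because 260 and 365 share only a factor of 5, a given combination of ritual and solar dates recurs only after their least common multiple of days, a period of about 52 years known as the Calendar Round. A quick check of the arithmetic:

```python
import math

# The 260-day ritual cycle and the 365-day solar cycle run side by side.
# A given pair of dates from the two cycles repeats only when both have
# completed a whole number of turns, i.e. after lcm(260, 365) days.
days_until_realign = math.lcm(260, 365)

print(days_until_realign)               # → 18980
print(round(days_until_realign / 365))  # → 52 (years in a Calendar Round)
```

This is why the 52-year period mattered so much in Mesoamerican ritual life: it is the shortest span after which both calendars read the same again.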

Toward the Modern Era

In his book Time: An Essay, the German scholar Norbert Elias argued that, as societies became larger and more complex, people began to require more and more precise clocks and better and more accurate records. This was because more and more individual schedules were getting linked together in networks of increasing complexity. As schedules began to interlace, people had to start thinking about time more precisely and more carefully:
Just as the chains of interdependency in the case of pre-state societies are comparatively short, so their members’ experience of past and future as distinct from the present is less developed. In people’s experience, the immediate present — that which is here and now — stands out more sharply than either past or future. Human actions, too, tend to be more highly centred on present needs and impulses. In later societies, on the other hand, past, present and future are more sharply distinguished. The need and the capacity to foresee, and thus considerations of a relatively distant future, gain stronger and stronger influence on all activities to be undertaken here and now.
(Time: An Essay, p. 144)
Improved methods of keeping time evolved in many different contexts. Monks needed to know when to pray, so they developed various methods, including the ringing of bells. Travelers needed to schedule their departures and arrivals more carefully, and increasingly elaborate clocks were built, some using carefully controlled drips of water, while others used falling weights.
Precise clocks were particularly important for navigators, who needed them to calculate their longitude, or how far east or west they had traveled. Once ships began to travel around the globe, from the late 15th century, the need for accurate timekeeping was well recognized. Indeed, the British Royal Observatory at Greenwich was commissioned in 1675 to help solve this problem. In 1714 the British government offered a prize of £20,000 (nearly $5 million in today’s money) to anyone who could build a clock that stayed accurate to within two minutes during long ocean voyages. Yorkshire-born carpenter and clockmaker John Harrison spent most of his life on the task and finally received the last of his reward money in 1773, three years before he died.
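The arithmetic behind Harrison’s problem is simple: the Earth turns 360 degrees in 24 hours, or 15 degrees of longitude per hour, so comparing local solar noon with a clock still keeping home-port time tells you how far east or west you have sailed. A minimal sketch of that calculation (the function name and setup are illustrative, not a historical procedure):

```python
from datetime import timedelta

# Earth rotates 15 degrees of longitude per hour (360 / 24).
DEGREES_PER_HOUR = 360 / 24

def longitude_from_clock(home_time_at_local_noon: timedelta) -> float:
    """Degrees of longitude west (+) or east (-) of the home port.

    `home_time_at_local_noon` is the reading of a clock still keeping
    home-port time at the moment the Sun is highest in the local sky.
    """
    hours_offset = (home_time_at_local_noon
                    - timedelta(hours=12)).total_seconds() / 3600
    return hours_offset * DEGREES_PER_HOUR

# If the home clock reads 14:00 at local noon, local noon came two hours
# later than at home, so the ship is 30 degrees west of its home port.
print(longitude_from_clock(timedelta(hours=14)))  # → 30.0
```

An error of two minutes on the clock translates to half a degree of longitude, which is roughly what the Longitude Prize demanded.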
In the 19th century, the invention of railways and steamships — and their widespread use — required entirely new levels of precision. With so many passengers and so much cargo relying on transportation lines, on-time departures, connections, and arrivals were critical to the whole network. The first English train timetable was published in 1839 and, for the first time, different British cities needed to coordinate their clocks to the same national clock, that of Greenwich Mean Time (GMT), the time at the Royal Observatory. But not until 1880 was Greenwich Mean Time adopted officially throughout Britain. In the United States, railroads introduced regional time zones in 1883, but they were not written into federal law until 1918. At about the same time, the idea of daylight saving was introduced in numerous countries around the world.
International steamships required equally precise coordination across the entire globe. Not until 1929 did most countries begin to link their local time to Greenwich Mean Time — and the Himalayan mountain nation of Nepal waited until the 1980s.
In today’s world of international plane schedules and electronic bank transfers, we need even greater precision, levels so high they can be thrown off by tiny alterations in the rotation or orbit of our Earth. So now, timekeeping depends less on measurements of astronomical phenomena and more on complex devices such as atomic clocks, which measure time using signals emitted by electrons as they change energy levels.
One final breakthrough in timekeeping was particularly important for big history. That was the invention of “radiometric” dating, a suite of techniques for dating past events by measuring the breakdown of radioactive materials.
Before about 1950, the only way to assign an absolute date to a past event was to use written records, and of course these could not be used for any date more than a few thousand years ago. The first workable method of radiometric dating, devised by American chemist Willard Libby in the late 1940s, used the breakdown of an isotope of carbon, C14, to date materials containing carbon. Since then, a whole range of new dating techniques have been developed, and they can now give us reasonably accurate dates for events reaching back to the Big Bang, 13.8 billion years ago.
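Libby’s method rests on the exponential decay law: carbon-14 has a half-life of roughly 5,730 years, so the fraction of it remaining in a dead organism’s tissue fixes the time since death. A minimal sketch of the core calculation (the function name is mine, and real radiocarbon labs apply calibration corrections that this toy version ignores):

```python
import math

# Carbon-14 decays with a half-life of about 5,730 years. Living tissue
# keeps a steady C14 level; after death it decays, so the remaining
# fraction tells us how long ago the organism died.
C14_HALF_LIFE_YEARS = 5730.0

def radiocarbon_age(fraction_remaining: float) -> float:
    """Estimated age in years from the fraction of original C14 left.

    Inverts N(t) = N0 * (1/2)**(t / half_life):
        t = -half_life * log2(N / N0)
    """
    return -C14_HALF_LIFE_YEARS * math.log2(fraction_remaining)

print(round(radiocarbon_age(0.5)))   # → 5730  (one half-life)
print(round(radiocarbon_age(0.25)))  # → 11460 (two half-lives)
```

Because the remaining C14 becomes too scarce to measure after many half-lives, the method works for samples up to roughly 50,000 years old; older events need other isotopes with longer half-lives.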
Accurate timekeeping and recordkeeping are the foundation for histories of all kinds, including big history! Next time you fly or take a bus, be grateful that your pilot or driver is not planning to arrive at your destination any old time in the next week or two!

Sources

Elias, Norbert. Time: An Essay. Oxford, UK: Blackwell, 1992.
Marshack, Alexander. “Hierarchical Evolution of the Human Capacity: The Paleolithic Evidence.” James Arthur Lecture on the Evolution of the Human Brain, no. 54. New York: American Museum of Natural History, 1985.
Whitrow, G.J. Time in History: Views of Time from Prehistory to the Present Day. Oxford, UK: Oxford University Press, 1988.