Measure for Measure Part I: Time and the Second

The first installment of a blog delving into the origin of different units of measurement.

Upon tackling units of measurement, it seems clear to me that the only possible place to start is with time and the second. Over the last century, many fundamental units have become entangled with time, bonding the second to length, mass, current, temperature, and luminous intensity. Second to none, it’s the most important unit, although it’s fair to say its definition and history aren’t common knowledge. Evolving over millennia, it’s been a tricky unit to pin down, as time can’t be seen nor touched. The only way humans can measure its passage is by observing a constant physical process and stating something along the lines of “That waterfall has filled up my barrel, it’s been 2 minutes mate”. So I guess the question is: what is a second? And how did it even come about when, for most of human history, it couldn’t be measured?

We first turn the clocks back to around 3500 B.C. and the Sumerians. They found that 60 is quite a mathematically attractive number: divisible by 2, 3, 4, 5, 6, 10, 12, 15, 20, and 30, it makes for easy calculations, especially when you don’t have a calculator to hand. The Babylonians later adopted this sexagesimal system, beginning its widespread use; however, the fascination with 60 doesn’t end in Mesopotamia. Many regions across Asia also used base-60 and base-12 (duodecimal) systems. A reason for this may be that, using your thumb as a pointer and the three segments on each of your four fingers, you can keep track of numbers up to 12. Throw your other hand into the mix and you can keep track all the way up to 60. At this point it’s worth mentioning this all happened a long time ago, so it’s hard to gauge exactly how these systems came about, but this is our best understanding so far.
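For fun, the finger-segment scheme above can be sketched in code. The function name and the (dozens, segment) encoding are my own illustration, not anything historical:

```python
# Finger counting as described: the thumb points to one of the 12
# segments on the four fingers of one hand (3 segments x 4 fingers),
# while each of the 5 digits on the other hand tallies a completed
# dozen -- 5 dozens x 12 segments = 60.
def to_finger_count(n: int) -> tuple[int, int]:
    """Return (dozens on the second hand, segment on the first hand) for 1 <= n <= 60."""
    if not 1 <= n <= 60:
        raise ValueError("this scheme only tracks 1 to 60")
    dozens, segment = divmod(n - 1, 12)
    return dozens, segment + 1

assert to_finger_count(12) == (0, 12)  # one full hand of segments
assert to_finger_count(60) == (4, 12)  # four raised digits plus a full hand
```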

Linking the duodecimal and sexagesimal systems back to time brings us to the Ancient Egyptians. They’re widely credited as the first civilisation to use sundials, enabling the day to be divided into shorter periods of time. Circa 1500 B.C., the Egyptians were calibrating their sundials to divide daylight into 12 portions. These rudimentary hours can’t really be classed as a standard unit of time, as their length varied throughout the year, with summer hours longer than winter hours. Though useful through the day, night-time posed an obvious challenge: the lack of sunlight. Consequently, astronomers turned to the stars and began tracking 36 constellations spread evenly across the sky, 18 visible each night and 18 obscured by daylight. The 18 visible constellations rose at roughly 40-minute intervals throughout the night, and when a new one appeared after previously being lost in daylight (its heliacal rising), it marked the start of a new 10-day period. This led to them being known as decans and is also where the Egyptians got their 360-day year from (36 decans × 10 days). The twilight hours on either side of the night made about six of these decans difficult to see, and later, in the New Kingdom, they were dropped, with the remaining 12 marking the hours of the night. Fast forward 1,000 years to the Hellenistic period, and Hipparchus suggested that hours be defined from observations taken on equinox days, resulting in 24 equal hours.

Now we have a working definition of an hour, we can start to understand the minute. Alongside his work on defining the hour, Hipparchus was looking for a better way to determine where someone was on Earth. He improved on Eratosthenes’ ideas on latitude and introduced longitude as a way to specify your east–west position relative to Alexandria. Around three hundred years later, Ptolemy arrived on the scene and further improved this geographic system. In the Almagest he divided the degree into 60 smaller parts called “partes minutae primae”, or “first small parts”, with minutae, meaning small, becoming the origin of the word minute. This wasn’t precise enough for him, though, so he went one step further and divided each minute into 60 “partes minutae secundae”, or “second small parts”, with secundae becoming the origin of the word second. It would be another thousand years before these smaller subdivisions of minutes and seconds were used to describe the passage of time.

Al-Biruni was a scholar working circa 1000 A.D. in Ghazni, in modern-day Afghanistan. Aware of Ptolemy’s work, he often improved upon it, applying a more rigorous scientific technique to his empirical observations. He used lunar cycles to track the passing of time, defining days and hours with them. He then went one step further and applied Ptolemy’s concept of minutes and seconds to the hour, becoming the first person to define minutes and seconds with respect to time: the birth of our modern-day system.

In practice this system wasn’t very useful, as people at the time didn’t have a practical way to consistently measure equal hours, let alone minutes and seconds. Water clocks were used widely around the world, and in the third century B.C. Archimedes created the first known geared clock, though it still needed water to run. A variety of these mechanical water clocks were built, but they had to be reset frequently by refilling the water. Fully mechanical clocks are thought to have been around since the late 1200s, with several English churches having them installed. From this point clocks continued to become more accurate and reliable, providing a standard measure of an hour, mainly so people could pray at the correct times. In the late 1400s minute hands appeared on a number of clock faces, with second hands creeping in throughout the 1500s, though it must be said they were rare and inaccurate.

Seconds could only be tracked somewhat accurately once Galileo conceptualised the pendulum clock around 1640. Though he never managed to finish it, Huygens stepped up to the plate and realised the design in 1656, patenting the technology and reducing error from around 15 minutes a day to 15 seconds. A standard pendulum length of 0.994 m began to crop up, which results in a swing and its return taking two seconds, enabling a widely shared, consistent second. So-called seconds pendulums were often encased in wood and glass, becoming known as grandfather clocks.
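That 0.994 m figure falls straight out of the small-angle pendulum formula, T = 2π√(L/g). A quick check (the value of g here is an assumed standard surface gravity; it varies slightly with location, which is why the “standard” length was only approximate):

```python
import math

# Small-angle period of a simple pendulum: T = 2*pi*sqrt(L/g).
g = 9.81   # m/s^2, assumed standard surface gravity
L = 0.994  # m, the seconds-pendulum length mentioned above

T = 2 * math.pi * math.sqrt(L / g)
print(f"Full swing and return: {T:.3f} s")  # ~2.000 s, i.e. one second per swing
```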


A scientific compensation clock at Bristol Museum. Mercury contained within the clock helps “compensate” for the expansion and contraction of the pendulum with temperature change.

Until the 20th century, pendulum clocks were the standard for timekeeping. Though inventions like the hairspring made smaller clocks such as pocket watches feasible, the focus was still on accuracy. By the mid-18th century pendulum clocks were keeping time to within a few seconds each week. Consequently, standard unit systems were being discussed in scientific communities, and in 1862 the British Association for the Advancement of Science (BAAS) stated, “All men of science are agreed to use the second of mean solar time as the unit of time”. Using the top clocks of the day, the second was defined as 1/86,400 of a mean solar day, tying our definition of time to how fast the Earth spins.
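That 1/86,400 figure is just the Egyptian and Babylonian divisions stacked up, as a quick check shows:

```python
# 24 hours per day, 60 minutes per hour, 60 seconds per minute --
# the sexagesimal heritage multiplied out.
hours_per_day = 24
minutes_per_hour = 60
seconds_per_minute = 60

seconds_per_day = hours_per_day * minutes_per_hour * seconds_per_minute
print(seconds_per_day)  # 86400, hence the second as 1/86,400 of a mean solar day
```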

This was a great definition for the time, but in the 1920s the first quartz clock was constructed. Quartz is piezoelectric, meaning that when a voltage is applied it changes shape. Pieces of quartz with the same shape and size, under the same applied voltage, will oscillate at the same rate: the fundamental frequency. As a consequence you can have clocks, even wristwatches, that keep time to within seconds per year. With timekeeping this accurate, scientists discovered that Earth’s rotation isn’t actually constant. To define the second more accurately, a more stable physical phenomenon was needed. We can find this in Earth’s orbit: although the length of the tropical year drifts slowly, the rate of that drift can be accounted for. Therefore, in 1956 the second was redefined as 1⁄31,556,925.9747 of a tropical year, specifically the year 1900. The definition was adopted into the International System of Units in 1960. It was short-lived.
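As a sanity check on that unwieldy number, 31,556,925.9747 seconds should come out very close to the familiar ~365.2422-day tropical year:

```python
seconds_per_day = 86_400
tropical_year_seconds = 31_556_925.9747  # the 1956 definition's denominator

days = tropical_year_seconds / seconds_per_day
print(f"{days:.4f} days")  # ~365.2422
```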

At the same time, atomic physics was progressing at a breakneck rate. Much time and effort went into the construction of atomic clocks, with the first completed in 1955 by Louis Essen. Before this, many clocks based on electromagnetic phenomena were conceived, but none as reliable as manipulating the hyperfine transition in a caesium-133 atom. All this really means is that two energy levels within the caesium atom are extremely close together and, when excited by electromagnetic radiation, the atom switches between them. After a very short time, the atom re-emits the energy and returns to its previous level. It just so happens that this process is incredibly stable and we can measure it well. Since 1967, the official definition of the second has been 9,192,631,770 periods of the radiation corresponding to this hyperfine transition, measured so precisely that modern-day atomic clocks are accurate to 1 second in 20 million years. It’s been described as “the most accurate realization of a unit that mankind has yet achieved” and now underpins six of the seven base SI units. So I guess I’ll be banging on about caesium atoms for at least the next five articles.
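To put “1 second in 20 million years” into perspective, we can turn it into a fractional error, the figure clock-makers actually quote:

```python
caesium_hz = 9_192_631_770      # periods of the transition that define one second
seconds_per_year = 31_556_926   # ~365.2422-day year, rounded

# A clock that loses at most 1 second over 20 million years:
fractional_error = 1 / (20_000_000 * seconds_per_year)
print(f"~{fractional_error:.1e}")  # roughly 1.6e-15
```

In other words, the clock's tick rate is trusted to about fifteen decimal places, which is why the caesium transition could anchor the rest of the SI.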
