This is part four of a multi-part series on the origins of the time divisions of a day.

[Part 1: Hours] [Part 2: Minutes] [Part 3: Seconds]

Tenths

If we extend the pattern we see with hours, minutes, and seconds, then seconds should be subdivided into 60 thirds. Instead, we subdivide them into 1000 milliseconds. Imagine what a world where we used thirds would be like. Movies would run at 0.4 fpt (frames per third), fluorescent lights would flicker at 1.67 or 2 fpt, and games would strive for 1 fpt.
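The fpt figures above follow from simple division, since a third would be 1/60 of a second. A minimal sketch (the function name is illustrative, not from any standard):

```python
# One "third" = 1/60 of a second, continuing the sexagesimal pattern.
THIRDS_PER_SECOND = 60

def fps_to_fpt(frames_per_second):
    """Convert a rate in frames per second to frames per third."""
    return frames_per_second / THIRDS_PER_SECOND

print(fps_to_fpt(24))   # film at 24 fps -> 0.4 fpt
print(fps_to_fpt(100))  # flicker on 50 Hz mains (100 Hz) -> ~1.67 fpt
print(fps_to_fpt(120))  # flicker on 60 Hz mains (120 Hz) -> 2.0 fpt
print(fps_to_fpt(60))   # games at 60 fps -> 1.0 fpt
```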

In the world we live in, decimalization and metrication became symbols of reason and revolution in 1790s France and America. Decimal time was officially enacted in France in 1793. Each day would be divided into 10 hours. Each hour would be subdivided into 100 minutes, and each minute would be subdivided into 100 seconds. The resulting 100,000 seconds per day is close enough to our 86,400 seconds per day that decimal seconds feel surprisingly reasonable. Decimal hours and minutes do not. The revolutionary regime gave up trying to enforce the unpopular change in 1795. (Swatch would resurrect decimal time 200 years later as Swatch Internet Time. This was also not widely adopted.)
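The 1793 scheme maps cleanly onto the standard day, since both systems agree on the day as the base unit. A sketch of the conversion (the function name is illustrative; rounding to the nearest decimal second is an assumption for simplicity):

```python
# 1793 French decimal time: 10 hours/day, 100 minutes/hour, 100 seconds/minute.
STANDARD_SECONDS_PER_DAY = 24 * 60 * 60   # 86,400
DECIMAL_SECONDS_PER_DAY = 10 * 100 * 100  # 100,000

def to_decimal_time(hours, minutes, seconds):
    """Standard clock time -> (decimal hours, decimal minutes, decimal seconds)."""
    standard = hours * 3600 + minutes * 60 + seconds
    day_fraction = standard / STANDARD_SECONDS_PER_DAY
    total = round(day_fraction * DECIMAL_SECONDS_PER_DAY)
    dec_hours, rem = divmod(total, 100 * 100)
    dec_minutes, dec_seconds = divmod(rem, 100)
    return dec_hours, dec_minutes, dec_seconds

# Noon is halfway through the day in either system:
print(to_decimal_time(12, 0, 0))  # (5, 0, 0)
```

One consequence worth noticing: a standard second is 100,000/86,400 ≈ 1.157 decimal seconds, which is why the two kinds of second feel similar in everyday use.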

Unlike decimal seconds, decimal meters and decimal grams (and decimal cents!) did successfully spread. By the mid-1800s, there was a clear consensus that any subdivision of seconds should be decimal, like the other metric units. Tenth was a long-established English word for the fraction. English ten (and its derivative tenth) shares an origin with German zehn and Dutch tien, rather than French dix, Spanish diez, or Latin decem. Preserving the native term was not a foregone conclusion. For example, English generally prefers the Latin word percent (short for per centum, meaning “by the hundred”) and derived terms percentage and percentile over its native hundredth (German hundert, Dutch honderd).

Painting of a horse race titled “The 1821 Derby at Epsom”, by Théodore Géricault. Four male jockeys ride racing horses across the English countryside against a backdrop of dark clouds. The horses have all of their legs outstretched off the ground in the “flying gallop” pose.

California tycoon Leland Stanford (who founded Stanford in 1885) commissioned an 1878 photo series called The Horse in Motion. Automatic electrophotographs captured how horses actually moved, revealing errors in human observation. The commonly depicted “flying gallop” pose, airborne with all four legs outstretched, was never actually a part of a running horse’s gait. The indifferent machine proved that what we thought we saw with our eyes was wrong. The photos were an international sensation, warranting a cover story in Scientific American.

In 1823, astronomer Friedrich Bessel observed consistent errors while recording very precise event timings. He created the concept of the “personal equation” to explain the errors: different people had different, but predictable, reaction times. There’s a lot of individual variation, but it takes most people 0.2 to 0.3 seconds to notice a stimulus and act in response. This raised a philosophical dilemma. What was the point of being human if impartial machines were better at seeing than we were?

The shift to impressionist and pointillist styles of painting illustrated a similar reaction to the spread of photography. If a camera could capture the reality of a landscape or stern expression more perfectly than any human could hope to, why compete directly? Why not do something else that machines couldn’t do instead?

Official times at the 1896 Olympics were kept by individual referees with stopwatches. This was barely enough precision to keep records to the tenth of a second. People tried many different approaches over the following decades to achieve hundredth-of-a-second accuracy for Olympic records, but kept running up against the limits of human reaction time. Any precision beyond tenths would require that humans be removed from the loop entirely.