A brief history of time units: Seconds

This is part three of a multi-part series on the origins of the time divisions of a day.

[Part 1: Hours] [Part 2: Minutes]

Seconds

In 1707, four British warships miscalculated their position and ran aground off the Isles of Scilly, killing 2,000 sailors in what is still one of the worst maritime disasters in British history. At the time, for lack of a better approach, ships calculated their longitude by dead reckoning: essentially guessing how far they’d traveled from the last known position. The disaster led to the Longitude Act of 1714, which established a £20,000 bounty (over £4M today) for a practical, reliable way to determine a ship’s longitude to within 30 arcminutes (half a degree).

This is relevant to our history because one theoretical approach to determining longitude was to observe the sun to find local solar noon, then compare that moment against the current time at a known reference location, such as the Greenwich Royal Observatory. Since the Earth turns through 15 degrees of longitude per hour, knowing the exact reference time would let you calculate your exact longitude from that single observation. The problem with this approach was that no one had any way to keep the exact time at sea. Our marvelous pendulum clocks wouldn’t work on the open ocean, because of the motion of the waves.

The problem lit a fire in the imagination of English clockmaker John Harrison, who presented his first design for a marine chronometer to the Royal Society in 1730. A lifetime of iteration and experimentation culminated in his fourth attempt, which finally met the prize’s standard in its 1761 sea trial. On that maiden voyage, his H4 chronometer drifted just 5 seconds over the course of 81 days, enabling a longitude fix accurate to within 1.25 arcminutes.
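
If you want to check that arithmetic, here’s the back-of-the-envelope conversion from clock error to longitude error, as a rough sketch assuming a uniformly rotating Earth (and a perfect noon sighting):

\[
\frac{360^\circ}{24 \times 60 \times 60\ \text{s}} = \frac{360^\circ}{86{,}400\ \text{s}} = 15''\ \text{of longitude per second of clock error},
\qquad
5\ \text{s} \times 15''/\text{s} = 75'' = 1.25'
\]

That 1.25 arcminutes sits comfortably inside the Longitude Act’s half-degree (30 arcminute) requirement.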

As alluded to earlier, the word second comes from classical Latin secunda, in particular its presence in the Latin phrase pars minūta secunda, meaning “second very small part”. We could have called minutes “primes” or “firsts” instead to maintain internal consistency, but that is not what we did.

Scan of a diagram laying out the inner workings of the H4 chronometer. Several figures illustrating different cross-sections of the clock and its mechanisms are labeled with letters.

The delicacy and expense involved in the construction of a device so precise made this solution barely affordable for even the richest country in the world. One chronometer cost one-third the price of an entire ship. Ordinary people wouldn’t have access to clocks accurate to the second until interchangeable parts and mass-manufacturing methods made them affordable in the mid-1800s.

The first implementation of the metric system, adopted by French revolutionaries in 1799, only covered base units of distance and mass (meters and kilograms), not time. In 1832, Carl Gauss proposed the second, officially defined as 1/86,400 of a day, as the standard metric unit of time in his millimeter/milligram/second measurement system. Over the next thirty years, his proposal found wide approval among scientists. The second was formally adopted as an official metric unit under the centimeter/gram/second measurement system, proposed in 1874 by a committee that included James Clerk Maxwell, Lord Kelvin, and James Joule.

Miniaturization followed mass production, resulting in the creation of pocket watches. Widely available second accuracy enabled new levels of coordination unviable just decades earlier. Military watches were first issued to officers in the 1880s. At the beginning of a battle, everyone would synchronize their watches to keep their maneuvers coordinated. In particular, it was important to agree on the exact second to expect an artillery strike. The phrase “synchronize your watches” is an artifact of the second era. You might not know the absolute time, but you could act like you did if you kept everyone’s relative times the same.

If you did in fact need to know the absolute time, observatories began broadcasting official time signals by telegraph in the 1850s. This wasn’t a service the average person would have access to; that began with the appearance of radio time stations in 1924. Time hotlines that you could telephone for an accurate time signal started operating in 1933, gradually displacing the radio time stations.

Wristwatches, then called “bracelet watches”, were originally a women’s fashion accessory. They only lost their genderedness after military wristwatches entered common use around the world in the 1880s and 1890s. Newly verified as manly, men’s wristwatches became a standard fashion accessory in the 1900s. They remain the most popular men’s jewelry item to this day.

Clubs, really? That's what you call those?

What’s the deal with playing cards? Looked at as something you’re learning about for the first time, rather than something you’ve known all your life, they start to seem pretty weird. Clubs are clearly actually clovers, and spades look more like speartips. What’s a jack, and why is it on the same level as a king or queen? Earlier this year, I was surprised to learn that the last common ancestor of every tradition of playing cards was likely Chinese money cards in the 1300s.

In Xanadu did Kublai Khan the first national paper money decree. Chinese regional and national governments had tried to legitimize paper currency several times since Bi Sheng first invented movable type around 1044. The ability to cheaply print a different serial number on each banknote was the innovation that made this theoretically possible. Most of the earlier attempts to issue paper money had ended either in hyperinflation, or in the entirely reasonable refusal to accept paper money over proper coinage after all that hyperinflation. Kublai Khan’s 1260 decree finally made paper money more widely circulated than coinage for the first time.

Like everything else, people found a way to gamble with paper money. But they also found it inconvenient and risky to play games with actual paper money. So some people printed up fake Monopoly money to gamble with instead. And some people came up with gambling games you could play with nothing but the Monopoly money itself. The earliest one we have a record of is 馬弔 (mǎdiào), from the 1300s. A set of mǎdiào cards contains 38 cards in 4 suits:

  • Cash, representing hundred-wen coins, numbered from 1 to 9.
  • Strings of coins, representing thousands of 1-wen coins, numbered from 1 to 9.
  • Myriads, representing 10,000s of wen, numbered from 1 to 9.
  • Tens of myriads, representing 100,000s of wen, numbered from 2 to 9, plus three extra cards numbering 10, 100, and 1000.

Photo of eleven madiao cards in front of a torn page incorrectly identifying them as “Japanese playing cards”. The cards are illustrated and printed in stark black on white.

This is clearly a direct representation of amounts of money. (Chinese coins had holes in the middle precisely so they could be strung together; strings of coins were well known as the traditional way to carry large amounts of money.) You can find descendants of this groundbreaking deck in playing card traditions throughout Southeast Asia, as far off as Indonesia. The part that matters to our history of English playing cards is their spread west to the Kipchaks in Central Asia. What later evolved into the Mamluk Egyptian tradition of playing cards featured the following suits:

  • Coins, as the cash cards were illustrated with coins.
  • Polo sticks, possibly the closest cultural referent to strings of coins?
  • Cups, resembling the Chinese character for myriad (万) when seen upside-down. The name of the suit is the Turkic word for “myriad”.
  • Swords

In the Mamluk deck, each suit contains 12 cards: ten pip cards numbered from 1 to 10, plus two face cards representing a king and a viceroy or deputy king. A third face card representing an under-deputy was added later.

We can tell that by the late 1300s the Mamluk deck had spread to Eastern Europe and Islamic Spain, because laws banning its use start appearing there. Spanish playing cards descend from this tradition, and they remain widely used throughout Central and South America. Spanish cards originally came in 52-card decks, later reduced to 48. They display the four suits of:

  • Oros (coins)
  • Bastos (clubs), the closest cultural referent to polo sticks
  • Copas (cups)
  • Espadas (swords)

Each suit is numbered from 1 to 9, the 10 having been dropped later for ease of printing, and has three face cards: the rey (king), caballo (knight), and sota (page or squire). Notably, Spanish playing cards were the first variety to spread in England. When they were supplanted by French playing cards, the English names for two of the suits kept their Spanish associations: clubs (translating bastos) and spades (from espadas).

By 1480, French playing cards had developed their own identity, influenced by the nearby German suits of leaves, hearts, bells, and acorns. The French suits drastically simplified the ornate illustrations found on German cards down to our familiar icons:

  • Carreaux (tiles), from bells
  • Trèfles (clovers), from acorns
  • Cœurs (hearts)
  • Piques (pikes), from leaves

The face cards here are our familiar king, queen, and knave (originally a word for a boy or servant). In English, the knave was renamed the jack in 1864, as J was easier to distinguish from K and Q than Kn was. Jack then meant what “guy” does now: a generic word for a man.

Owing to its prevalence in the UK, US, and Commonwealth, the English deck is widely known today, and it is the ancestor of many newer playing card traditions. A notable exception is Japanese playing cards (hanafuda), which oddly aren’t descended from the nearby Chinese tradition either, but are instead based on Portuguese playing cards introduced in the late 1500s.

katsu

The culinary term katsu first appears in English, describing a style of curry, in 1976. Its standalone usage to mean just the cutlet begins much later, in 2006. It’s borrowed from Japanese カツ (katsu), meaning “cutlet”. カツ first appears in Japanese in 1928 as a clipping of カツレツ (katsuretsu), also meaning “cutlet”. カツレツ, in turn, first appears in Japanese in 1884, borrowed from English cutlet. Wait, what?!

The international expedition 岩倉使節団 (Iwakura Mission) was an 1871-1873 Japanese diplomatic voyage to Europe and the United States. Its primary goal, renegotiation of the unequal treaties forced on Japan at gunpoint, was not successful. Its secondary goal, learning more about industrialized societies to better enact Japan’s modernization, was wildly successful. One of those secondary results was the popularization of 洋食 (yōshoku), meaning “Western food”. トンカツ (tonkatsu), fried breaded pork cutlets, is regarded as one of the 三大洋食 (sandai yōshoku), meaning “three great Western foods”, dating to this period. The other two are curry rice and croquettes.

In 1852, US President Millard Fillmore assigned Navy Commodore Matthew Perry a mission to open trade with Japan by any means necessary. Since 1633, Japan had operated under a policy of 鎖国 (sakoku), meaning “locked country”, closing its borders to shut out foreign influence. In 1853, Perry sailed four black-hulled warships, two of them steam frigates, into Edo Bay on the doorstep of the capital, firing blank shots as a show of intimidation against shore defenses that had no answer for them. The Black Ships incident lives on in Japanese cultural memory as the paradigmatic out-of-distribution event: four invulnerable alien vessels arrive on the horizon, bearing unreasonable demands.

Katsu also has a Japanese homophone of much earlier origin, 喝. This sort of katsu is either a loud exclamation or a Zen Buddhist interjection for scolding. It’s akin to the 気合 (kiai) shouted in martial arts traditions while performing an attack. 喝 is a borrowing from Chinese 喝 (hè, but pronounced xat in Middle Chinese), meaning “to shout”.

Photo of a katsu curry served at the Japanese equivalent of a diner. A sliced katsu cutlet, brown curry sauce, white rice, and bright red fukujinzuke are the prominent elements.

Anyway! The culinary term cutlet first appears in English, describing a cut of meat, in 1706. It’s borrowed from French côtelette, meaning “cutlet”. Côtelette, earlier rendered in Middle French as costelette, first appears in French in 1393 as a diminutive form of Old French coste, meaning “rib” or “side”. Coste is a direct descendant of Classical Latin costa, meaning “rib”, which is of unknown origin. Notably, costa has no historical connection to the word “cut” or its meaning; contemporary origin theories instead include a word meaning “bone”.

Folk etymology led to the English spelling cutlet, as the relationship with a cut of meat seems obvious. (A cut of meat is named for being a piece cut from a larger whole, and starts appearing with that meaning in 1591.) Thanks to French, -let is conveniently also an English diminutive, so a cutlet could be presumed to be named after a little cut of meat.

In 1706, cutlet referred specifically to a cut of veal. The term gradually expanded to encompass pork in the 1800s and chicken in the 1900s. The modern breaded and fried preparation is likely based on the Austrian dish Wiener Schnitzel, with additional influences from Italian and French dishes. Wiener is German for “Viennese”, after the dish’s city of origin. Schnitzel is German for “cutlet”, with the perfectly parallel construction of Schnitz, “a cut-off piece”, from schneiden, “to cut”, plus the diminutive -el. While we also sometimes call hot dogs wieners (or later, weenies), that comes from a clipping of “Viennese sausage”.

Left behind

I am not immune to propaganda. But I do seem to be more resistant to it than average. Recently, when (not if) the conversation turns to effective use of LLMs, I have been met by genuine surprise that I don’t use them yet. Aren’t I a technologist? Shouldn’t I be worried about getting left behind as a member of the perpetual underclass, or worse, an older adult who doesn’t understand phones?

I am not. This steadfast belief comes from the reason that I became a technologist in the first place. That’s right, it’s my dad.

My dad loved technology. He bought a portable computer in 1986. You would injure yourself if you tried to put that thing on your lap. My childhood memories include a background specter of my dad filming everything with his shoulder-mounted video recorder. The closest visual analogy I can draw on today would be an equally ancient boombox. In 2017, he insisted on buying a shiny new iMac despite my pleas to stick with a secure-by-design, normie-friendly Chromebook or iPad.

My dad was really bad with technology. That’s unfair, actually. He was a great mechanic and woodworker, doing all of our auto maintenance himself except for the really gnarly jobs. He taught me how our lawnmower worked, inside and out. What he was really bad with was computers. Something about the difference between mechanical devices, which have one job and are built from single-function parts in service of that job, and electronic devices, which can do anything you ask them to, made the latter completely incomprehensible to him.

Well, somebody had to make our perpetually growing collection of computers behave. That task naturally fell to me as I got older and more capable. I grew up with an intuitive understanding of how to computer, matched only by an ironic helplessness around mechanical devices. You can just imagine my dad’s frustration at my inability to understand a simple oil change. This computer whispering wasn’t a chance outcome. My parents, immigrant factory workers both, were convinced that computers would be the future and steered me toward them. Before I was born, they committed to this plan to give their children a chance at a better life than their own.

Eliding many years of hard work and lucky breaks, that’s the story of how I came to be this odd duck of a technologist. I’m surrounded by early adopters and innovators who make my early-majority inclinations feel like Luddism at times. But my upbringing imbued me with a deep-seated habit of carefully weighing the pros and cons and waiting, by default, until a shiny new thing is more mature before buying in. It also gives me a useful perspective in strategy meetings. I wouldn’t even call it conservative. Just having a moderate voice in the room advocating for simplicity and proven, reliable solutions over what’s hot right now adds a lot of value.

And it seems to be working out fine? I didn’t get a cell phone until 2004. I got a music streaming account in 2021. I still don’t have a tablet or smart TV. And despite my constant anxieties about what people think of me, I don’t think anyone thinks of me as an out-of-touch old fogey. I get bombarded with the same fears about falling behind as everyone else, but I seem to have developed a natural resistance to them. Thanks, Dad. I miss you dearly.

Infinite Lives, 2010: Super Meat Boy

An exemplar of the precision platformer genre it popularized, Super Meat Boy simultaneously hates you and wants you to suffer, and believes in you and wants you to triumph. This shines through in the replay the game shows you at the end of each tiny level. The replay overlays every single attempt you made at the level. It might start with a hundred overlapping meat boys, then viscerally illustrate their winnowing at each deadly obstacle in turn, until only one remains triumphant at the end. It perfectly captures the rush of “I can’t believe how ridiculous that level was!” and “Holy crap I am so awesome!” at the same time. The game turns the single-minded struggle that you have just experienced into art.

This is the first in a series of posts examining video game history by looking at one game I loved from each year, 1978–2027. I’ll add a navigation box here when there is more than one post in the series.

Super Meat Boy was the first game from two-person indie studio Team Meat, although it was a sequel to designer Edmund McMillen’s free 2008 Flash game Meat Boy. McMillen, then 30, had made dozens of free Flash games since 2001. He would later go on to design The Binding of Isaac (2011) and Mewgenics (2026). The other founder of Team Meat, programmer Tommy Refenes, went on to make other games under the Super Meat Boy IP instead. The game was composer Danny Baranowsky’s breakout hit. Devil n’ Bass, which plays when you claw your way out of literal Hell only to arrive in even worse double secret hell, is still in my regular playlist rotation.

Screenshot of a single-screen level from Super Meat Boy. It is set in Hell and features a river of lava and too many moving sawblades.

Part of what makes Super Meat Boy so compelling is its approach to failure. On death, you immediately reappear at the beginning of the level and can try again right away. McMillen’s signature horrifying nightmare world combines smoothly with the ideal that failure isn’t a big deal. Failure being so common only serves to highlight how unfair and brutal that world is. It would be another eight years before Celeste (2018) took that novel approach to its logical conclusion.

Another part of the appeal is the game’s focus on speed. Most (but crucially not all!) obstacles are timed so that an all-out sprint can barely clear them. You also get a prominent speed ranking at the end of each level. Even harder “dark world” versions of the levels are gated behind finishing the originals fast enough to earn a certified Grade A+. There are always more tantalizing challenges on offer if you want them. After many of my initial hard-won victories, I found myself immediately restarting the level to do it again, but better. Faster. Cleaner.

Super Meat Boy was originally released on Xbox Live Arcade, an apt reminder that 2010 was still early in the indie game renaissance. Its acronym, SMB, was selected as a homage to Super Mario Bros. (1985), the game that first defined the platformer. Super Meat Boy wears its classic video game influences on its sleeve, including warp zones that lead to levels demade for fantasy retro consoles and glitch art that evokes NES and Game Boy memory corruption. The game is also in conversation with its fellow indie precision platformers like VVVVVV (2010) and I Wanna Be The Guy (2007), even featuring guest characters from them.

2010 was also the breakout year of early mobile hits Angry Birds, Cut the Rope, and Fruit Ninja. The app store pricing race to the bottom was well underway. All three games were initially sold as standalone products with no in-app purchases for $1, which they did in fact make up for in volume.