On truth

My mother is a pathological liar. As you might imagine, this made growing up really confusing and weird sometimes. As an adult, I love her, but I don’t trust a single thing she says. This is absolutely exhausting to deal with. So I learned to rely on evidence that things actually happened the way she claimed. And I learned that when evidence is hard to come by, the next best approach is getting multiple outside perspectives to make it more likely I’ll notice any contradictions.

I hope I’m pretty well-adjusted by now, but I definitely didn’t escape unscathed. My sister and I both independently adopted a different maladaptive pattern in early adulthood. We refused to lie. We both knew firsthand how much it sucked to be lied to and were determined never to inflict that on anyone else. As you might also imagine, adding this constraint to our stereotypically dramatic teenage social lives made them that much harder.

Some advice I’ve seen repeated is that telling the truth is freeing and you should do it more. My current understanding is that this is good advice because most people tell the truth something like 98% of the time. If that’s the case, telling the truth 99% of the time probably will make your life significantly better. But I can personally report that going from telling the truth 100% of the time to telling the truth 99% of the time makes life so much easier that I can’t even begin to compare it.

All of this is to say that, at first through circumstance, then later through affinity and profession, I highly value truth-seeking. As with many deeply held personal values, it took me a while to learn that not everyone shares this value. It took me longer still to learn that sometimes there’s a lot of wisdom in not saying things that are nonetheless true and relevant to the current situation.

I’ve been an engineer and a consultant. One thing being effective in both fields has in common is that important decisions should be based on ground truth. And ground truth is not straightforward to get. On the engineering side, you can see shades of this in successful companies’ push toward data-driven decision making in the early 2010s. Each profession involved in a business has its own accumulated knowledge — this particular copy improves conversions, sales are about relationships, don’t write incriminating things in emails — all based on prior experience, and often in conflict. Part of the reason behind the conflicts is that what’s actually true changes over time. That is, while the laws of physics are the same regardless of whether Aristotle or Galileo is inferring them, the laws of society change over time. Slavery is bad. Kids should stay in school. Women should be able to vote and own property. Especially in the last 200 years, what people know and how people act have been a moving target, and accumulated wisdom can easily become invalidated without anyone noticing. The best way I know to counteract this tendency is to measure thoughtfully and carefully, then trust and doubt your measurements in equal measure.

On the consulting side, the problem is even worse, because people have a vested interest in misrepresenting the truth. If you’re a manager, you can walk the floor and see for yourself what customers and employees are doing and saying — though most managers lean on their reports’ testimony instead. If you’re a director, that’s a lot harder. It’s much easier to just ask your managers what’s true and mentally average it out. And if you’re an executive, it’s harder still. All you have to work with is multiple levels of direct reports selectively reporting just the parts of the truth that make them look good. One of the most effective things you can do as a management consultant is convince your clients that they’re not getting the full truth and it’s making their decisions worse, then suggest ways for them to get closer to it.

Galileo is legendary for recanting his heretical belief in heliocentrism under threat of the Inquisition, only revealing his true belief on his deathbed. Obviously the Earth doesn’t move; anyone can see that. His legendary last words were “e pur si muove”, typically translated as “and yet it moves”. As with George Washington and the cherry tree, supposedly only three hundred years ago, this is legend. It’s probably not true. But it can nevertheless be a source of inspiration. Fiction is often better at this than fact. It certainly inspires me. When times seem tough and I can’t see a clear path forward, I remember: e pur si muove. And yet it moves. You can declare me a heretic and imprison me. And it may take a hundred years or more. But the truth will out.

A brief history of time units: Hours

One writing project I want to spend time on at Inkhaven is a guided journey through the origins of the time divisions of a day. I’ve had the section on hours written up for months but never posted it publicly. On day -2 of Inkhaven, I’m taking another small step toward being more fearless about writing in public.


If space aliens showed up and compared notes with us, there are some concepts we think we’d share regardless of where they came from. Iron would still be iron, prime numbers would still be prime numbers, and so on. And seconds, minutes, and hours would seem like a weird arbitrary system. I thought your species had 10 fingers, what’s up with all the 12s?

So what is up with all the 12s, anyway? I decided to dig into the history of timekeeping to try and find out. We’ll go through the units from broadest and earliest to finest and most recent.

Hours

Photograph of an astronomical clock in Switzerland built perhaps around 1405. The clock only has an hour hand and Roman numerals from 1 to 12 twice going clockwise. Within the clock face is a smaller dial with the 12 zodiac animals and clockwork machinery in stark red, blue, and gold.

In the beginning, there was day and there was night. Influenced by 12 major constellations visible on the night of the annual flooding of the Nile, ancient Egyptians began dividing the night into 12 equal parts called wnwt around 2350 BCE. By around 1500 BCE, they’d extended the analogy to also divide the daylight hours into 12 different equal parts, measured with sundials.

So influenced, the classical Greeks also began dividing the day and night into two sets of 12 equal parts by around 150 BCE. Before that, the classical Greek day had 12 parts, but the night was divided into either 3 or 4 watches. While the word hour descends from classical Greek ὥρα (hṓrā), that word meant just “span of time” back then. The first clock that tracked hours across the full day-night cycle (νυχθήμερον, “nŭkhthḗmeron”) was constructed by Greek astronomer Andronicus Cyrrhestes around 50 BCE.

Of course, 24 hours were not the only top-level day divisions people came up with. Every major civilization had their own, from the 30 Vedic मुहूर्त (muhūrtaṃ) used from around 800 BCE to the 15 時 (shí) that divided up daylight hours in Han China by 139 BCE. Su Song’s 1094 CE water clock and Ismail al-Jazari’s 1206 CE programmable mechanical castle clock were each the most accurate clock in the world at its time of construction. This particular history will focus on the ancestors and influences of the h:mm:ss system we use today.

Classical Greek and ancient Egyptian timekeeping both used unequal hours: one daylight hour wasn’t the same amount of time as one night hour. This is typical of timekeeping systems before the early modern era. In a world where lighting is smoky, expensive, and flickering, it’s more important to know how many hours you have until sunset than it is to keep the length of an hour the same through the seasons. This only began to change when mechanical timekeeping became widespread. By 1400 CE, European cities began switching to “French” equinoctial hours so that they would only have to adjust the town clocks once a day, to correct drift, instead of twice a day, to change the length of the hour from day-hours to night-hours and back.
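To make the arithmetic of unequal hours concrete, here’s a tiny sketch. The helper name and the minutes-since-midnight convention are my own assumptions, not anything from a historical source; the only idea it encodes is the one above: divide daylight into 12 parts and darkness into 12 parts.

```python
# Unequal ("seasonal") hours: daylight is split into 12 day-hours and
# darkness into 12 night-hours, so the two hour lengths differ except
# at the equinoxes. Times are minutes since midnight (a hypothetical
# convention for this sketch).

def seasonal_hour_lengths(sunrise_min: int, sunset_min: int) -> tuple[float, float]:
    """Return (day_hour, night_hour) lengths in minutes."""
    daylight = sunset_min - sunrise_min
    darkness = 24 * 60 - daylight
    return daylight / 12, darkness / 12

# A long summer day: sunrise 04:30, sunset 19:30 -> 15 h of daylight.
day_hour, night_hour = seasonal_hour_lengths(270, 1170)
print(day_hour, night_hour)  # 75.0 45.0
```

A 15-hour summer day gives 75-minute day-hours and 45-minute night-hours; only on an equinox (12 hours of each) do both come out to 60 minutes, which is exactly what “equinoctial hours” froze in place year-round.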

piano

The musical instrument piano was borrowed into English from French piano around 1790. The French word piano is a clipped form of French pianoforte, borrowed from the Italian short name of the instrument. The Italian word pianoforte is short for “un cimbalo di cipresso di piano e forte”, meaning “a harpsichord of cypress with soft and loud”. It was invented and given its unwieldy name by Paduan instrument maker Bartolomeo Cristofori around 1700. The key innovation made by the piano over its harpsichord predecessors was that pianists could control how loud a note was by how hard they pressed the keys.

Piano design changed drastically during the instrument’s first 200 years of existence. Modern pianos, featuring 88 keys spanning just over seven octaves, typically differ only incrementally from the designs codified around 1890. Earlier pianos now go by the retronym fortepiano and had ranges that steadily increased from their initial four octaves to the contemporary seven. Any classical music that includes a piano was probably composed after its popularization in the 1740s. Piano wire has been so called since 1806.

The Italian word piano, meaning “soft” but also “flat” or “level”, descends from Latin plānus, meaning “level”. Plānus is also the ancestor of English plan and plain, both borrowings from the same French word (plain) in different centuries and circumstances. Looking at sibling languages, the English word that would have been used before it borrowed plain was probably flats.

The Italian word forte (superlative form fortissimo), meaning “loud” but also “strong”, descends from Latin fortis, meaning “strong” or “steadfast”. Fortis is also the ancestor of English fort and forte, both borrowings from the same French word (fort) in different centuries and circumstances. Looking at sibling languages, the English word that would have been used before it borrowed fort was probably fastness. Notably, the English word forte was originally two different words, one borrowed from French fort, meaning “strength” and originally pronounced as a single syllable, and the other borrowed from Italian forte, meaning “loud” and originally pronounced as two syllables.

Planned disruption of service

I’m delighted to be invited to Inkhaven, a writer’s retreat for bloggers. I’ll be writing onsite in Berkeley, CA for the month of April. They take an iron blogger approach to improving as a writer, so I’ve made a new blog category for Inkhaven posts. As such, I will either be posting daily, or be the first person ever kicked out of the retreat for not posting daily.

To borrow a phrase from the Recurse Center, my goal for the month is to become a dramatically better writer. I know I will necessarily struggle and flail on the way, and I hope you find the end result to be worth the temporary disruption.

yolo

The interjection yolo was popularized by its use in Canadian rapper Drake’s 2011 single The Motto. The titular motto was YOLO, pronounced as a word. The acronym YOLO, standing for “you only live once”, can be found sporadically as early as 1993, while the longhand phrase “you only live once” has been in the language for over 100 years.

Yolo’s popularity peaked in 2012, when it was beaten out at word of the year awards by words like gif and hashtag. The OED began including yolo as an English word worthy of documentation in 2016.