neapolitan
The ice cream flavor neapolitan first appears in 1868. Today, neapolitan ice cream almost universally implies vanilla, chocolate, and strawberry, but the three composing flavors were initially not standardized and would vary. One common trinity was pistachio, vanilla, and cherry, which resembled the newly established (1861) Italian flag. And so the flavor came to be named after Naples, despite its actual origin in Prussia. Admittedly, quality ice cream also had strong associations with Italy and its gelato-making legacy at the time.
As noted, neapolitan ice cream was actually invented by the royal chef of Prussia, Louis Ferdinand Jungius. In 1839, Jungius published a book of experimental recipes he’d prepared for King Frederick William III of Prussia at his Berlin estate. He described a Pückler as vertically layered ice cream flavored with strawberries, raspberries, greengages, cherries, and apricots. It was named after the Prussian nobleman Hermann Pückler, a name we can all be glad did not stick. By 1862, the suggested Pückler recipe instead had layers of apricots, quinces, strawberries, and raspberries. By 1903, the standard recipe called for the now familiar vanilla, strawberry, and chocolate flavors, though they were still stacked like a layer cake rather than arranged in our familiar horizontal sections. Why those three flavors? At the time, they were the three most popular ice cream flavors in the US, a fact that remains true today.
Why is Neapolitan the adjective form of Naples in the first place? Naples was founded in the 700s BCE as the Greek colony of Παρθενόπη (Parthenope). Two hundred years later, it was renamed Νεάπολις (Neápolis). By around 1000 CE, the systematic sound changes that turned Vulgar Latin into modern Italian had altered the city name to its contemporary Napoli. Meanwhile, different systematic sound changes over in France rendered the city name as Napples, then Naples by the 1400s, which is when the name was first borrowed into English. So the English name of the city is directly borrowed from French, while its English adjective form is based on its Classical Greek name.
Parthenope is a Classical Greek name that’s a straightforward compound of παρθένος (parthénos), meaning “virgin”, and ὄψ (óps), meaning “voice”. In Greek mythology, one of the Sirens bears the name, specifically the daughter of Terpsichore, the muse of dance, and the river god Achelous. In 1850, the eleventh asteroid ever discovered, 11 Parthenope, was named after the mythological siren. This fit the pattern of the first ten asteroids, all named for women in Greek and Roman mythology. Those first ten women were actually all goddesses, so perhaps its discoverer, Annibale de Gasparis, chose to honor a demigod instead because he was an astronomer at the University of Naples.
Neápolis is interesting to me precisely because of how uninteresting it is. It’s an even more straightforward compound of νέα (néa), meaning “new”, and πόλις (pólis), meaning “city”. The major city’s name for the past two thousand five hundred years has been the maximally uninspired “New City”. This phenomenon is hardly unique, as New City and its linguistic equivalents are among the most common city names in the world. It’s also an illustrative example of how place names seem to defy their original meaning. Just look at some of the English new cities and how strange it feels to read their names literally: Newcastle, Newton, Newport, New Haven.
Most cities originally named New City get renamed if they become important and well-known, so I enjoy marveling at the ones that never did. Но́вгород (Novgorod) is Russian for Newtown. Villanova is Italian for New Village. 𐤒𐤓𐤕-𐤇𐤃𐤔𐤕 (qrt-ḥdšt), romanized as Carthage, is Phoenician for New City. เชียงใหม่ (Chiang Mai) is Thai for New City.
I also learned that Kota Bharu is Malay for New City and Nevşehir is Turkish for New City; despite their significance and population, I had never encountered either name before.
If you enjoyed this exploration, you may also enjoy my much briefer history of pizza (1931).
On truth
My mother is a pathological liar. As you might imagine, this made growing up really confusing and weird sometimes. As an adult, I love her, but I don’t trust a single thing she says. This is absolutely exhausting to deal with. So I learned to rely on evidence that things actually happened the way she claimed. And I learned that when evidence is hard to come by, the next best approach is getting multiple outside perspectives to make it more likely I’ll notice any contradictions.
I hope I’m pretty well-adjusted by now, but I definitely didn’t escape unscathed. My sister and I both independently adopted a different maladaptive pattern in early adulthood. We refused to lie. We both knew firsthand how much it sucked to be lied to and were determined never to inflict that on anyone else. As you might also imagine, adding this constraint to our stereotypically dramatic teenage social lives made them that much harder.
Some advice I’ve seen repeated is that telling the truth is freeing and you should do it more. My current understanding is that this is good advice because most people tell the truth something like 98% of the time. If that’s the case, telling the truth 99% of the time probably will make your life significantly better. But I can personally report that going from telling the truth 100% of the time to telling the truth 99% of the time makes life so much easier that I can’t even begin to compare it.
All of this is to say that, at first through circumstance, then later through affinity and profession, I highly value truth-seeking. As with many deeply held personal values, it took me a while to learn that not everyone shares this value. It took me longer still to learn that sometimes there’s a lot of wisdom in not saying things that are nonetheless true and relevant to the current situation.
I’ve been an engineer and a consultant. One thing being effective in both fields has in common is that important decisions should be based on ground truth. And ground truth is not straightforward to get. On the engineering side, you can see shades of this in successful companies’ push toward data-driven decision making in the early 2010s. Each profession involved in a business has its own accumulated knowledge — this particular copy improves conversions, sales are about relationships, don’t write incriminating things in emails — all based on prior experience, and often in conflict. Part of the reason behind the conflicts is that what’s actually true changes over time. That is, while the laws of physics are the same regardless of whether Aristotle or Galileo is inferring them, the laws of society change over time. Slavery is bad. Kids should stay in school. Women should be able to vote and own property. Especially in the last 200 years, what people know and how people act have been a moving target, and accumulated wisdom can easily become invalidated without anyone noticing. The best way I know to counteract this tendency is to measure thoughtfully and carefully, then trust and doubt your measurements in equal measure.
On the consulting side, the problem is even worse, because people have a vested interest in misrepresenting the truth. If you’re a manager, you can walk the floor and see for yourself what customers and employees are doing and saying, though most managers lean on their reports’ testimony instead. If you’re a director, that’s a lot harder. It’s much easier to just ask your managers what’s true and mentally average it out. And if you’re an executive, it’s harder still. All you have to work with is multiple levels of direct reports selectively reporting just the parts of the truth that make them look good. One of the most effective things you can do as a management consultant is convince your clients that they’re not getting the full truth and it’s making their decisions worse, then suggest ways for them to get closer to it.
Galileo is legendary for recanting his heretical belief in heliocentrism under threat of the Inquisition, only revealing his true belief on his deathbed. Obviously the Earth doesn’t move; anyone can see that. His last words were reportedly “e pur si muove”, typically translated as “and yet it moves”. As with George Washington and the cherry tree, which supposedly happened only three hundred years ago, this was a legendary event. It’s probably not true. But it can nevertheless be a source of inspiration. Fiction is often better at this than fact. It certainly inspires me. When times seem tough and I can’t see a clear path forward, I remember: e pur si muove. And yet it moves. You can declare me a heretic and imprison me. And it may take a hundred years or more. But the truth will out.
A brief history of time units: Hours
One writing project I want to spend time on at Inkhaven is a guided journey through the origins of the time divisions of a day. I’ve had the section on hours written up for months but never posted it publicly. On day -2 of Inkhaven, I’m taking another small step toward being more fearless about writing in public.
If space aliens showed up and compared notes with us, there are some concepts we think we’d share regardless of where they came from. Iron would still be iron, prime numbers would still be prime numbers, and so on. But seconds, minutes, and hours would seem like a weird, arbitrary system: I thought your species had 10 fingers; what’s up with all the 12s?
So what is up with all the 12s, anyway? I decided to dig into the history of timekeeping to try to find out. We’ll go through the units from broadest and earliest to finest and most recent.
Hours
Photograph of an astronomical clock in Switzerland built perhaps around 1405. The clock has only an hour hand, with Roman numerals from 1 to 12 running clockwise twice. Within the clock face is a smaller dial with the 12 zodiac signs and clockwork machinery in stark red, blue, and gold.
In the beginning, there was day and there was night. Influenced by 12 major constellations visible on the night of the annual flooding of the Nile, ancient Egyptians began dividing the night into 12 equal parts called wnwt around 2350 BCE. By around 1500 BCE, they’d extended the analogy to also divide the daylight hours into 12 different equal parts, measured with sundials.
So influenced, the classical Greeks also began dividing the day and night into two sets of 12 equal parts by around 150 BCE. Before that, the classical Greek day had 12 parts, but the night was divided into either 3 or 4 watches. While the word hour descends from classical Greek ὥρα (hṓrā), that word meant just “span of time” back then. The first clock that tracked hours across the full day-night cycle (νυχθήμερον, “nŭkhthḗmeron”) was constructed by Greek astronomer Andronicus Cyrrhestes around 50 BCE.
Of course, 24 hours were not the only top-level day divisions people came up with. Every major civilization had its own, from the 30 Vedic मुहूर्त (muhūrtaṃ) used from around 800 BCE to the 15 時 (shí) that divided up daylight hours in Han China by 139 BCE. Su Song’s 1094 CE water clock and Ismail al-Jazari’s 1206 CE programmable mechanical castle clock were each the most accurate clock in the world at its time of construction. This particular history will focus on the ancestors and influences of the h:mm:ss system we use today.
Classical Greek and ancient Egyptian timekeeping both used unequal hours: one daylight hour wasn’t the same amount of time as one night hour. This is typical of timekeeping systems before the early modern era. In a world where lighting is smoky, expensive, and flickering, it’s more important to know how many hours you have until sunset than it is to keep the length of an hour the same through the seasons. This only began to change when mechanical timekeeping became widespread. By 1400 CE, European cities began switching to “French” equinoctial hours so that they would only have to adjust the town clocks once a day, to correct drift, instead of twice a day, to change the length of the hour from day-hours to night-hours and back.
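To make the unequal-hours arithmetic concrete, here’s a minimal sketch in Python (the sunrise and sunset times are invented for illustration): divide daylight into 12 parts and darkness into 12 parts, and the two kinds of hour only match in length at the equinoxes.

```python
from datetime import datetime, timedelta

def unequal_hours(sunrise: datetime, sunset: datetime) -> tuple[timedelta, timedelta]:
    """Split daylight and night into 12 parts each, as the Egyptian
    and Greek systems did. Returns (day-hour length, night-hour length)."""
    daylight = sunset - sunrise
    night = timedelta(hours=24) - daylight
    return daylight / 12, night / 12

# An illustrative winter day with only 8 hours of daylight:
day_hour, night_hour = unequal_hours(
    datetime(2024, 1, 1, 8, 0), datetime(2024, 1, 1, 16, 0)
)
print(day_hour)    # 0:40:00, a 40-minute daylight hour
print(night_hour)  # 1:20:00, an 80-minute night hour
```

A town clock keeping these hours would need its rate adjusted twice a day, at dawn and dusk, which is exactly the chore the switch to equinoctial hours eliminated.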
piano
The musical instrument piano was borrowed into English from French piano around 1790. The French word piano is a clipped form of French pianoforte, borrowed from the Italian short name of the instrument. The Italian word pianoforte is short for “un cimbalo di cipresso di piano e forte”, meaning “a keyboard of cypress with soft and loud”. It was invented and given its unwieldy name by Paduan instrument maker Bartolomeo Cristofori around 1700. The key innovation made by the piano over its harpsichord predecessors was that pianists could control how loud a note was by how hard they pressed the keys.
Piano design changed drastically during the instrument’s first 200 years of existence. Modern pianos, featuring 88 keys spanning seven octaves, typically differ only incrementally from their codified 1890 designs. Earlier pianos now go by the retronym fortepiano and had ranges that steadily increased from their initial four octaves to the contemporary seven. Any classical music that includes a piano was probably composed after its popularization in the 1740s. Piano wire has been so called since 1806.
The Italian word piano, meaning “soft” but also “flat” or “level”, descends from Latin plānus, meaning “level”. Plānus is also the ancestor of English plan and plain, both borrowings from the same French word (plain) in different centuries and circumstances. Looking at sibling languages, the English word that would have been used before it borrowed plain was probably flats.
The Italian word forte (superlative form fortissimo), meaning “loud” but also “strong”, descends from Latin fortis, meaning “strong” or “steadfast”. Fortis is also the ancestor of English fort and forte, both borrowings from the same French word (fort) in different centuries and circumstances. Looking at sibling languages, the English word that would have been used before it borrowed fort was probably fastness. Notably, the English word forte was originally two different words: one borrowed from French fort, meaning “strength” and originally pronounced as a single syllable, and the other borrowed from Italian forte, meaning “loud” and originally pronounced as two syllables.
Planned disruption of service
I’m delighted to be invited to Inkhaven, a writer’s retreat for bloggers. I’ll be writing onsite in Berkeley, CA for the month of April. They take an iron blogger approach to improving as a writer, so I’ve made a new blog category for Inkhaven posts. As such, I will either post daily or be the first person ever kicked out of the retreat for not posting daily.
To borrow a phrase from the Recurse Center, my goal for the month is to become a dramatically better writer. I know I will necessarily struggle and flail on the way, and I hope you find the end result to be worth the temporary disruption.