Misaligned, uncontained slop
Rationalists often speculate about what might happen if a misaligned AI we created escapes containment. I think this has already happened. I claim the word “slop” is a misaligned AI we created that has escaped containment.
Richard Dawkins coined the term meme in 1976 to describe the unit of cultural evolution, in the same way genes are the unit of biological evolution. Every word is a raw unit of meaning people created to more efficiently lob compact meaning bombs at each other. People transmit, reproduce, and mutate them. Words are memes. Each word competes with other words for fitness and “wants” to perpetuate itself, in the same way we think of a gene as “wanting” to perpetuate itself despite its lack of agency.
So anthropomorphized, you can see that the word “slop” solves a variety of general problems. Its broad applicability contributes to its spread, and therefore fitness. And it learns from its environment and past actions — the way people use and spread the word “slop” depends on their cultural context and history. You wouldn’t casually sling the word “slop” around in a formal policy recommendation, or on a blog post you wrote three years ago. You might deploy it more readily today, now that both Merriam-Webster and the American Dialect Society have crowned it their prestigious Word of the Year for 2025.
If we think of the word “slop” as an AI, it’s clear that it is both misaligned and has escaped containment. The word “slop” was coined to describe unwanted, low-quality LLM output, a parallel to the word “spam” describing unwanted, low-quality emails. Widespread compounds like friendslop and slop bowl illustrate that the word’s meaning has drifted far indeed from its initial intention.
Photo of a Chipotle burrito bowl. The single-use, elliptical bowl contains chicken al pastor, corn, cheese, sour cream, lettuce, black beans, and cilantro lime white rice.
Admittedly, categorizing the word “slop” as AI implies we should categorize all words as AIs, which doesn’t feel right. Let’s try a weaker version instead and see if that lands better. I claim the United States of America is a misaligned AGI we created that has escaped containment.
Yes, AGI. Even this devil’s advocate can’t find a framing that attributes human-level intelligence to the word “slop”. On the other hand, the US is at least as intelligent as a human, as well as being capable of nearly all common intellectual tasks.
In 2017, Charles Stross gave a keynote at the Chaos Communication Congress describing corporations as really old, really slow AIs. So we can start predicting what misaligned AGIs might do by looking at what misaligned corporations have already done. The argument goes, corporations are artificial: people created them. Corporations are the result of a series of inventions over time that gradually increased their capabilities. Corporations are intelligent: they are self-aware, learn from their actions and environment, and develop complex social relationships with other corporations. And corporations are agentic. Microsoft doesn’t do what Bill Gates wants. Or Steve Ballmer, or Satya Nadella, or any specific person. No person can direct all of Microsoft’s attention. Microsoft is composed of departments and people, just as people are composed of organs and cells. But Microsoft is something more than its constituent people, just as you are something more than your constituent cells. Microsoft’s will affects the world every day, in more ways than any one person can comprehend.
What Microsoft does not have is an army and a navy. The United States has those things. Like Microsoft, it is artificial, intelligent, and agentic. We the people created the US. Its existence depends on concepts that did not exist just 400 years ago. The US is aware of its own existence, learns from its actions, and has complex foreign relations with other nations. It has an immune system that isolates and kills cancerous people before they grow and hurt their surrounding people. Its will is different from and greater than any one person’s. Not even Donald Trump can direct all of the US’s attention. The US takes actions guided by a defined set of amendable principles, and it may later regret and apologize for its past actions.
It’s straightforward to show that the US is misaligned and has escaped containment. Even early in its history, the US displayed a strong desire to grow unchecked and to make more of itself. US ideals and culture outcompeted many others, spreading all over the world. Nearly half the world’s population now lives under a democracy. We can also point to actions the US takes that seem to violate its guiding principles. A cop in isolation can’t incarcerate a dissident; the US does. A sniper in isolation can’t assassinate Ali Khamenei; the US did.
Yet despite everything, I am still proud to be an American. Today I turn my own attention to shifting the US’s attention one tiny bit. I do this through the time-honored American tradition of indiscriminately lobbing compact meaning bombs everywhere.
Misaligned, uncontained AGIs are all around us. This has been true for over a hundred years. But humans are still alive and well. What are some of the things we did and still do to make this outcome, and good outcomes in general, more likely?
Effective pre-journaling
I went on a meditation retreat with Jhourney last year that I can fairly describe as life-changing. They take a secular, evidence-based approach to meditation that some practitioners will find distasteful and others will see as a perfect fit. In particular, they use a lot of tools for thought that I recognize from consulting. But one tool I found especially useful doesn’t seem to originate from either consulting or psychology. I wonder if it’s something the team came up with themselves.
They call it the PLAN framework for journaling. I don’t have the text handy, so I’m not going to get it exactly right. But maybe my unintentional changes will make it more memetically transmissible.
If you have some time before doing something important, write some notes to organize your thoughts first. Use the prompts:
Purpose. What is my goal? What am I trying to accomplish and why? Are there different, easier ways to get what I actually want?
Learned. What do I know that’s related to the goal? What happened the last time I tried to do something like this? What happened when someone else tried something like this?
Action. What specific actions do I plan to take to achieve my goal? Write out the details and see if anything stands out.
Needs. What thing or piece of knowledge, if I had it, would make doing these things dramatically easier? Can I figure out how to get it?
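The four prompts fit on an index card, or in a throwaway script. Here’s a minimal Python sketch of a blank PLAN entry; the prompt wording is my paraphrase, not Jhourney’s exact text:

```python
# PLAN journaling template. The prompt wording below is my paraphrase,
# not Jhourney's official text.
PLAN_PROMPTS = {
    "Purpose": "What is my goal? Is there an easier way to get what I actually want?",
    "Learned": "What do I already know? What happened last time I (or anyone) tried this?",
    "Action": "What specific steps do I plan to take? Does anything stand out?",
    "Needs": "What thing or knowledge would make this dramatically easier? How do I get it?",
}

def plan_template() -> str:
    """Return a blank PLAN journal entry, one section per prompt."""
    return "\n\n".join(f"## {name}\n{prompt}\n\n- " for name, prompt in PLAN_PROMPTS.items())

print(plan_template())
```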
Scan of an 1860 diagram of the parts of the brain, overlaid on a man in profile. Some parts have labels like Manners, Patriotism, Industry, and Aversion.
Moving from abstract to concrete and back helps with learning, so here’s an example:
Purpose. I want to figure out what to write for tomorrow’s post. There’s a lot of ideas in my list. I want to pick one so that I can get some thinking and writing in ahead of time.
Learned. History posts seem very compelling to me while I’m writing daily. Memoir posts seem to get a lot of traction, but the required vulnerability leaves me drained. I enjoy review posts, but feel a lot of pressure to get the details right. List posts are quick and easy on an off day. I should set an intention ahead of time if I want to try a kind of post that scares me.
Action. Go through the ideas list from most to least recent. My best ideas so far seem to happen when I can write a single post that hits two or more of them. It’s a fun puzzle to solve that sometimes hits me with related inspiration. Matching a content idea to a form restriction works well, but finding a throughline between two content ideas is even better.
Needs. What’s my schedule today and tomorrow? Do I have any plans that align with a particular topic or theme? Is there someone I can chat with who’d really inspire or inform my work on a subject? Who do I want to spend time chatting with that I haven’t had the opportunity to yet?
In my experience, writing these out when I’m faced with a thorny problem has been well worth the time spent. Each of the four prompts has led me to important realizations. In particular, the “dramatically easier” framing in Needs often helps me notice that I should do or ask something else first. That something else might mean I never take the planned actions at all, yet still get what I want. Unfortunately, skipping directly to just that prompt without going through the other ones doesn’t seem to work as well.
One hint that makes me suspect the Jhourney team created this themselves: the mnemonic absolutely does not work for me. I can recall it now through repetition. But the first few times, I had to look up the words every time despite knowing their first letters. Needs, really? Still, I can’t deny the effectiveness of the questions. If you can come up with a better mnemonic, I’d love to hear it.
Infinite Lives, 2000: Diablo II
Gradually build up a character from plinking away one shot at a time, to filling the screen with projectiles while blinking all over the map. Diablo II is all about experiencing an incredible 100-hour power fantasy character arc. Like with Lord of the Rings, playing it now can feel derivative rather than prescient, just because the series established so many of the conventions for video games. Red health and blue mana? Item sets? Teleporting to waypoints? Color-coded item rarities? Gems and socketable items? Item affixes? Skill trees??
This is the second in a series of posts examining video game history through looking at one game I loved from each year, 1978–2027.
One secret to Diablo II’s success was its obsession with procedural generation. Instead of static levels you could memorize, most Diablo II levels were randomly generated. While generated levels were a staple of the roguelike genre Diablo II descended from, they weren’t common in mainstream games. Diablo II’s world generation was notable in part for using many different generators based on your biome. Wandering the randomly-generated desert would feel very different from navigating abandoned sewers. While its predecessor Diablo (1996) was also procedurally generated, it contained a single 16-level dungeon, a pale shadow of Diablo II’s expansive generated world.
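I have no idea how Blizzard’s actual generator worked, but the biome-keyed idea is easy to sketch. All the names and parameters below are made up for illustration:

```python
import random

# Hypothetical sketch of biome-keyed level generation. Dispatching to a
# different generator per biome is what makes each area feel distinct.
# None of this reflects Blizzard's actual implementation.
def desert_gen(rng: random.Random) -> dict:
    # Open space: few rooms, no connecting corridors.
    return {"biome": "desert", "rooms": rng.randint(2, 4), "corridors": False}

def sewer_gen(rng: random.Random) -> dict:
    # Cramped maze: many small rooms joined by corridors.
    return {"biome": "sewer", "rooms": rng.randint(8, 14), "corridors": True}

GENERATORS = {"desert": desert_gen, "sewer": sewer_gen}

def generate_level(biome: str, seed: int) -> dict:
    """Generate a level layout deterministically from a biome and a seed."""
    return GENERATORS[biome](random.Random(seed))

level = generate_level("sewer", seed=42)
```

Seeding each level makes generation reproducible, which is also how games let players share interesting maps.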
Diablo II inherited its rich, underexplored mechanics from Diablo, of course. Diablo was an attempt to take the roguelike Moria (1983) and add graphical, real-time action combat. Considered the first influential action RPG (ARPG), Diablo was a solid success. Its strategy of “take a niche genre that turbo-nerds love and make it pretty and legible to regular nerds” created an excellent foundation for Diablo II to expand on. Diablo II would then go on to inspire so many ARPGs that the genre is still sometimes called “diablolike” to this day.
Diablo-style item colors — blue for magic, yellow for rare, green for set, and gold for unique — would become the default for video games until its sibling World of Warcraft (2004) popularized its even more influential standard. ARPGs were such a compelling fusion that they still inspire entire new genres. Borderlands (2009) applied the formula to first-person shooters, creating looter-shooters. Vampire Survivors (2022) compacted the 100-hour character arc to 30 minutes, creating bullet heavens (or single-stick shooters, or pick-3s; the genre name hasn’t cleared its orbit yet).
Screenshot of Diablo 2 in pixelated 640x480 resolution. A sorceress in trademark green casts Blizzard, spraying snow all over the dungeon and turning a nearby sword demon blue.
The Arreat Summit was an official Diablo II website where you could look up game mechanics in unprecedented detail. Official game websites in 2000 would have a screenshot or two, some marketing copy, and maybe a link to a fansite where the real information was, if you were lucky. Allocating a website maintainer to publicly document the game’s inner workings was another thing Diablo II did far ahead of its time. So was battle.net, the free netplay service included with the game.
I did not play Diablo II in 2000. Its system requirements included a ridiculous 32MiB of RAM. When they ran an open-to-everyone load test, I gave it a shot with my 16MiB of RAM and was treated to a slideshow. I could only look on in envy, sneaking in a few hours on a friend’s gaming rig in 2001. By 2002, I had learned how to physically jam some RAM I ordered straight from the manufacturer into my computer’s innards, and I was finally able to enjoy it.
It would be negligent to discuss 2000 in video games without at least mentioning The Sims. The Sims was by far the most well-known game released that year, especially popular among people who didn’t play video games. Starting with SimCity (1989), Maxis set out to make games out of simulating everything from a single building in SimTower (1994) to the entire planet in SimEarth (1990). They’d known for years that simulated people would be the most difficult and rewarding game to get right. They managed to knock it out of the park, with innovations like the needs bars and mood framework that changed the way people thought about simulating daily life. They even managed to include same-sex relationships, despite an unfriendly media climate.
Solar Revolution
Nearly 250 years on, the French Revolution’s legacy continues to underpin American culture. Liberty, freedom, and brotherhood! No kings! Actually, let’s kill all of the scientists! Men are born and remain free and equal in rights! The Statue of Liberty! The Louisiana Purchase! Optical telegraphs? Decimal currency: centimes and cents! Decimal units: meters and kilograms! Decimal time??
And while we’re remaking everything in the image of Reason, let’s throw out this ridiculous Gregorian calendar. Seriously, the 9th, 10th, 11th, and 12th months have been named 7 (septem), 8 (octō), 9 (novem), and 10 (decem) for thousands of years and no one has tried to fix it?! We can do better.
And so they did. The French Republican calendar features 12 months of 30 days each, plus five national holidays belonging to no month, bringing the day count up to 365 (a sixth holiday was added in leap years). Each month is divided into three décades of 10 days, with 9 days for work and 1 day for rest. (A reminder that the 2-day weekend was a more recent invention.) And the most exciting part: they got to name all the months, days of the décade, and days of the year from scratch.
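The arithmetic is tidy enough to check in a few lines; this is just a sanity check on the structure described above, nothing more:

```python
# Structure of the French Republican calendar, as described above.
months, days_per_month = 12, 30
decades_per_month, days_per_decade = 3, 10
complementary_days = 5  # the five national holidays belonging to no month

# Each month is exactly three décades of ten days.
assert decades_per_month * days_per_decade == days_per_month

# Twelve months plus the holidays fill out an ordinary 365-day year.
assert months * days_per_month + complementary_days == 365
```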
Instead of venerating a tired Catholic saint, every day of the year would be named for a rural animal, vegetable, or mineral. The new month names would be neologisms themed on the weather in Paris, with rhyming names indicating each season. Beginning with late September-early October, the first month of autumn, they are: Vendémiaire, Brumaire, and Frimaire; Nivôse, Pluviôse, and Ventôse; Germinal, Floréal, and Prairial; and Messidor, Thermidor, and Fructidor. Rough literal English translations would be Vintagearious, Fogarious, and Frostarious; Snowous, Rainous, and Windous; Buddal, Floweral, and Meadowal; and Reapidor, Heatidor, and Fruitidor. An Englishman writing in 1800 instead offered the sublimely ridiculous: Wheezy, Sneezy, and Freezy; Slippy, Drippy, and Nippy; Showery, Flowery, and Bowery; Hoppy, Croppy, and Poppy.
Scan of a French Republican calendar from around 1801, titled “Liberte”. Prominently visible are the current month and year; the day of the week associated with each date; and the equivalent Gregorian month, date, and year.
As it turns out, Napoleon abolished the Republican calendar a few years after he declared himself emperor. In the year 14 (1805), France returned to the loving embrace of the Gregorian calendar, save for a brief fling in 79 (1871) under the Paris Commune’s revolutionary government. Today, the calendar is mostly a historical footnote. Its most notable legacy is perhaps the name of the dish Lobster Thermidor, named for a play that was in turn named for the month of July-August.
Emperor Napoleon did not abolish people noticing the Gregorian calendar was stupid and designing replacements for it, which kept happening. What made the French Republican calendar unique was its official adoption as a national calendar. For example, the 1849 Positivist Calendar features months named Moses, Homer, Aristotle, Archimedes, Caesar, St. Paul, Charlemagne, Dante, Gutenberg, Shakespeare, Descartes, Frederick, and, uh, Bichat. Most notable calendar reform movements since then have, boringly but practically, stuck with the Julian month names. The postmodern 1963 Discordian Calendar specifies five seasons named Chaos, Discord, Confusion, Bureaucracy, and The Aftermath.
For more contemporary proposed month names, we have to look farther afield. Inspired by the Martian calendar used in Robert Heinlein’s 1949 science fiction novel Red Planet, aerospace engineer Thomas Gangale devised the Darian Calendar in 1985. A Martian year is 668 sols (or 686 Earth days) long, so the default (and difficult to ridicule) solution of using twelve existing names won’t work. Gangale’s proposal, named for his son Darius, opts for 24 months of 28 sols each, named for the twelve Zodiac constellations and their Sanskrit translations. If you’re familiar with Homestuck, you may notice something familiar about the Sanskrit months of Dhanus, Makara, Kumbha, Mina, Mesha, Rishabha, Mithuna, Karka, Simha, Kanya, Tula, and Vrishika. The Darian sols of the week, like the Julian and Gregorian day names, are based on the classical planets: Solis, Lunae, Martis, Mercurii, Jovis, Veneris, and Saturni.
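The sol-to-day conversion above is easy to verify. A Martian sol is about 24 hours, 39 minutes, and 35 seconds; that figure is from memory rather than from Gangale’s paper, so treat this as a rough check:

```python
# One Martian sol, approximately 24h 39m 35s (a commonly cited figure,
# quoted from memory; not taken from Gangale's proposal).
SOL_SECONDS = 24 * 3600 + 39 * 60 + 35   # = 88,775 seconds
EARTH_DAY_SECONDS = 86_400

martian_year_sols = 668
martian_year_earth_days = martian_year_sols * SOL_SECONDS / EARTH_DAY_SECONDS
# Comes out to roughly 686 Earth days, matching the figure above.
assert 686 <= martian_year_earth_days < 687

# 24 months of 28 sols would give 672 sols, four more than the Martian year,
# so a handful of months in the actual proposal must run short.
assert 24 * 28 - martian_year_sols == 4
```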
Martian month names have been an inspiring creative outlet for many dreamers since 1985. It’s an awfully seductive question. What are twenty-four related names inspired by the future that you envision?
A brief history of time units: Seconds
This is part three of a multi-part series on the origins of the time divisions of a day.
[Part 1: Hours] [Part 2: Minutes]
Seconds
In 1707, four British warships miscalculated their position and ran aground off the Isles of Scilly, killing 2,000 sailors in what is still one of the worst maritime disasters in British history. For lack of a better approach, the way ships calculated their longitude at that time was dead reckoning, basically guessing how far they’d gone from the last known port. The disaster led to the Longitude Act of 1714, establishing a £20,000 (over £4M today) bounty for the deployment of a reliable way to determine a ship’s longitude within 30 minutes of arc (half a degree).
This is relevant to our history because one theoretical approach to determining longitude was to observe the sun to find local solar time, then compare it to the current time at a known reference location, such as the Greenwich Royal Observatory. Since the Earth rotates 15 degrees per hour, knowing both times lets you calculate your exact longitude from that single observation. The problem with this approach was, no one knew any way to keep the exact reference time at sea. Our marvelous pendulum clocks wouldn’t work on the open ocean, because of the motion of the waves.
The problem lit a fire in the imagination of English clockmaker John Harrison, who presented his first marine chronometer design to the Royal Society in 1730. A lifetime of iteration and experimentation resulted in his fourth attempt finally proving itself in a 1761 sea trial (the Board of Longitude would haggle over the prize money for years afterward). In that maiden test, his H4 chronometer drifted just 5 seconds over the course of an 81-day voyage, enabling a longitude measurement accurate within 1.25 minutes of arc.
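The numbers are easy to verify: the Earth turns 360° in 24 hours, so one second of clock error costs a quarter of an arcminute of longitude. A back-of-the-envelope check, not navigation-grade math:

```python
# The Earth rotates 15 degrees of longitude per hour of solar time.
DEG_PER_HOUR = 360 / 24

# Degrees per second of clock error, converted to arcminutes: 0.25 arcmin/s.
ARCMIN_PER_SECOND = DEG_PER_HOUR * 60 / 3600

def longitude_from_noon(greenwich_time_at_local_noon_hours: float) -> float:
    """Degrees west of Greenwich, given the Greenwich time (in hours)
    shown on the chronometer at the moment of local solar noon."""
    return (greenwich_time_at_local_noon_hours - 12.0) * DEG_PER_HOUR

# If local noon arrives when the chronometer reads 14:00 Greenwich time,
# you are two hours behind Greenwich: 30 degrees west.
assert longitude_from_noon(14.0) == 30.0

# H4's 5-second drift translates to 5 * 0.25 = 1.25 arcminutes of longitude
# error, matching the figure above.
assert 5 * ARCMIN_PER_SECOND == 1.25
```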
As alluded to earlier, the word second comes from classical Latin secunda, in particular its presence in the Latin phrase pars minūta secunda, meaning “second very small part”. We could have called minutes “primes” or “firsts” instead to maintain internal consistency, but that is not what we did.
Scan of a diagram laying out the inner workings of the H4 chronometer. Several figures illustrating different cross-sections of the clock and its mechanisms are labeled with letters.
The delicacy and expense involved in the construction of a device so precise made this solution barely affordable for even the richest country in the world. One chronometer cost one-third the price of an entire ship. Ordinary people wouldn’t have access to clocks with second accuracy until interchangeable parts and manufacturing methods made them affordable in the mid-1800s.
The first implementation of the metric system, adopted by French revolutionaries in 1799, only covered base units of distance and mass (meters and kilograms), not time. Carl Gauss proposed using the second, to be officially defined as 1/86,400 of a day, as the standard metric unit of time in his millimeter/milligram/second measurement system in 1832. Over the next thirty years, his proposal found wide approval among scientists. The second was formally adopted as an official metric unit under the centimeter/gram/second measurement system, proposed in 1874 by a committee that included James Clerk Maxwell, Lord Kelvin, and James Joule.
Miniaturization followed mass production, resulting in the creation of pocket watches. Widely available second accuracy enabled new levels of coordination unviable just decades earlier. Military watches were first issued to officers in the 1880s. At the beginning of a battle, everyone would synchronize their watches to keep their maneuvers coordinated. In particular, it was important to agree on the exact second to expect an artillery strike. The phrase “synchronize your watches” is an artifact of the era of the second. You might not know the absolute time, but you could act like you did if you kept everyone’s relative times the same.
If you did in fact need to know the absolute time, observatories began broadcasting official time signals by telegraph in the 1850s. This wasn’t a service the average person would have access to. That began to change with the 1924 appearance of radio time stations. Time hotlines that you could telephone for an accurate time signal started operating in 1933, gradually displacing the radio time stations.
Wristwatches, then called “bracelet watches”, were originally a women’s fashion accessory. They only lost their genderedness after military wristwatches entered common use around the world in the 1880s and 1890s. Newly verified as manly, men’s wristwatches became a standard fashion accessory in the 1900s. They remain the most popular men’s jewelry item to this day.