katsu

The culinary term katsu first appears in English, describing a style of curry, in 1976. Its standalone usage to mean just the cutlet begins much later, in 2006. It’s borrowed from Japanese カツ (katsu), meaning “cutlet”. カツ first appears in Japanese in 1928 as a clipping of カツレツ (katsuretsu), also meaning “cutlet”. カツレツ, in turn, first appears in Japanese in 1884, borrowed from English cutlet. Wait, what?!

The international expedition 岩倉使節団 (Iwakura Mission) was an 1871-1873 Japanese diplomatic voyage to Europe and the United States. Its primary goal, renegotiation of the unequal treaties forced on Japan at gunpoint, was not successful. Its secondary goal, learning more about industrialized societies to better enact Japan’s modernization, was wildly successful. One of those secondary results was the popularization of 洋食 (yōshoku), meaning “Western food”. トンカツ (tonkatsu), fried breaded pork cutlets, is regarded as one of the 三大洋食 (sandai yōshoku), meaning “three great Western foods”, dating to this period. The other two are curry rice and croquettes.

In 1852, US President Millard Fillmore assigned Navy Commodore Matthew Perry a mission to open trade with Japan by any means necessary. Since 1633, Japan had operated under a policy of 鎖国 (sakoku), meaning “locked country”, closing its borders to expel foreign influence. In 1853, Perry sailed a squadron of four black-hulled warships, two of them steam-powered, to the capital of Edo, firing blanks as a show of intimidation that Japan’s coastal defenses had no answer to. The Black Ships incident lives on in Japanese cultural memory as the paradigmatic out-of-distribution event: four invulnerable alien vessels arrive on the horizon, bearing unreasonable demands.

Katsu also has a Japanese homophone of much earlier origin, 喝. This sort of katsu is either a loud exclamation or a Zen Buddhist interjection for scolding. It’s akin to the 気合 (kiai) of martial arts traditions, shouted while performing an attack. 喝 is a borrowing from Chinese 喝 (hè, but pronounced xat in Middle Chinese), meaning “to shout”.

Photo of a katsu curry served at the Japanese equivalent of a diner. A katsu cutlet, brown curry sauce, white rice, and bright red fukujinzuke are the prominent elements.

Anyway! The culinary term cutlet first appears in English, describing a cut of meat, in 1706. It’s borrowed from French côtelette, meaning “cutlet”. Côtelette, earlier rendered in Middle French as costelette, first appears in French in 1393 as a diminutive form of Old French coste, meaning “rib” or “side”. Coste is a direct descendant of Classical Latin costa, meaning “rib”, which is of unknown origin. Notably, costa has no historical connection to the word “cut” or its meaning, with contemporary origin theories including a word that means “bone”.

Folk etymology led to the English spelling cutlet, as the relationship with a cut of meat seems obvious. (A cut of meat is named for being a piece cut from a larger whole, and starts appearing with that meaning in 1591.) Thanks to French, -let is conveniently also an English diminutive, so a cutlet could be presumed to be named after a little cut of meat.

In 1706, cutlet referred specifically to a cut of veal. The term gradually expanded to encompass pork in the 1800s and chicken in the 1900s. The modern breaded and fried preparation is likely based on the Austrian dish Wiener Schnitzel, with additional influences from Italian and French dishes. Wiener is German for “Viennese”, its city of origin. Schnitzel is German for “cutlet”, with the perfectly parallel construction of Schnitz “a cut-off piece” + -el diminutive. While we also sometimes call hot dogs wieners (or later, weenies), that is from a clipping of Viennese sausage.

Left behind

I am not immune to propaganda. But I do seem to be more resistant to it than average. Recently, when (not if) the conversation turns to effective use of LLMs, I have been met by genuine surprise that I don’t use them yet. Aren’t I a technologist? Shouldn’t I be worried about getting left behind as a member of the perpetual underclass, or worse, an older adult who doesn’t understand phones?

I am not. This steadfast belief comes from the reason that I became a technologist in the first place. That’s right, it’s my dad.

My dad loved technology. He bought a portable computer in 1986. You would injure yourself if you tried to put that thing on your lap. My childhood memories include a background specter of my dad filming everything with his shoulder-mounted video recorder. The closest visual analogy I can draw on today would be an equally ancient boombox. In 2017, he insisted on buying a shiny new iMac despite my pleas to stick with a secure-by-design, normie-friendly Chromebook or iPad.

My dad was really bad with technology. That’s unfair, actually. He was a great mechanic and woodworker, doing all of our auto maintenance himself except for the really gnarly jobs. He taught me how our lawnmower worked, inside and out. What he was really bad with was computers. Something about the difference between mechanical devices that had one job composed of single-function parts in service of that job, and electronic devices that could do anything you asked them to, made the latter completely incomprehensible to him.

Well, somebody had to make our perpetually growing collection of computers behave. That task naturally fell to me as I got older and more capable. I grew up with an intuitive understanding of how to computer, matched only by an ironic helplessness around mechanical devices. You can just imagine my dad’s frustration at my inability to understand a simple oil change. This computer whispering wasn’t a chance outcome. My parents, immigrant factory workers both, were convinced that computers would be the future and steered me toward them. Before I was born, they committed to this plan to give their children a chance at a better life than their own.

Eliding many years of hard work and lucky breaks, that’s the story of how I came to be this odd duck of a technologist. I’m surrounded by early adopters and innovators who make my early majority inclinations feel like Luddism at times. But my upbringing imbued me with a deep-seated instinct to weigh the pros and cons carefully and wait until a shiny new thing matures before buying in. It also gives me a useful perspective in strategy meetings. I wouldn’t even call it conservative. Just having a moderate voice in the room advocating for simplicity and proven, reliable solutions over what’s hot right now adds a lot of value.

And it seems to be working out fine? I didn’t get a cell phone until 2004. I got a music streaming account in 2021. I still don’t have a tablet or smart TV. And despite my constant anxieties about what people think of me, I don’t think anyone thinks of me as an out-of-touch old fogey. I get bombarded with the same fears about falling behind as everyone else, but I seem to have developed a natural resistance to them. Thanks, Dad. I miss you dearly.

Infinite Lives, 2010: Super Meat Boy

An exemplar of the precision platformer genre it popularized, Super Meat Boy simultaneously hates you and wants you to suffer, and believes in you and wants you to triumph. This shines through in the replay the game shows you at the end of each tiny level. The replay overlays every single attempt you made at the level. It might start with a hundred overlapping meat boys, then viscerally illustrate their winnowing at each deadly obstacle in turn, until only one remains triumphant at the end. It perfectly captures the rush of “I can’t believe how ridiculous that level was!” and “Holy crap I am so awesome!” at the same time. The game turns the single-minded struggle that you have just experienced into art.

This is the first in a series of posts examining video game history through looking at one game I loved from each year, 1978–2027. I’ll add a navigation box here when there is more than one post in the series.

Super Meat Boy was the first game from two-person indie studio Team Meat, although it was a sequel to designer Edmund McMillen’s free 2008 Flash game Meat Boy. McMillen, then 30, had made dozens of free Flash games since 2001. He would later go on to design The Binding of Isaac (2011) and Mewgenics (2026). The other founder of Team Meat, programmer Tommy Refenes, went on to make other games under the Super Meat Boy IP instead. The game was composer Danny Baranowsky’s breakout hit. Devil n’ Bass, which plays when you claw your way out of literal Hell only to arrive in even worse double secret hell, is still in my regular playlist rotation.

Screenshot of a single-screen level from Super Meat Boy. It is set in Hell and features a river of lava and too many moving sawblades.

Part of what makes Super Meat Boy so compelling is its approach to failure. On death, you immediately reappear at the beginning of the level and can try again right away. McMillen’s signature horrifying nightmare world combines smoothly with the idea that failure isn’t a big deal. Failure being so common serves to highlight just how unfair and brutal the world is. It would be another eight years before Celeste (2018) took that novel approach to its logical conclusion.

Another part of the appeal is the game’s focus on speed. Most (but crucially not all!) obstacles are timed so that an all-out sprint can barely clear them. You also get a prominent speed ranking at the end of each level. Even harder versions of the levels are gated behind finishing most of them with a certified Grade A. There are always more tantalizing challenges on offer if you want them. After many of my initial hard-won victories, I found myself immediately restarting the level to do it again, but better. Faster. Cleaner.

Super Meat Boy was originally released on Xbox Live Arcade, an apt reminder that 2010 was still early in the indie game renaissance. Its acronym, SMB, was selected as a homage to Super Mario Brothers (1985), the game that first defined the platformer. Super Meat Boy wears its classic video game influences on its sleeve, including warp zones that lead to levels demade for fantasy retro consoles and glitch art that evokes NES and Game Boy memory corruption. The game is also in conversation with its fellow indie precision platformers like VVVVVV (2010) and I Wanna Be The Guy (2007), even featuring guest characters from them.

2010 also saw the release of early mobile hits Angry Birds, Cut the Rope, and Fruit Ninja. The app store pricing race to the bottom was well underway. All three games were initially sold as standalone products with no in-app purchases for $1, which they did in fact make up for in volume.

A 100-year-old website

Fifteen years ago, a conference talk planted an idea in my brain that I haven’t been able to uproot since. What things would we do differently if one of the success criteria of a web project was that it should be readable 100 years from now? It’s strange that this question feels as radical as it does. We can easily read 100-year-old books, watch 100-year-old movies, and listen to 100-year-old songs. Isn’t it interesting that the idea of people browsing 100-year-old websites in 2126 seems so wild in comparison?

The web is already designed to be radically backwards compatible. You can visit a website that was last edited in the 1900s and interact with it basically the same way you would have back then. Even wilder, you can start up a computer that was built in the 1900s, insert a Netscape Navigator disk image of appropriate vintage, point it at a contemporary website, and watch it mostly kind of work. Countless engineer-hours have been dedicated to maintaining these unsung miracles. So what’s the problem? Can’t we just trust our collectively galaxy-brained engineers and archivists to find a way to preserve as much of this moment in amber as we can, so that future generations of historians can have all the background context they need to debate the literary merits of Homestuck?

When put in those terms, my problem sounds like more of a personal problem. I bet the likes of Substack and YouTube will be fully archived, indexed, and preserved in their original formats for historians willing to brave the brainrot. If I only cared about being legible to gen-delta historians, I’d just stick to the major platforms and stop expending mental energy on this ridiculous problem.

But I also bet we can do better. I’ve spent an entire career understanding how computers work from resistors to CSS. Shouldn’t I be able to use this hard-won knowledge to figure out how to construct something all my own that might stand the test of time? Furthermore, shouldn’t I share the results of my quixotic quest so that people have a chance to collectively improve on my half-baked ideas?

So here they are. By far my biggest self-imposed restriction is: all of my artistic work to date is entirely client-side. That means that there is no database, and therefore no persistent storage across computers. This sounds like a huge limitation, and it is! But you can do a lot more than you might expect without an internet database. For example, you can do everything that a computer could do in 1995, but seamlessly accessible from any device anywhere, as long as it has a web browser. You can use local storage to let people save their place and preferences, despite the lack of a central server.
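A minimal sketch of that local-storage idea, in plain JavaScript. The key name and helper functions here are my own invention for illustration, not from any framework; the storage argument is anything with the Web Storage shape, so in a browser you would pass window.localStorage.

```javascript
// Hypothetical helpers for saving a reader's place entirely client-side.
const PLACE_KEY = 'reader-place';

function savePlace(storage, place) {
  // Web Storage only holds strings, so serialize to JSON.
  storage.setItem(PLACE_KEY, JSON.stringify(place));
}

function loadPlace(storage) {
  const raw = storage.getItem(PLACE_KEY);
  // Fall back to the beginning for first-time visitors.
  return raw ? JSON.parse(raw) : { chapter: 0, scrollY: 0 };
}

// In a browser: savePlace(localStorage, { chapter: 3, scrollY: 1200 });
```

This survives page reloads and even browser restarts, but only on that one device. That’s the trade: no server to sync through, no server to maintain.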

If your work is entirely client-side, that then means you can distribute it as bare files. You don’t have to bundle an executable, restart a server when it hangs, or set up a server to page you when it does. You can just spin up an FTP server, or a modern equivalent with a nicer UI — I currently use GitHub Pages — and put the files there, then point a URL at that bare directory. For local testing, you can spin up a minimal web server pointed at the directory with python3 -m http.server and test on that.

I believe the tech stack most likely to work in 100 years is the smallest possible tech stack. For simple projects, I tend to stick with vanilla JavaScript or TypeScript, which are far nicer to work in than they were just five years ago. Here’s an example. For more complex projects, I like the balance of simplicity, convenience, and portability that a Svelte app compiled down to a static site offers. Here’s an example of that.

One outcome of this process I can proudly point to now is this art project. It has now been publicly available and functional for eight years. In that entire time, I have spent zero (0) hours of maintenance on it. I’ve spent enough time around tech art spaces and their graveyards of dead project links to appreciate just how rare this actually is.

I do have to admit that, even with all of that accounted for, 100 years is probably not realistic. Hosting providers will go out of business, domain renewals cost money, and there’s a new norm around automatically deactivating your accounts when you die. So I mentally target 50 years. Far enough in the future that I expect to be dead, but near enough that my infrastructure providers might not have died too. And I willingly choose to relinquish my agency, and trust in the archivists to handle the next 50 years after that.

To whom it may concern in 2076, I hope this email blog post finds you well…

Why are we so bad at reasoning about randomness?

I claim that patternless randomness, the kind generated by coins, dice, cards, and cryptographically secure pseudorandom number generators, has no precedent in the natural world. In this essay I will

Whoops, I didn’t mean to start a new paragraph there, that’s weird. In this essay I will argue that the very mechanisms that made humans unreasonably effective at shaping the world to our liking also make it really counterintuitive to work with the kind of patternless randomness we often encounter today. You can clearly develop intuitions around randomness, but it’s a skill we have to deliberately practice; without that practice, people will confidently make really bad predictions.

My core argument is that in nature, for every random-seeming event, there are always actually patterns behind it that you can identify and exploit. Were you just attacked by a bear? That’s a random freak occurrence! But over generations of people getting attacked by bears, they can work out that bears are more likely to live in and therefore be encountered in particular environments, tend to be more aggressive when you exhibit certain behaviors and odors, can be warned of in advance by droppings and tree markings, don’t show up at all in the winter for some reason, and so on.

In other words, every event that seems totally random is actually part of a pattern that you can learn. In a state of nature, you should always assume that every event has legible, understandable causes. Lightning is more likely to strike tall trees standing alone. The air feels different before it rains. Cooking food over a fire makes it taste better most of the time, and here’s what to do with the exceptions. Learning the pattern behind these patterns rewards you with a better chance of survival. If the tribe you find yourself in learns to trust the expertise of people like you, it has a correspondingly better chance of flourishing. For millennia, survival of the fittest has selected for minds and societies that see patterns in everything.

What biases might you expect to see in organisms that have evolved to recognize patterns? If you look at misconceptions most people have about randomness, a common theme is that they expect random samples to not be independent. The gambler’s fallacy is expecting an outcome, like red, that hasn’t come up recently to be more likely because it’s “due”. The opposite expectation can be seen in sports: the belief that if a player’s made a few shots in a row, they’re on a hot streak and more likely to make the next one. It feels like knowing past results should somehow influence your predictions of future results.

Sometimes people get mad about video games. One thing they might get mad about is if something that has a 50% chance of success fails five times in a row. This is clearly unfair and rigged. Sitting back here in our armchairs, we can dispassionately calculate the odds of this at 1/32, or 3% for every distinct set of five flips, and see that we should expect outcomes like this pretty often if you’re flipping that coin 500 times. So one thing games often implement to bring randomness more in alignment with expectations is a streakbreaker that nudges the odds in the player’s favor on repeated failures. A streakbreaker for successes is never considered, because knowing of its mere existence would cause players to claim the RNG is clearly unfair and rigged, and people don’t tend to notice anything strange if they succeed at five 50% flips in a row.
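A streakbreaker along those lines can be sketched in a few lines of JavaScript. This is a generic illustration, not any particular game’s implementation; the function names and the specific nudge amount are my own. Each consecutive failure bumps the effective success chance upward, and any success resets it to the advertised odds.

```javascript
// Sketch of a failure-only streakbreaker. baseChance is the advertised
// probability; bonusPerFailure is the hidden nudge added after each
// consecutive miss. Passing rng explicitly keeps the sketch testable.
function makeStreakbreaker(baseChance, bonusPerFailure, rng = Math.random) {
  let failures = 0;
  return function roll() {
    const effective = Math.min(1, baseChance + failures * bonusPerFailure);
    if (rng() < effective) {
      failures = 0; // success: back to the advertised odds
      return true;
    }
    failures += 1; // failure: the next roll gets a little easier
    return false;
  };
}

// With a 50% base and a 10% nudge, a losing streak can never reach six:
// by the sixth roll the effective chance has climbed to 100%.
```

Note the asymmetry baked into the design: successes reset the counter but never accrue a penalty, precisely because players only notice the streaks that hurt them.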

In conclusion, I think there’s a huge untapped market opportunity in certifiably organic random number generators. Just imagine, an octopus that predicts