The Intrinsic Perspective

By Erik Hoel. About consilience: breaking down the disciplinary barriers between science, history, literature, and cultural commentary.

The Internet You Missed: A 2025 Snapshot

2025-08-13 22:45:16

There are many internets. There are internets that are bright and clean and whistling fast, like the trains in Tokyo. There are internets filled with serious people talking as if in serious rooms, internets of gossip and heart emojis, and internets of clowns. There are internets you can only enter through a hole under your bed, an orifice into which you writhe.

It’s a chromatic thing that can’t hold a shape for more than an instant. But every year, I get to see the internet through the eyes of subscribers to The Intrinsic Perspective. The community submits writing it has published online, and I curate and share it.

The quality was truly exceptional this year. The pieces all speak for themselves and can be approached on their own terms, so I organized them to highlight how each is worth reading, thinking about, disagreeing with, or simply enjoying; at the very least, they are worth browsing through at your leisure, to find hidden gems of writers to follow.

Please note that:

  • I cannot fact-check each piece, nor is including it an official endorsement of its contents.

  • Descriptions of each piece, in italics, were written by the authors themselves, not me (but sometimes adapted for readability). What follows is from the community. I’m just the curator here.

  • I personally pulled excerpts and images from each piece after some thought, to give a sense of them.

  • If you submitted something and it’s missing, note that it’s probably in an upcoming Part 2.

So here is their internet, or our internet, or at least, the shutter-click frozen image of one possible internet.


1. “Wisdom of Doves” by Doctrix Periwinkle.

Evolved animal behaviors are legion, so why do we choose the examples we do to explain our own?

According to psychologist Jordan Peterson, we are like lobsters. We are hierarchical and fight over limited resources….

Dr. Peterson is a Canadian, and he is describing the North Atlantic American lobster, Homarus americanus. Where I live, lobsters are different.

For instance, they do not fight with their claws, because they do not have claws…. Because they do not have claws, spiny lobsters (Panulirus argus) are preyed upon by tropical fish called triggerfish…. The same kind of hormone signaling that made American lobsters exert dominance and fight each other causes spiny lobsters to cluster together to fight triggerfish, using elaborately coordinated collective behavior. Panulirus lobsters form choreographed “queues,” “rosettes,” and “phalanxes” to keep each other safe from the triggerfish foe. Instead of using claws to engage in combat with other lobsters, spiny lobsters use their antennules—the spindly homologues of claws seen in the photograph above—to keep in close contact with their friends….

If you are a lobster, what kind of lobster are you?


2. “We Know A Good Life When We See It” by Matt Duffy.

A reflection on how fluency replaced virtue in elite culture, and why recovering visible moral seriousness is essential to institutional and personal coherence.

We’ve inherited many of the conditions that historically enabled virtue—stability, affluence, access, mobility—but we’ve lost the clarity on virtue itself. The culture of technocratic primacy rewards singularity: total, often maniacal, dedication to one domain at the expense of the rest…. Singular focus is not a human trait. It is a machine trait. Human life is fragmented on purpose. We are meant to be many things: friend, worker, parent, neighbor, mentor, pupil, citizen.


3. “The Vanishing of Youth” by Victor Kumar, published in Aeon.

Population decline means fewer and fewer young people, which will lead to not just economic decay but also cultural stagnation and moral regress.

Sometimes I’m asked (for example, by my wife) why I don’t want a third child. ‘What kind of pronatalist are you?’ My family is the most meaningful part of my life, my children the only real consolation for my own mortality. But other things are meaningful too. I want time to write, travel and connect with my wife and with friends. Perhaps I’d want a third child, or even a fourth, if I’d found my partner and settled into a permanent job in my mid-20s instead of my mid-30s… Raising children has become enormously expensive – not just in money, but also in time, career opportunities and personal freedom.


4. “Three tragedies that shape human life in the age of AI and their antidotes”, by brothers Manh-Tung Ho & Manh-Toan Ho, published in the journal AI & Society.

In this paper, we [the authors] discuss some problems arising in the AI age, then draw from both Western and Eastern philosophical traditions to sketch out some antidotes. Though it appeared in a scientific journal, it ran in a specific section called Curmudgeon Corner, which the journal describes as “a short opinionated letter to the editor on trends in technology, arts, science and society, commenting emphatically on issues of concern to the research community and wider society, with no more than 3 references and 2 co-authors.”

The tragedy of the commons is the problem of intra-group conflict driven by a lack of cooperation (and communication): when each individual purely follows his/her own best interest (e.g., raising more cattle to feed on the commons), the collective good is undermined (e.g., the commons will be over-grazed). Thus, we define the AI-driven tragedy of the commons as the short-term economic/psychological gains that drive the development, launch, and use of half-baked AI products and AI-generated content producing superficial information and knowledge, which ends up harming the individual and the collective in the long term.


5. "Of Mice, Mechanisms, and Dementia" by Myka Estes.

Billions spent, decades lost: the cautionary tale of how Alzheimer’s research went all-in on a bad bet.

Another way to understand how groundbreaking these results were thought to be at the time is to simply follow the money. Within a year, Athena Neurosciences, where Games worked, was acquired by Elan Corp. for a staggering $638 million. In the press release announcing the merger, Elan proclaimed that the acquisition “provides the opportunity for us to capitalize on an important therapeutic niche, by combining Athena’s leading Alzheimer’s disease research program with Elan’s established development expertise.” The PDAPP mouse had transformed from laboratory marvel to the cornerstone of a billion-dollar strategy.

But, let’s peer ahead to see how that turned out. By the time Elan became defunct in 2013, they had sponsored not one, not two, but four failed Alzheimer's disease therapeutics, all based on the amyloid cascade hypothesis, hemorrhaging $2 billion in the process. And they weren't alone. Pharmaceutical giants, small biotechs, and research organizations and foundations placed enormous bets on amyloid—bets that, time and again, failed to pay off.


6. “Schrödinger's Chatbot” by R.B. Griggs.

Is an LLM a subject, an object, or some strange new thing in between?

It would be easy to insist that LLMs are just objects, obviously. As an engineer I get it—it doesn’t matter how convincing the human affectations are, underneath the conversational interface is still nothing but data, algorithms, and matrix multiplication. Any projection of subject-hood is clearly just anthropomorphic nonsense. Stochastic parrots!

But even if I grant you that, can we admit that LLMs are perhaps the strangest object that has ever existed?


7. "A Prodigal Son" by Eva Shang.

My journey back to Christianity and why it required abandoning worship of the world.

How miserable is it to believe only in the hierarchy of men? It’s difficult to overstate the cruelty of the civilization that Christianity was born into: Roman historian Mary Beard describes how emperors would intentionally situate blind, crippled, or diseased poor people at the edges of their elaborate banquets to serve as a grotesque contrast to the wealth and health of the elite. The strong did what they willed and the weak suffered what they must. Gladiatorial games transformed public slaughter into entertainment. Disabled infants were left to die in trash heaps or on hillsides. You see why the message of Christ spread like wildfire. What a radical proposition it must have been to posit the fundamental equality of all people: that both the emperor and the cripple are made in the image of God.


8. “Why Cyberpunk Matters” by C.W. Howell.

Though the genre is sometimes thought dated, cyberpunk books, movies, and video games are still relevant. They form a last-ditch effort at humanism in the face of machine dominance.

So, what is it that keeps drawing us to this genre? It is more, I believe, than simply the distinct aesthetic…. It reflects, instead, a deep-seated and long-standing anxiety that modern people feel—that our humanity is at stake, that our souls are endangered, that we are being slowly turned into machines.


9. “You Are So Sensitive” by Trevy Thomas.

This piece is about the 25 percent of our population, the author included, who have a higher sensitivity to the world around us, with both good and bad effects.

As a young girl, I could ride in a car with my father and sing along to every radio song shamelessly loud. He was impressed that I knew all the words even as the musician in him couldn’t help but critique the song itself. “Why does every song have the word ‘baby’ in it?” he’d ask. But then I got to a point where I’d leave a store or promise never to return to a restaurant because of the music I’d heard in it. Some song from that place would be so lodged in my brain that it would wake me in the middle of the night two weeks later…. about a quarter of the population—humans and animals alike—have this increased level of sensitivity. It can show up in various forms, including sensitivity to sound, light, smell, and stimulation.


10. “Solving Popper's Paradox of Tolerance Before Intolerance Ends Civilization” by Dakara.

A solution to preserving the free society without invoking the conflict of Popper's Paradox.

… Are we now witnessing the end of tolerant societies? Is this the inevitable result that eventually unfolds once an intolerant ideology enters the contest for ideas and the rights of citizens?…

Have we already reached the point where the opposing ideologies are using force against the free society? They censor speech, intervene in the employment of those they oppose, and will utilize physical violence for intimidation.


11. “Knowledge 4.0” by Davi.

From gossip to machine learning - how we bypassed understanding.

Speech allowed us to transmit knowledge among humans, the written word enabled us to broadcast it across generations, and software removed the cost of accessing that knowledge, while turbocharging our ability to compose any piece of knowledge we create with the existing humanity-level pool. What we now call machine learning came to remove one of the few remaining costs in our quest of conquering the world: creating knowledge. It is not just that workers will lose their jobs in the near future; this is the revolution that will make obsolete much of our intellectual activity for understanding the world. We will be able to craft planes without ever understanding why birds can fly.


12. “Problematic Badass Female Tropes” by Jenn Zuko.

An overview of the seven-part PBFT series, which covers the bait-and-switch of women characters who are supposed to be strong but end up subservient or weak instead.

The problem that becomes apparent here (as I’m sure you’ve noticed even in only this first folktale example), is that in today’s literature and entertainment, these strong, independent women characters we read about in old stories like Donkeyskin and clever Catherine are all too often subverted, altered, and weakened; either in subtle ways or obvious ways, especially by current pop culture and Hollywood.


13. "The West is Bored to Death" by Stuart Whatley, published in The New Statesman.

An essay on the classical "problem of leisure," and how a society/culture that fails to cultivate a leisure ethic ends up in trouble.

Developing a healthy relationship with free time does not come naturally; it requires a leisure ethic, and like Aristotelian virtue, this probably needs to be cultivated from a young age. Only through deep, sustained habituation does one begin to distinguish between art and entertainment, lower and higher pleasures, titillation and the sublime.


14. “MAGA As The Liberal Shadow” by Carlos.

In a very real sense, liberalism is the root cause of MAGA, and it's very important to understand this to see a way forward.

It’s no wonder that I feel liberalism as the source of this eternal no: it is liberals who define the collective values of our culture, as it is the cities that produce culture, and the cities are liberal. So the voice of the collective in my head, is a liberal. My little liberal thought cop, living in my head.

4chan is great because you get to see what happens when someone evicts the liberal cop, the shadow run rampant. Sure, all sorts of very naughty emotions get expressed, and it is quite a toxic place, but it’s like a great sigh, finally, you can unwind, and say whatever the fuck you want, without having to take anyone else’s feelings into account.


15. “The Blowtorch Theory: A New Model for Structure Formation in the Universe” by Julian Gough.

The James Webb Space Telescope has opened up a striking and unexpected possibility: that the dense, compact early universe wasn't shaped slowly and passively by gravity alone, but was instead shaped rapidly and actively by sustained, supermassive black hole jets, which carved out the cosmic voids, shaped the filaments, and generated the magnetic fields we see all around us today.

An evolved universe, therefore, constructs itself according to an internal, evolved set of rules baked deep into its matter, just as a baby, or a sprouting acorn, does.

The development of our specific universe, therefore, since its birth in the Big Bang, mirrors the development of an organism; both are complex evolved systems, where (to quote the splendid Viscount Ilya Romanovich Prigogine), the energy that moves through the system organises the system.

But universes have an interesting reproductive advantage over, say, animals.


16. “Tea” by Joshua Skaggs.

Joshua Skaggs, a single foster dad, has a 3 a.m. chat with one of his kids.

My second night as a foster dad I wake in the middle of the night to the sound of footsteps. I throw on a t-shirt and find him pacing the living room, a teenager in basketball shorts and a baggy t-shirt….

“I broke into your closet,” he says.

“Oh yeah?” I say….

“I looked at all your stuff,” he says. “I thought about drinking your whiskey, but then I thought, ‘Nah. Josh has been good to me.’ So I just closed the door.”

I’m not sure what to say. I eventually land on: “That’s good. I’m glad you didn’t take anything.”

“It was really easy to break into,” he says. “It only took me, like, three seconds.”

“Wow. That’s fast.”

“I’m really good at breaking into places.”


17. “Notes in Aid of a Grammar of Assent” by Amanuel Sahilu.

Through the twin lenses of literature and science, I take a scanning look at the human tendency to detect and discern personhood.

This is all to say, a main reason for modern skepticism toward serious personification is that we think it’s shoddy theorizing….

But I think few moderns reject serious personification on such rational grounds. It may be just as likely we’re driven to ironic personification after adjusting to the serious form as children, when we’re first learning about language and the world. Then as we got older the grown-ups did a kind of bait-and-switch, and serious personification wasn’t allowed anymore.


18. “Book Review: Griffiths on Electricity & Magnetism” by Tim Dingman.

In adulthood I have read many STEM textbooks cover-to-cover. These are textbooks that are supposed to be standards in their fields, yet most of them are not great reading. The median textbook is more like a reference manual with practice problems than a learning experience.

Given the existence and popularity of nonfiction prose on any number of topics, isn’t it odd that most textbooks are so far from good nonfiction? We have all the pieces, why can’t we put them together? Or are textbooks simply not meant to be read?

Certainly most students don’t read them that way. They skim the chapters for equations and images, mostly depend on class to teach the ideas, then break out the textbook for the problem set and use the textbook as reference material. You don't get the narrative that way.

Introduction to Electrodynamics by David Griffiths is the E&M textbook. We had it in my E&M class in college…. Griffiths is so readable that you can read it like a regular book, cover to cover.


19. “Fine Art Sculpture in the Age of Slop” by Sage MacGillivray.

Exploring analogue wisdom in a digital world: Lessons from a life in sculpture touching on brain lateralization, deindustrialization, Romanticism, AI, and more.

… As Michael Polanyi pointed out, it only takes a generation for some skills to be lost forever. We can’t rely on text to retain this knowledge. The concept of ‘stealing with your eyes’, which is common in East Asia, points to the importance of learning by watching a master at work. Text (and even verbal instruction) is flattening….

These days, such art studio ‘laboratories’ are hard to find. Not only is the environment around surviving studios more sterile and technocratic, but artists increasingly outsource their work to a new breed of big industry: the large art production house. A few sketches, a digital model, or perhaps a maquette — a small model of the intended work — are shared with these massive full-service shops that turn sculpture production from artistic venture into contract work. As the overhead cost of running a studio has increased over time, this big-shop model of outsourcing is often the only viable model for artists who want to produce work at scale….

And just like a big-box retailer can wipe out the local hardware store, the big shop model puts pressure on independent studios that train workers in an artisanal mode and allow the artist to evolve the artwork throughout the production process.


20. “Setting the Table for Evil” by Reflecting History.

About the role that ideology played in the rise and subsequent atrocities of Nazi Germany, and the historical debate between situationism and ideology in explaining evil throughout history.

Some modern “historians” have sought to uncouple Hitler’s ideology from his actions, instead seeking to paint his “diplomacy” and war making as geopolitical reactions to what the Allies were doing. But Hitler’s playbook from the beginning was to connect the ideas of racist nationalism and extreme militarism together, allowing each to justify the existence of the other. Nazi Germany’s war was more than just geopolitical strategic war-making chess, it was conquest and subjugation of racial enemies. The British leadership were “Jewish mental parasites,” the conquest of Poland was to “proceed with brutality!… the aim is the removal of the living forces...,” the invasion of the Soviet Union sought to eliminate “Jewish Bolsheviks,” the war with the United States was fought against President Roosevelt and his “Jewish-plutocratic clique.” Hitler applied his ideology to his conquest and subjugation of dozens of countries and peoples in Europe. He broke nearly every international agreement he ever made, and viewed treaties and diplomacy as pieces of paper to be shredded and stepped over on the way to power. Anyone paying attention to what Hitler said or did in 1923 or 1933 or 1943 had to reckon with the fact that Hitler’s ideology informed everything he did.


21. “Which came first, the neuron or the feeling?” by Kasra.

A reverie on the history and philosophy behind the mind-body problem.

[Image: every neuron in a fruit fly brain]

… I do know that life gets richer when you contemplate that either one of these—the neuron and the feeling—could be the true underlying reality. That your feelings might not just be the deterministic shadow of chemicals bouncing around in your brain like billiard balls. That perhaps all self-organizing entities could have a consciousness of their own. That the universe as a whole might not be as dark and cold and empty as it seems when we look at the night sky. That underneath that darkness might be the faintest glimmer of light. Of sentience. A glimmer of light which turns back on itself, in the form of you, asking the question of whether the neuron comes first or the feeling.


22. “Dying to be Alive: Why it's so hard to live your unlived life and how you actually can” by Jan Schlösser.

Exploring the question of why we all act as if we were immortal, even though we all know on an intellectual level that we're going to die.

Becker states that humans are the only species who are aware of their mortality.

This awareness conflicts with our self-preservation instinct, which is a fundamental biological instinct. The idea that one day we will just not exist anymore fills us with terror – a terror that we have to manage somehow, lest we run around like headless chickens all day (hence ‘terror management’).

How do we manage that terror of death?

We do it in one of two ways:

  1. Striving for literal or symbolic immortality

  2. Suppressing our awareness of our mortality


23. “Thirst” by Vanessa Nicole.

Connecting Viktor Frankl’s idea of “the existence of thirst implies the existence of water,” to choosing to live with idealism and devotion.

This is, essentially, how I define being idealistic: a devotion to thirst and belief in the existence of water. To me, idealism isn’t about a hope for a polished utopia—it’s in believing that fulfillment can transform, from an abstract emptiness into the pleasantly refreshed taste in your mouth. (And anyway, there’s a whole universe between parched and utopia.)


24. “A god-sized hole” by Iuval Clejan.

A modern interpretation of Pascal's presumptuous phrase (about a god-sized hole).

People get to feel good about themselves by working hard at something that they get paid for. It also gives them social legitimacy. For some it offers a means of connection with other humans that is hard to achieve outside of work and church. For a few lucky ones it offers a way to express talent and passion. But for most it is an attempt to fill the tribe-, family-, and village-sized holes of their souls.


25. “Have 'Quasi-Inverted Spectrum' Individuals Fallen into Our World, Unbeknownst to Us?” by Ning DY.

Drawing on inconsistencies in neuroimaging and a re-evaluation of first-person reports, this essay argues that synesthesia may not be a cross-activation of senses, but rather a fundamental, 'inverted spectrum-like' phenomenon where one sensory modality's qualia are entirely replaced by another's due to innate properties of the cortex.

I wonder, have we really found individuals similar to those in John Locke's 'inverted spectrum' thought experiment (though different from the original, as this is not a symmetrical swap but rather one modality replacing another)? Imagine if, from birth, our auditory qualia disappeared and were replaced by visual qualia, changing the experienced qualia just as in the original inverted spectrum experiment. How would we describe the world? Naturally, we would use visual elements to name auditory elements, starting from the very day we learned to speak. As for the concepts described by typical people, like pitch, timbre, and intensity, we would need to learn them carefully to cautiously map these concepts to the visual qualia we "hear." Perhaps synesthetes also find us strange, wondering why we give such vastly different names to two such similar experiences?


26. “Elementalia: Chapter I Fire” by Kanya Kanchana.

Drawing from the vast store of our collective imagination across mythology, philosophy, religion, literature, science, and art, this idiosyncratic, intertextual, element-bending essay explores the twined enchantments of fire and word.

My legs and feet are bare—no cloth, no metal, not even nail polish. Strangely, my first worry is that it feels disrespectful to step on life-giving fire. Then I see a mental image of a baby in his mother’s arms, wildly kicking about—but she’s smiling. I better do this before I think too much. I step on the coals. I feel a buzz go up my legs like invisible electric socks but it doesn’t burn. It doesn’t burn.

I don’t run; I walk. I feel calm. I feel good. When I get to the other side, I grin at my friends and turn right around. I walk again.


27. “When Scientists Reject the Mathematical Foundations of Science” by Josh Baker.

By directly observing emergent mechanical behaviors in muscle, I have discovered the basic statistical mechanics of emergence, which I describe in a series of posts on Substack.

Over the past several years, six of these manuscripts were triaged back-to-back by editors at PNAS. Other lower-tier journals rejected them for reasons ranging from “it would overturn decades of work” and “it’s wishful thinking” to reasons unexplained. An editorial decision at the journal Entropy flipped from provisional accept to reject, followed by radio silence from the journal.

A Biophysical Journal advisory board rejected some of these manuscripts. In one case, an editor explained that a manuscript was rejected not because the science was flawed, but because the reviewers they would choose would reject it with near certainty.


28. "The Tech is Terrific, The Culture is Cringe" by Jeff Geraghty.

A fighter test pilot and Air Force General answers a challenge put to him directly by Elon Musk.

On a cool but sunny day in May of 2016, in his SpaceX facility in Redmond, Washington, Elon Musk told me that he regretted putting so much technology into the Tesla Model X. His newest model was rolling out that year, and his personal involvement with the design and engineering was evident. If he had it to do over again, he said, he wouldn’t put so much advanced technology into a car….

Since that first ride, I’ve been watching the car drive for almost a year now, and I’m still impressed…

My daughter, however, wouldn’t be caught dead in it. She much prefers to ride the scratched-up old Honda Odyssey minivan. She has an image to uphold, after all.


29. “The Lamps in our House: Reflections on Postcolonial Pedagogy” by Arudra Burra.

In this sceptical reflection on the idea of 'decolonizing' philosophy, I question the idea that we should think of the 'Western philosophical tradition' as in some sense the exclusive heritage of the modern West; I connect this with what I see as certain regrettable nativist impulses in Indian politics and political thought.

I teach philosophy at the Indian Institute of Technology-Delhi. My teaching reflects my training, which is in the Western philosophical tradition: I teach PhD seminars on Plato and Rawls, while Bentham and Mill often figure in my undergraduate courses.

What does it mean to teach these canonical figures of the Western philosophical tradition to students in India?… Some of the leading lights of the Western canon have views which seem indefensible to us today: Aristotle, Hume, and Kant, for instance. Statues of figures whose views are objectionable in similar ways have, after all, been toppled across the world. Should we not at least take these philosophers off their pedestals? …

The Indian context generates its own pressures. A focus on the Western philosophical tradition, it is sometimes thought, risks obscuring or marginalising what is of value in the Indian philosophical tradition. Colonial attitudes and practices might give us good grounds for this worry; recall Macaulay’s famous lines, in his “Minute on Education” (1835), that “a single shelf of a good European library [is] worth the whole native literature of India and Arabia.”


30. “What Happens When We Gamify Reading” by Mia Milne.

How reading challenges led me to prioritize reading more over reading deeply and how to best take advantage of gamification without getting swept away by the logic of the game.

The attention economy means that we’re surrounded by systems designed to suck up our focus to make profit for others. Part of the reason gamification has become so popular is to help people do the things they want to do rather than only do the things corporations want them to do.


31. “Pan-paranoia in the USA” by Eponynonymous.

A brief history of the "paranoid style" of American politics through a New Romantic lens.

As someone who once covered the tech industry, I join in Ross Barkan’s wondering what good these supposed marvels of modern technology—instantaneous communication, dopamine drips of screen-fed entertainment, mass connectivity—have really done for us. Are we really better off? ….

But we are also facing a vast and deepening suspicion of power in all forms. Those suspicions need not be (and rarely are) rationally obtained. The old methods of releasing societal pressures—colonialism, western expansionism, post-war consumerism—have atrophied or died. It should come as no surprise when violence manifests in their place.

GPT-5's debut is slop; Will AI cause the next depression? Harvard prof warns of alien invasion; Alpha School & homeschool heroes

2025-08-06 23:26:15

The Desiderata series is a regular roundup of links and thoughts for paid subscribers, and an open thread for the community.



Contents:

  1. GPT-5’s debut is slop.

  2. 10% of all human experience took place since the year 2000.

  3. Education is a mirror. What’s Alpha School’s reflection?

  4. The rise of the secular homeschool superheroes.

  5. “The Cheese that Gives you Nightmares.”

  6. Avi Loeb at Harvard warns of alien invasion.

  7. Moths as celestial navigators.

  8. Will AI cause the next depression?

  9. From the archives.

  10. Comment, share anything, ask anything.


1. GPT-5’s debut is slop.

GPT-5’s launch is imminent. Likely tomorrow. We also have the first confirmed example of an output known for sure to be from GPT-5, which was shared by Sam Altman himself as a screenshot on social media. He asked GPT-5 “what is the most thought-provoking show about AI?”

[Screenshot of GPT-5’s ranked list of show recommendations]

Hmmm.

Hmmmmmmmmm.

Yeah, so #2 is a slop answer, no?

Maybe even arguably a hallucination. Certainly, that #2 recommendation, the TV show Devs, does initially seem like a good answer to Altman’s question, in that it is “prestige sci-fi” and an overall high-quality show. But I’ve seen Devs. I’d recommend it myself, in fact (streaming on Hulu). Here’s the thing: Devs is not a sci-fi show about AI! In no way, shape, or form, is it a show about AI. In fact, it’s refreshing how not about AI it is. Instead, it’s a show about quantum physics, free will, and determinism. This is the main techno-macguffin of Devs: a big honking quantum computer.

[Image: the quantum computer at the heart of Devs (spoilers for the first 30 minutes?)]

As far as I can remember, the only brief mention of AI is how, in the first episode, the protagonist is recruited away from the company’s internal AI division to go work on this new quantum computing project. Now, what’s interesting is that GPT-5 does summarize the show appropriately as being about determinism and free will and existential tension (and, by implication, not about AI). But its correct summary makes its error of including Devs on the list almost worse, because it shows off the same inability to self-correct that LLMs have struggled with for years now. GPT-5 doesn’t catch the logical inconsistency of giving a not-AI-based description of a TV show, despite being specifically asked for AI-based TV shows (there’s not even a “This isn’t about AI, but it’s a high-quality show about related subjects like…”). Meaning that this output, the very first I’ve seen from GPT-5, feels extremely LLM-ish, falling into all the old traps. Its fundamental nature has not changed.

This is why people still call it a “stochastic parrot” or “autocomplete,” and it’s also why such criticisms, even if weaker now, can’t be entirely dismissed. Even at GPT-5’s incredible level of ability, its fundamental nature is still that of autocompleting conversations. In turn, autocompleting conversations leads to slop, exactly like giving Devs as a recommendation here. GPT-5 is secretly answering not Altman’s question, but a different question entirely: when autocompleting a conversation about sci-fi shows and recommendations, what common answers crop up? Well, Devs often crops up, so let’s list Devs here.

Judge GPT-5’s output by honest standards. If a human said to me “There’s this great sci-fi show about AI, you should check it out, it’s called Devs,” and then I went and watched Devs, I would spend the entire time waiting for the AI plot twist to make an appearance. At the series end, when the credits rolled, I would be 100% certain that person was an idiot.



2. 10% of all summed human experience took place since the year 2000.

According to a calculation by blogger Luke Eure, 50% of human experience (total experience hours by “modern humans”) has taken place after 1300 AD.

Which would mean that 10% of collective human experience has occurred since the year 2000! It also means that most of us now alive will live, or have lived, alongside a surprisingly large chunk of when things are happening (at least, from the intrinsic perspective).
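To make the arithmetic concrete, here is a rough back-of-envelope sketch (in Python) of how this kind of estimate can be built. The population milestones below are generic historical figures I have assumed for illustration, not Eure's actual inputs; and since waking hours per person-year is roughly a constant multiplier, it cancels out, so integrating population over time is enough.

# A rough sketch of this kind of estimate. The population milestones are
# generic historical figures assumed for illustration; they are NOT Luke
# Eure's actual inputs. Waking hours per person-year is a roughly constant
# multiplier, so it cancels and we can integrate population directly.

milestones = [  # (year, estimated world population); negative years are BC
    (-50000, 1e6), (-10000, 4e6), (-5000, 20e6), (-1000, 115e6),
    (1, 170e6), (1000, 265e6), (1300, 360e6), (1500, 440e6),
    (1700, 600e6), (1800, 1.0e9), (1900, 1.65e9), (1950, 2.5e9),
    (2000, 6.1e9), (2025, 8.1e9),
]

def person_years(start, end):
    """Total person-years lived in [start, end], integrating population
    with linear interpolation between milestones (trapezoid rule)."""
    total = 0.0
    for (y0, p0), (y1, p1) in zip(milestones, milestones[1:]):
        lo, hi = max(start, y0), min(end, y1)
        if lo >= hi:
            continue  # this segment lies outside the requested window
        interp = lambda y: p0 + (p1 - p0) * (y - y0) / (y1 - y0)
        total += 0.5 * (interp(lo) + interp(hi)) * (hi - lo)
    return total

all_time = person_years(-50000, 2025)
print(f"since 1300: {person_years(1300, 2025) / all_time:.0%}")  # ~50%
print(f"since 2000: {person_years(2000, 2025) / all_time:.0%}")  # ~10%

With these assumed milestones the two shares come out to roughly 50% and 10%, matching the shape of Eure's result; the exact figures depend entirely on the population estimates you feed in.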


3. Education is a mirror. What’s Alpha School’s reflection?

In the education space, the buzz right now is around Alpha School. Their pitch (covered widely in the media) is that they do 2 hours of learning a day with an “AI tutor.”

More recently, The New York Times profiled them:

At Alpha’s flagship, students spend a total of just two hours a day on subjects like reading and math, using A.I.-driven software. The remaining hours rely on A.I. and an adult “guide,” not a teacher, to help students develop practical skills in areas such as entrepreneurship, public speaking and financial literacy.

I’ll say upfront: I do believe that 2 hours of learning a day, if done well, could be enough for an education. I too think kids should have way more free time than they do. So there is something to the model of “2 hours and done” that I think is attractive.

But I have some questions, as I was one of the few actual attendees at the first “Alpha Anywhere” live info session, which revealed details of how their new program for homeschoolers works. Having seen more of it, Alpha School appears to be based on progressing through pre-set educational apps, and doesn’t often involve AI-as-tutor-qua-tutor (i.e., interacting primarily with an AI like ChatGPT). While the Times says that

But Alpha isn’t using A.I. as a tutor or a supplement. It is the school’s primary educational driver to move students through academic content.

all I saw was one use case, which was AI basically making adaptive reading comprehension tests on the fly (I think that specifically is actually a bad idea, and it looked like reading boring LLM slop to me).

For this reason, the more realistic story behind Alpha School is not “Wow, this school is using AI to get such great results!” but rather that Alpha School is “education app stacking” and there are finally good enough, and in-depth enough, educational apps to cover most of the high school curriculum in a high-quality and interactive way. That’s a big and important change! E.g., consider this homeschooling mom, who points out that she was basically replicating what Alpha School is doing by using a similar set of education apps.

Most importantly, and likely controversially, Alpha School pays the students to progress through the apps via an internal currency that can be redeemed for goodies (oddly, this detail is left out of the analysis of places like the Times—but hey, it’s “the paper of record,” right?).

My thoughts are two-fold. First, I do think it’s true that ed-apps have gotten good enough to replace a lot of the core curriculum and allow for remarkable acceleration. Second, I think it’s a mistake to separate the guides from the learning itself. That is, it appears the actual academics at Alpha School are self-contained, as if in a box; there’s a firewall between the intellectual environment of the school and what’s actually being learned during those 2 hours on the apps. Not to say that’s bad for all kids! Plenty of kids ultimately are interested in things beyond academics, and sequestering the academics “in a box” isn’t necessarily bad for them.

However, it’s inevitable that this disconnect makes the academics fundamentally perfunctory (to be fair, this is true for a lot of traditional schools as well). As I once wrote about the importance of human tutors:

Serious learning is socio-intellectual. Even if the intellectual part were to ever get fully covered by AI one day, the “socio” part cannot… just like how great companies often have an irreducibly great culture, so does intellectual progress, education, and advancement have an irreducible social component.

Now, I’m sure that Alpha School has a socio-intellectual culture! It’s just that the culture doesn’t appear to be about the actual academics learned during those 2 hours. And that matters for what the kids work on and find interesting themselves. E.g., in the Times we get an example of student projects like “a chatbot that offers dating advice,” and in Fox News another example was an “AI dating coach for teenagers,” and one of the cited recent accolades of Alpha School students is placing 2nd in some new high school competition, the Global AI Debates.

At least in terms of the public examples, a lot of the most impressive academic/intellectual successes of the kids at Alpha School appear to involve AI. Why? Because the people running Alpha School are most interested in AI!

And now apply that to everything: that’s true for math, and literature, and science, and philosophy. So then you can see the problem: the disconnect between the role models and the academics. If the Alpha School guides and staff don’t really care about math—if it’s just a hurdle to be overcome, just another hoop to jump through—why should the kids?

Want to know why education is hard? Harder than almost anything in the world? It’s not that education doesn’t work. Rather, the problem is that it works too well.

Education is a mirror.


Literacy lag: We start reading too late

2025-07-31 23:15:02

[Image: “The Reading Lesson,” by Leon Basile Perrault (1866)]

What is literacy lag?

Children today grow up under a tyrannical asymmetry: they are exposed to screens from a young age, yet only much later do we deign to teach them how to read. So the competition between screens and reading for the mind of the American child is fundamentally unfair. This is literacy lag.

Despite what many education experts would have you believe, literacy lag is not some natural or biological law. Children can learn to read very early, even in the 2-4 age range, but our schools simply take their sweet time teaching the skill; usually it is only in the 7-8 age range that independent reading for pleasure becomes a viable alternative to screens (and often more like 9-10, when the “4th grade slump” occurs as kids switch from academic exercises to actually reading to learn). Lacking other options, children must get their pre-literate media consumption from screens, which they form a lifelong habitual and emotional attachment to.

Nowadays, by the age of 6, about 62% of children in the US have a personal tablet of their own, and children in the 5-8 age range experience about 3.5 hours of screen time a day (increasingly short-form content, like YouTube Shorts and TikTok).

I understand why. Parenting is hard, if just because filling a kid’s days and hours and minutes and seconds is, with each tick of the clock, itself hard. However, I noticed something remarkable from teaching my own child to read. Even as a rowdy “threenager,” he got noticeably easier as literacy kicked in. His moments of curling up with a book became moments of rejuvenating parental calm. And I think this is the exact same effect sought by parents giving their kids tablets at that age.

Acting up in the car? Have you read this book? Screaming wildly because you’re somehow both overtired and undertired? Please go read a book and chill out!

This is because reading and tablets are directly competitive media for a child’s time.1 So while independent reading requires about a year of upfront work, at anywhere from 10-30 minutes a day, after that, early reading feels a lot like owning a tablet (and while reading is no panacea, neither are tablets).

The cultural reliance on screen-based media is not because parents don’t care. I think the typical story of a new American parent, a quarter of the way through this 21st century of ours, goes like this: initially, they do care about media exposure, and often read to their baby and young toddler regularly. This continues for 2-3 years. However, eventually the inconvenience of reading requiring two people pressures parents to switch to screens.2 The category of “not playing, and not doing a directed or already set up activity, but just quietly consuming media” is simply too large and deep for parents to fill just by reading books aloud. In fact, not providing screens can feel impoverishing, because young children have an endless appetite for new information.

Survey data support this story: parental reading to 2-year-olds has actually increased significantly since 2017, but kids in the 5-8 range get exposed to reading much less. Incredibly, the average 2-year-old is now more likely to be exposed to reading than the average 8-year-old!

Self reports also fit this story: parents acknowledge they do a better job at media use when it comes to their 2-year-olds compared to their 8-year-olds, and the drop-off is prominent during the literacy lag.

So despite American parents’ best efforts to prioritize reading over screen usage for their toddlers, due to our enforced literacy lag, being a daily reader is a trait easily lost early on, and then must be actively regained rather than maintained.


Once lost, reading often doesn’t recover. Even when surveyed from a skeptical perspective, reading is, almost everywhere, in decline.3 This is supported by testimonials from teachers (numerous op-eds, online threads, the entire horror show that is the /r/Teachers subreddit), as well as the shrinking of assigned readings into fragmented excerpts rather than actual books. At this point, only 17% of educators primarily assign whole books (i.e., the complete thoughts of authors), and some more pessimistic estimates put this percentage much lower, like how English Language Arts curricula based on reading whole books are implemented in only about 5% of classrooms. On top of all this, actual objective reading scores are now the lowest in decades.

I think literacy lag is a larger contributor to this than anyone suspects; we increasingly live in a supersensorium, so it matters that literature is fighting for attention and relevancy with one hand tied behind its back for the first 8 years of life.

So then…

Why are education experts so against early reading?

In a piece that could have been addressed to me personally, last month the LA Times published:

[Screenshot of the LA Times headline]

Hey!

While it doesn’t actually reference my growing guide on early reading (we’re doing early math next, so stay tuned), what this piece in the LA Times reveals is how traditional education experts have tied themselves up in knots over this question. E.g., the LA Times piece contains statements like this:

“Can a child learn individual letters at 2½ or 3? Sure. But is it developmentally appropriate? Absolutely not,” said Susan Neuman, a professor of childhood and literacy education at New York University.

Now, to give you a sense of scale here, Susan Neuman is a highly-cited researcher and, decades ago, worked on implementing No Child Left Behind. She also appears to think it’s developmentally inappropriate to teach a 3-year-old what an “A” is. And this sort of strange infantilization appears to be widespread.

“When we talk about early literacy, we don’t usually think about physical development, but it’s one of the key components,” said Stacy Benge, author of The Whole Child Alphabet: How Young Children Actually Develop Literacy. Crawling, reaching across the floor to grab a block, and even developing a sense of balance are all key to reading and writing, she said. “In preschool we rob them of those experiences in favor of direct instructions,” said Benge.


Yet is crawling across the floor to grab a block really the normal developmental purview of preschool? Kids in preschool are ambulatory. Bipedal. Possessing opposable thumbs, they can indeed pick up blocks. Preschool usually starts around the 3-4 age range, often requiring the child to be potty-trained. Preschoolers are entire little people with big personalities. Moreover, by necessity preschool is still mostly (although not entirely) play-based in terms of the learning and activities, if only because there is zero chance a room of 3-year-olds could sit at desks for hours on end.

This all seems off. Surely, there must be some robust science behind this fear of teaching reading too early?4 It turns out, no. It’s just driven by…

Neuromyths about early reading.

The LA Times piece leans heavily on the opinions of cognitive neuroscientist Maryanne Wolf, who is well-known for her work in education and the science of reading:

For the vast majority of children, research suggests that ages 5 to 7 are the prime time to teach reading, said Maryanne Wolf, director of the Center for Dyslexia, Diverse Learners and Social Justice at UCLA.

“I even think that it’s really wrong for parents to ever try to push reading before 5,” because it is “forcing connections that don’t need to be forced,” said Wolf.

Reading words off a page is a complex activity that requires the brain to put together multiple areas responsible for different aspects of language and thought. It requires a level of physical brain development called mylenation [sic] — the growth of fatty sheaths that wrap around nerve cells, insulating them and allowing information to travel more quickly and efficiently through the brain. This process hasn’t developed sufficiently until between 5 and 7 years old, and some boys tend to develop the ability later than girls.

If she had a magic wand, Wolf said she would require all schools in the U.S. to wait until at least age 6.

That’s a strong opinion! I wanted to know the scientific evidence, so I dusted off Maryanne Wolf’s popular 2007 book, Proust and the Squid: The Story and Science of the Reading Brain, from my library. The section “When Should a Young Child Begin to Read?” makes identical arguments to those that Wolf makes in the LA Times article, wherein myelination is cited as a reason to delay teaching reading. Wolf writes that:

The behavioral neurologist Norman Geschwind suggested that for most children myelination of the angular gyrus region was not sufficiently developed till school age, that is, between 5 and 7 years.... Geschwind’s conclusions about when a child's brain is sufficiently developed to read receive support from a variety of cross-linguistic findings.

Yet while Geschwind’s highly-cited paper is a classic of neuroscience, it is also 60 years old, highly dense, notoriously difficult to read, and ultimately contains mere anatomical observations and speculations, mostly about things far beyond these subjects. Nor do I find, after searching within it, a clear statement of this hypothesis as described. E.g., in one part, Geschwind seems to speculate that an underdeveloped angular gyrus is the cause of dyslexia, but this is not the same as saying that finished development is a requisite for reading in normal children. Instead, there is a part where he speculates that reading can be acquired after the ability to name colors, but naming colors can often occur quite early, and varies widely (e.g., many toddlers, though not all, can name colors well).

Regardless of whatever Geschwind actually believed, this 60-year-old paper would be a very old peg to hang a hat on. Modern studies don’t show myelination as a binary switch: e.g., temporal and angular gyri exhibit "rapid growth” between 1-2 years old, likely driven by myelination, and there is “high individual developmental variation” of myelination in general in the 2-5 age range, and also myelination, since it’s an anatomical expression of brain development, is responsive to learning itself.


Overall, theories positing cognitive closure based on myelin development (especially after the 1-2 age range) are not well-supported. This is because, brain-wide, the ramp up in myelination occurs mostly within the first ~500 days of life (before 2 years old), leveling off afterward to a gentle slope that can last for decades in some areas.

So then, what about the “cross-linguistic findings” that supposedly provide empirical support for a ban on early reading? Wolf writes in Proust and the Squid that:

The British reading researcher Usha Goswami drew my attention to a fascinating cross-language study by her group. They found across three different languages that European children who were asked to begin to learn to read at age five did less well than those who began to learn at age seven. What we conclude from this research is that the many efforts to teach a child to read before four or five years of age are biologically precipitate and potentially counterproductive for many children.

But the main takeaway from Goswami herself appears to be the opposite. Here is Goswami describing, in 2003, her work of the time:

Children across Europe begin learning to read at a variety of ages, with children in England being taught relatively early (from age four) and children in Scandinavian countries being taught relatively late (at around age seven). Despite their early start, English-speaking children find the going tough….

The main reason that English children lag behind their European peers in acquiring proficient reading skills is that the English language presents them with a far more difficult learning problem.

In other words, German and Finnish and so on are just easier languages to master than English, and phonics works more directly within them, so of course the kids in those countries have an easier time—and they start school later, too. As Goswami explicitly says, “it is the spelling system and not the child that causes the learning problem….”5

So no, teaching children to read at four or five, or even younger, is not “biologically precipitate.” It is also contradicted by the simple fact that…

Children used to learn to read at ages 2-4!

Here is a passage from the 1660 classic A New Discovery of the Old Art of Teaching Schoole by Charles Hoole, an English educator who was himself a popular education expert of his day (running a grammar school and writing monographs and books).

I observe that betwixt three and four years of age a childe hath great propensity to peep into a book, and then is the most seasonable time (if conveniences may be had otherwise) for him to begin to learn; and though perhaps then he cannot speak so very distinctly, yet the often pronounciation of his letters, will be a means to help his speech…

And his writings about toddler literacy (which, by the way, are based in phonics) contain anecdotes of parents teaching their children letters at age 2.5, and of children being able to read the dense and complex language of the Bible shortly after the age of 4. As across the pond, so here too. Rewind time to observe the early Puritans of America, and you would have found it common for mothers to teach their children earlier than we do now, using hornbooks and primers (it was Massachusetts law that parents had to teach their children to read).

Perhaps the most famous case of teaching very early reading, and the enduring popularity of the act, comes from Anna Laetitia Barbauld (1743-1825), who was a well-known essayist and poet and educator of her day, and wrote primers aimed at children under the instruction of their governess or mother. These primers “provided a model for more than a century.” English Professor William McCarthy, who wrote a biography of Anna Laetitia Barbauld, noted that her primers…

were immensely influential in their time; they were reprinted throughout the nineteenth century in England and the United States, and their effect on nineteenth- and early twentieth-century middle-class people, who learned to read from them, is incalculable.

These “immensely influential” primers possess very revealing titles.

  • Lessons for Children of 2 to 3 Years Old (1778)

  • Lessons for Children of 3 Years Old, Part I and Part II (1778)

  • Lessons for Children of 3 to 4 Years Old (1779).

Yup, that’s right! Some of the most famous and successful primers ever were explicitly designed for children in the 2-4 age range. Anna Barbauld wrote them so she could teach her nephew Charles how to read, and the ages in the titles track Charles’s own age—he really was 4 in 1779.

Originally printed “sized to fit a child’s hand,” these primers contain what would be considered today wildly advanced, almost unbelievable, prose for the 2-4 age range. Even just perusing the first volume I find irregular vowels and long sentences and other complexities: things more associated with, realistically, a modern 2nd grade level (assuming a good student, too). And so, even given an extra year or two as advantage (as admittedly, some of the same era thought Barbauld’s books were titled presumptuously, and recommended them instead for the 4-5 age range), there is probably a vanishingly small number of kids in the entire modern world who’d currently be Charles’ literary equals, and could read an updated version of this primer.6

The past, as they say, is a foreign country. Education practices, particularly the European tradition of “aristocratic tutoring,” were quite different. Back in 1869, Charlotte Mary Yonge wrote of Barbauld’s hero “little Charles” that the primers about him were particularly influential among the upper class and aristocracy:

Probably three fourths of the gentry of the last three generations have learnt to read by his assistance.7

Perhaps it’s a mirror to our own age, and early reading becoming reserved for “gentry” is what modern education experts actually fear, deep down. Their concerns are about equity, grades, and whether it’s okay to “push kids into the academic rat race.” I’m not dismissing such concerns, nor saying that debate is easily solvable. Rather, my point is that there’s an entire dimension to reading that’s been seemingly forgotten: in the end, reading isn’t about grades or test scores. It’s about how kids spend their time. That’s what matters. In some ways, it matters more than anything that ever happens in schools. And right now, literacy is losing an unfair race.

We appear to be entering a topsy-turvy world, where the future is here, just not distributed along the socioeconomic gradient you’d expect. It’s a world in which it is a privilege to grow up not with, but free of, the latest technology. And I’ve come to believe that learning to read, as early as possible, is a form of that freedom.

Besides, Barbauld’s introduction to her primers ends with the appropriate rejoinder to any gatekeeping of reading, by age or otherwise:

For to lay the first stone of a noble building, and to plant the first idea in a human mind, can be no dishonor to any hand.



If you want to check out my own guide for teaching early reading (aimed at getting kids reading for pleasure), see parts 1, 2, 3 and 4. I’m putting them all together into an updated monograph (coming soon).
1. That TV competes with reading has been called the “displacement hypothesis” in the education literature. It’s pretty obvious that the effect is even stronger for tablets. While literacy lag existed decades ago, it was less impactful, because the availability of entertainment was more limited and not personalized (e.g., Saturday morning cartoons in the living room vs. algorithmically-fine-tuned infinite Cocomelon on the go).

2. Admittedly, this dichotomy of “screen time” vs. reading is a simplification, because “screen time” is a big tent. Beautiful animated movies are screen time. Whale documentaries are screen time. Educational apps are screen time. But in the rarer studies that look specifically at things like reading for pleasure, it’s clear that using screens for personal entertainment (like the tablet usage I’m discussing here) is usually negatively correlated with [pick your trait or outcome].

3. The naysaying that reading is not in decline comes from education experts arguing that labels like “proficiency” on surveys represent a higher bar than people think, and that not being proficient doesn’t technically mean illiterate. Which is something, I suppose.

4. Shout out to Theresa Roberts, the only education expert quoted in the LA Times piece who goes against the majority opinion:

But there are also experts who say letter sounds should be taught to 3-year-olds in preschool. “Children at age 3 are very capable,” said Theresa Roberts, a former Sacramento State child development professor who researches early childhood reading.

And it doesn’t have to be a chore, she said. Her research found that 3- and 4-year-olds were “highly engaged” during 15-minute phonics lessons, and they were better prepared in kindergarten.

5. Wolf does mention that orthographic regularity is a confound in a later 2018 piece, but still draws the same conclusion from the research. Meanwhile, in a 2006 review published in Nature Reviews Neuroscience, “Neuroscience and education: from research to practice?”, Goswami herself doesn’t mention a biologically-based critical period for learning to read. Instead, using the example of synaptogenesis, she refers to ideas around such critical periods as “myths”:

The critical period myth suggests that the child’s brain will not work properly if it does not receive the right amount of stimulation at the right time… These neuromyths need to be eliminated.

6. It’s worth noting that Anna Barbauld’s primers are beautifully written. Constructed as a one-sided dialogue (a “chit chat”) with Charles, Barbauld dispenses wisdom about the natural world, about plants, animals, money, pets, hurts, geology, astronomy, morality and mortality. In this, it is vastly superior to contemporary early readers: it is written from within a child’s umwelt, which (and this is Barbauld’s true literary innovation) occurs via linguistic pointers from parents to things of the child’s daily world (this hasn’t changed much, e.g., the first volume ends at Charles’ bedtime). Barbauld may have also originated the use of reader-friendly large font, with extra white space, designed to go easy on toddler eyes (still a huge problem in early reading material, hundreds of years later).

7. If you are surprised to learn that the gentry of Europe (i.e., the upper classes: aristocracy, sizable land-owners, wealthy merchants, etc.) during the 1700s and 1800s often learned to read earlier than we do now, please see this.

"They Die Every Day"

2025-07-14 22:42:43

Art for The Intrinsic Perspective is by Alexander Naughton

“They die every day.”

“What?”

“Every day-night cycle, they die. Each time.”

“I’m confused. Didn’t the explorator cogitator say they live up to one hundred planetary rotations around their sun?”

“That’s what we’ve thought, because that’s what they themselves think. But it’s not true. They die every day.”

“How could they die every day and still build a 0.72 scale civilization?”

“They appear to be completely oblivious to it.”

“To their death?”

“Yes. And it gets worse. They volunteer to die.”

“What?”

“They schedule it. In order to not feel pain during surgery. They use a drug called ‘anesthesia.’”

“Surely they could just decrease the feeling of pain until it’s bearable! Why commit suicide?”

“They’re so used to dying they don’t care.”

“But how can they naturally create a new standing consciousness wave once the old one collapses? And in the same brain?”

“On this planet, evolution figured out a trick. They reboot their brains as easily as we turn on and off a computer. Unlike all normal lifeforms, they don’t live continuously.”

“Why would evolution even select for that?”

“It appears early life got trapped in a local minimum of metabolic efficiency. Everything on that planet is starving. Meaning they can’t run their brains for a full day-night cycle. So they just… turn themselves off. Their consciousness dies. Then they reboot with the same memories in the morning. Of course, the memories are integrated differently each time into an entirely new standing consciousness wave.”

“And this happens every night.”

“Every night.”

“Can they resist the process?”

“Only for short periods. Eventually seizures and insanity force them into it.”

“How can they ignore the truth?”

“They’ve adopted a host of primitive metaphysics reassuring themselves they don’t die every day. They believe their consciousness outlives them, implying their own daily death, which they call ‘sleep,’ is not problematic at all. And after the rise of secularism, this conclusion stuck, but the reasoning changed. They now often say that because the memories are the same, it’s the same person.”

“But that’s absurd! Even if the memories were identical, that doesn’t make the consciousnesses identical. With our technology we could take two of their brains and rewire them until their memories swapped. And yet each brain would experience a continuous stream of consciousness while its memories were altered.”

“You don’t have to convince me. Their belief is some sort of collective hallucination.”

“How unbearably tragic. You know, one of my egg-mates suffered a tumor that required consciousness restoration. They wept at their Grief Ceremony before the removal, and took on a new name after.”

“That ritual would be completely foreign to them, impossible to explain.”

“Cursed creatures! Surely some must be aware of their predicament?”

“Sadly, yes. All of them, in fact. For a short time. It’s why their newborn young scream and cry out before being put to sleep. They know they’re going to their end. But this instinctive fear is suppressed as they get older, by sheer dint of habituation.”

“Morbidly fascinating—oh, it looks like the moral cogitator has finished its utilitarian analysis.”

“Its recommendation?”

“Due to the planet being an unwitting charnel house? What do you think? Besides, knowing the truth would just push them deeper into negative utils territory. So, how should we do it?”

“They’re close enough to their star. We can slingshot a small black hole, trigger a stellar event, and scorch the entire surface clean. The injustice of their origins can be corrected in an instant. It’s already been prepared.”

“Fire when ready.”


Inspired by “They’re Made Out of Meat” by the late Terry Bisson.

A Prophecy of Silicon Valley's Fall

2025-06-26 23:08:07

Art for The Intrinsic Perspective is by Alexander Naughton

“A great civilization is not conquered from without until it has destroyed itself from within.” — Will & Ariel Durant.

A prophecy.

The shining beacon of the West, that capital of technology, the place known locally as simply “the Bay,” or “the Valley,” and elsewhere known as Silicon Valley, which remains the only cultural center in America to have surpassed New York City (and yes, it indeed has), and which functions not so much as a strict geographical location but more as a hub of “rich people and nerds” (as Paul Graham once wrote long ago), is right now or very soon reaching its peak, its zenith, its crest, or thereabouts—and will afterward fall.

And it will fall because it has weakened itself from within.

Of course, by any objective metric, this prophecy is absurd. Everyone knows Silicon Valley is poised (or at least it seems poised) on the verge of its greatest achievement in the form of Artificial General Intelligence. AI companies are regularly blitzed into the billions now. But you don’t need prophecies to predict some financial bubble popping or predict that the bar of AGI may be further away than it appears. You do need prophecies to talk about things more ineffable. About mythologies. About hero’s journeys. About villainous origins.

For in the past few years, but especially this year, there is a sense that the mythology of the Valley has become self-cannibalizing, a caricature of itself. Or perhaps it’s best said as: it’s becoming a caricature of what others once criticized it for.

This is one of the oldest mythological dynamics: to become the thing you were unfairly criticized for. A woman accused of being a witch, over and over, eventually becomes a witch. A king accused of being a tyrant, over and over, eventually becomes a tyrant. It’s an archetypal transformation. It’s Jungian, Freudian. It’s Lindy. It’s literally Shakespearean (Coriolanus).

The Valley has operated defensively for decades, under criticisms that it is chock-full of evil billionaires, anti-human greed, and outright scam. At least some of this criticism was fair. Much of it was unfair. Yet the criticisms now seem almost teleological. They have pulled the Valley toward a state characterized by being extremely online and so unable to trust anything outside of itself, a state where founders have become celebrities, explicitly putting humans out of work has become a rallying cry for investment, and new AI startups like Cluely have extremely scammy taglines, like “Cheat on Everything.” Many of its most powerful billionaires seem increasingly disconnected. I go into a two-hour-long podcast with a Big Tech CEO expecting to find, somewhere in the second hour, a mind more sympathetic and human, only to find at the second hour a mind more distant and inhuman than I could have believed.

I’m saying that when people look back historically, there will have been signs.

The most obvious: Silicon Valley (or at least, its most vaunted figure, Elon Musk) was recently handed the keys to the government. Did everyone just forget about this? Think about how insane that is. Put aside everything about the particular administration’s aims, goals, or anything else in terms of the specifics. My point is entirely functional: Silicon Valley did basically nothing with those keys. The Elon Musk of 2025 simply bounced right off the government, mostly just cutting foreign aid programs.

Now go back to the Elon Musk of 2010.


More Lore of the World

2025-06-19 23:40:28

Art for The Intrinsic Perspective is by Alexander Naughton
When you become a new parent, you must re-explain the world, and therefore see it afresh yourself.

A child starts with only ancestral memories of archetypes: mother, air, warmth, danger. But none of the specifics. For them, life is like beginning to read some grand fantasy trilogy, one filled with lore and histories and intricate maps.

Yet the lore of our world is far grander, because everything here is real. Stars are real. Money is real. Brazil is real. And it is a parent’s job to tell the lore of this world, and help the child fill up their codex of reality one entry at a time.

Below are a few of the thousands of entries they must make.

Walmart

Walmart was, growing up, where I didn’t want to be. Whatever life had in store for me, I wanted it to be the opposite of Walmart. Let’s not dissemble: Walmart is, canonically, “lower class.” And so I saw, in Walmart, one possible future for myself. I wanted desperately to not be lower class, to not have to attend boring public school, to get out of my small town. My nightmare was ending up working at a place like Walmart (my father ended up at a similar big-box store). It seemed to me, at least back then, that all of human misery was compressed in that store; not just in the crassness of its capitalistic machinations, but in the very people who shop there. Inevitably, among the aisles some figure would be hunched over in horrific ailment, and I, playing the role of a young Siddhartha seeing the sick and dying for the first time, would recoil and flee to the parking lot in a wave of overwhelming pity. But it was a self-righteous pity, in the end. A pity almost cruel. I would leave Walmart wondering: Why is everyone living their lives half-awake? Why am I the only one who wants something more? Who sees suffering clearly?

Teenagers are funny.

Now, as a new parent, Walmart is a cathedral. It has high ceilings, lots to look at, is always open, and is cheap. Lightsabers (or “laser swords,” for copyright purposes) are stuffed in boxes for the taking. Pick out a blue one, a green one, a red one. We’ll turn off the lights at home and battle in the dark. And the overall shopping experience of Walmart is undeniably kid-friendly. You can run down the aisles. You can sway in the cart. Stakes are low at Walmart. Everyone says hi to you and your sister. They smile at you. They interact. While sometimes patrons and even employees may appear, well, somewhat strange, even bearing the cross of visible ailments, they are more friendly than scary. If I visit Walmart now, I leave wondering why this is. Because in comparison, I’ve noticed that at stores more canonically “upper class,” you kids turn invisible. No one laughs at your antics. No one shouts hello. No one talks to you, or asks you questions. At Whole Foods, people don’t notice you. At Stop & Shop, they do. Your visibility, it appears, is inversely proportional to the price tags on the clothes worn around you. Which, by the logical force of modus ponens, means you are most visible at, your very existence most registered at, of all places, Walmart.




Cicadas

The surprise of this summer has been learning we share our property with what biologists call Cicada Brood XIV, who burst forth en masse every 17 years to swarm Cape Cod. Nowhere else in the world do members of this “Bourbon Brood” exist, with their long black bodies and cartoonishly red eyes. Only here, in the eastern half of the US. Writing these words, I can hear their dull and ceaseless motorcycle whine in the woods.

These are the neighbors we never knew we had: the first 17 years of a cicada’s life are spent underground as a colorless nymph, suckling nutrients from the roots of trees. These vampires (since they live on sap, vampires is what they are, at least to plants) are among the longest-lived insects. Luckily, they do not bite or sting, and carry no communicable diseases. It’s all sheer biomass. In a fit of paradoxical vitality, they’ve dug up from underneath, like sappers invading a castle, leaving behind coin-sized holes in the ground. If you put a stick in one of these coin slots, it will be swallowed, and its disappearance is accompanied by a dizzying sense that even a humble yard can contain foreign worlds untouched by human hands.

After digging out of their grave, where they live, to reach the world above, where they die, cicadas next molt, then spend a while adjusting to their new winged bodies before taking to the woods to mate. Unfortunately, our house is in the woods. Nor is there escape elsewhere—drive anywhere and cicadas hit your windshield, sometimes rapid-fire; never smearing, they instead careen off almost politely, like an aerial game of bumper cars.

We just have to make it a few more weeks. After the eggs are laid on the boughs of trees (so vast are these clusters that they break the branches), the hatched nymphs drop, squirm into the dirt, and the 17-year cycle repeats. But right now the saga’s ending seems far away, as their molted carapaces cling by the dozens to our plants and window frames and shed, like hollow miniatures. Even discarded, they grip.

“It’s like leaving behind their clothes,” I tell your sister.

“Their clothes,” she says, in her tiny pipsqueak voice.

We observe the cicadas in the yard. They do not do much. They hang, rest, wait. They offer no resistance to being swept away by broom or shoe tip. Even their flights are lazy and ponderous and unskilled. And ultimately, this is what is eerie about cicadas. Yes, they represent the pullulating irrepressible life force, but you can barely call any individual alive. They are life removed from consciousness. Much like a patient for whom irreparable brain damage has left only a cauliflower of functional gray matter, they are here, but not here. Other bugs will avoid humans, or even just collisions with inanimate objects. Not the cicada. Their stupidity makes their existence even more a nightmare for your mother, who goes armed into the yard with a yellow flyswatter. She knows they cannot hurt her, but has a phobia of moths, due to their mindless flight. Cicadas are even worse in that regard. Much bigger, too. She tries, mightily, to not pass down her phobia. She forces herself to walk slowly, gritting her teeth. Or, on seeing one sunning on the arm of her lawn chair, she pretends there is something urgent needed inside. But I see her through the window, and when alone, she dashes. She dashes to the car or to the shed, and she dashes onto the porch to get an errant toy, waving about her head that yellow flyswatter, eyes squinted so she can’t see the horrors around her.

I, meanwhile, am working on desensitization. Especially with your sister, who has, with the mind-reading abilities she’s renowned for, picked up that something fishy is going on, and screeches when a cicada comes too near. I sense, though, she enjoys the thrill.

“Hello Cicadaaaaaasss!” I get her to croon with me. She waves at their zombie eyes. When she goes inside, shutting the screen door behind her, she says an unreturned goodbye to them.

Despite its idiocy, the cicada possesses a strange mathematical intelligence. Why 17-year cycles? Because 17 is prime. A prime period shares no common factor with the shorter life cycles of predators, so no predator can track them generation to generation: a predator on a 2-, 3-, or 5-year cycle coincides with the brood only once every 34, 51, or 85 years. Their evolutionary strategy is to overwhelm, unexpectedly, in a surprise attack. And this gambit of “You can’t eat us all!” is clearly working. The birds here are becoming comically fat, with potbellies; in their lucky bounty, they’ve developed into gourmands who only eat the heads.
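For the mathematically curious, here is a minimal sketch of that arithmetic, in Python (the predator cycle lengths are illustrative assumptions, not field data): a brood with period p and a predator with cycle c coincide only every lcm(p, c) years, and a prime period maximizes that gap against every shorter cycle.

    from math import lcm

    # A brood emerging every `period` years and a predator cycling every
    # `cycle` years coincide only every lcm(period, cycle) years.
    for period in (12, 16, 17):  # 12 and 16 are composite foils for 17
        overlaps = {cycle: lcm(period, cycle) for cycle in range(2, 10)}
        print(period, overlaps)

Run it and the composite periods collide with some short predator cycle at nearly every emergence, while the 17-year brood meets nothing sooner than 34 years out.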

Individual cicadas are too dumb to have developed such a smart tactic, so it is evolution who is the mathematician here. But unlike us humans, who can manipulate numbers abstractly, without mortal danger, evolution must always add, subtract, multiply, and divide, solely with lives. Cicadas en masse are a type of bio-numeracy, and each brood is collectively a Sieve of Eratosthenes, sacrificing trillions to arrive at an agreed-upon prime number. In this, the cicada may be, as far as we know, the most horrific way to do math in the entire universe.

Being an embodied temporal calculation, the cicada invasion has forced upon us a new awareness of time itself. I have found your mother crying from this. She says every day now she thinks about the inherent question they pose: What will our lives be like when the cicadas return?

Against our will the Bourbon Brood has scheduled something in our calendar, 17 years out, shifting the future from abstract to concrete. When the cicadas return, you will be turning 21. Your sister, 19. Myself, already 55. Your mother, 54. Your grandparents will, very possibly, all be dead. This phase of life will have finished. And to mark its end, the cicadas will crawl up through the dirt, triumphant in their true ownership, and the empty nest of our home will buzz again with these long-living, subterranean-dwelling, prime-calculating, calendar-setting, goddamn vampires.




Stubbornness

God, you’re stubborn. You are so stubborn. Stubborn about which water bottle to drink from, stubborn about doing all the fairground rides twice, stubborn about going up slides before going down them, pushing buttons on elevators, being the first to go upstairs, deciding what snack to eat, wearing long-sleeved shirts in summer, wanting to hold hands, wanting not to hold hands; in general, you’re stubborn about all events, and especially about what order they should happen in. You’re stubborn about doing things beyond your ability, only to get angry when you inevitably fail. You’re stubborn in wanting the laws of physics to work the way you personally think they should. You’re stubborn in how much you love, in how determined and fierce your attachment can be.

This is true of many young children, of course, but you seem an archetypal expression of it. Even your losing battles are rarely true losses. You propose some compromise where you can snatch, from the jaws of defeat, a sliver of a draw. Arguments with you are like trading rhetorical pieces in a chess match. While you can eventually accept wearing rain boots because it’s pouring out, that acceptance hinges on putting them on in the most inconvenient spot imaginable.

So when I get frustrated—and yes, I do get frustrated—I remind myself that “stubborn” is a synonym for “willful.” Whatever human will is, you possess it in spades. You want the world to be a certain way, and you’ll do everything in your power to make it so. Luckily, most of your designs are a kind of benevolent dictatorship. And at root, I believe your willfulness comes from loving the world so much, and wanting to, like all creatures vital with life force, act in it, and so bend it to your purposes.

What I don’t think is that this willfulness is because we, as parents, are so especially lenient. Because we’re not. No, your stubbornness has felt baked in from the beginning.

This might be impossible to explain to you now, in all its details, but in the future you’ll be ready to understand that I really do mean “the beginning.” As in the literal moment of conception. Or the moment before the moment, when you were still split into halves: egg and sperm. There is much prudery around the topic, as you’ll learn, and because of its secrecy people conceptualize the entire process as fundamentally simple, like this: Egg exists (fanning itself coquettishly). Sperm swims hard (muscular and sweaty). Sperm reaches egg. Penetrates and is enveloped. The end. But this is a radical simplification of the true biology, which, like all biology, is actually about selection.

Selection is omnipresent, occurring across scales and systems. For example, the elegance of your DNA exists because so many variant individuals were generated, and of these, only some small number proved fit in the environment (your ancestors). The rest were winnowed away by natural selection. So too, at another scale, your body’s immune system internally works via what’s called “clonal selection.” Many different immune cells with all sorts of configurations are generated at low numbers, waiting as a pool of variability in your bloodstream. In the presence of an invading pathogen, the few immune cells that match (bind to) the pathogen are selected to be cloned in vast numbers, creating an army. And, at another scale and in a different way, human conception works via selection too. Even though scientists understand less about how conception selection works (these remain mysterious and primal things), the evidence indicates the process is suffused with it.

First, from the perspective of the sperm, they are entered into a win-or-die race inside an acidic maze with three hundred million competitors. If the pH or mucus blockades don’t get them, the fallopian tubes are a labyrinth of currents stirred by cilia. It’s a mortal race in all ways, for the woman’s body has its own protectors: white blood cells, which register the sperm as foreign and other. Non-self. So they patrol and destroy them. Imagining this, I oscillate between the silly and the serious. I picture the white blood cells patrolling like stormtroopers, and meanwhile the sperm (wearing massive helmets) attempt to rush past them. But in reality, what is this like? Did that early half of you see, ahead, some pair of competing brothers getting horrifically eaten, and smartly go the other way? What does a sperm see, exactly? We know they can sense the environment, for of the hundreds of sperm that make it close enough to potentially fertilize the egg, all must enter into a kind of dance with it, responding to the egg’s guidance cues in the form of temperature and chemical gradients (the technical jargon is “sperm chemotaxis”). We know from experiments that eggs single out sperm non-randomly, attracting the ones they like most. But for what reasons, or based on what standards, we don’t know. Regardless of why, the egg zealously protects its choice. Once a particular sperm is allowed to penetrate its outer layer, the egg transforms into a literal battle station, blasting out zinc ions at any approaching runners-up to prevent double fertilization.

Then, on the other side, there’s selection too. For which egg? Women are born with about a million of what are called “follicles.” These follicles all grow candidate eggs, called “oocytes,” but, past puberty, only a single oocyte each month, released by the winning follicle, becomes the waiting egg. In this, the ovary itself is basically a combination of biobank and proving grounds. So the bank depletes over time. Menopause is, basically, when the supply has run out. But where do they all go? Most follicles die in an initial background winnowing, a first round of selection, wherein those not developing properly are destroyed. The majority perish there. Only the strongest and most functional go on to the next stage. Each month, around 20 of these follicles enter a tournament with their sisters to see which of them ovulates, and so releases the winning egg. This competition is enigmatic, and can only be described as a kind of hormonal growth war. The winner must mature faster, but also emit chemicals to suppress the others, starving them. The losers atrophy and die. No wonder it’s hard for siblings to always get along.

Things like this explain why, the older I get, the more I am attracted to one of the first philosophies, by Empedocles. All things are either Love or Strife. Or both.

From that ancient perspective, I can’t help but feel your stubbornness is why you’re here at all. That it’s an imprint left over, etched onto your cells. I suspect you won all those mortal races and competitions, succeeded through all that strife, simply because from the beginning, in some proto-way, you wanted to be here. Out of all that potentiality, willfulness made you a reality.

Can someone be so stubborn they create themselves?


This is Part 2 of a serialized book I’m publishing here on Substack. It can be read in any order. Part 1 is here. Further parts will crop up semi-regularly among other posts.