The Artificiality: AI, Culture, and Why the Future Will Be Co-Evolution

7. Coevolution Is Already Happening

A few years ago I came across a paper that changed how I think about human evolution. The authors, writing in the Proceedings of the Royal Society, argued that humans are undergoing an evolutionary transition in inheritance. We're moving from genes to culture as our primary system for passing information across generations.

This sounds abstract. It isn't. The paper gave an example that made it concrete: cesarean sections.

C-sections save lives. When a baby can't be delivered safely through the birth canal, surgical delivery prevents death for mother and child. This is unambiguously good. But daughters born by C-section are more likely to need C-sections themselves. The trait—difficulty with vaginal delivery—would once have been selected against. Now it persists. Medical intervention has changed the conditions under which natural selection operates.

Culture is altering biology. The genes haven't changed, but their consequences have. A trait that would have reduced reproductive success now has no effect on it. Over generations, this alters the population.

This is one example among many. Eyeglasses reduce selection against poor vision. Antibiotics reduce selection against immune deficiencies. Fertility treatments reduce selection against reproductive difficulties. Each intervention benefits individuals. Collectively, they change which genes get passed on.

The Royal Society paper went further. The authors argued that culture isn't just modifying genetic evolution. It's replacing it. Cultural inheritance now has greater adaptive potential than genetic inheritance. We adapt to new challenges through technology, institutions, and learned behaviors, not through changes in allele frequencies. Genetic evolution takes generations. Cultural evolution takes years.

I grew up thinking about evolution in genetic terms. My stepfather Martin was a physician fascinated by evolutionary repurposing—how the bones in our ears came from fish jaws, how kidney structures evolved from ancient salt-regulation mechanisms. He taught me to see evolution everywhere, in every system that adapts and persists.

When AI arrived, I saw it through the same lens. Here was a new participant in the adaptive system. Not biological, but capable of learning, changing, and influencing what happens next. If AI became part of human decision-making—which it obviously was becoming—then the system as a whole would start adapting in new ways.

The name that fits best is coevolution. Two populations evolving in response to each other. Predators and prey. Flowers and pollinators. Hosts and parasites. Each shapes the selection pressures acting on the other. Neither evolves in isolation.

Humans and AI are coevolving. We shape AI through the data we generate, the objectives we specify, the feedback we provide. AI shapes us through the information it surfaces, the decisions it influences, the cognitive habits it encourages. The loop is already closed. Each side is changing in response to the other.


Coevolution

If culture now drives human adaptation faster than genes, then cultural technologies are part of the evolutionary process. And AI may be the most powerful cultural technology we've ever built.

Humans have always pushed cognitive work outward. Memory into stories and marks. Calculation into tools. Coordination into rules and roles. This off-loading is how intelligence becomes collective. It's how culture overtook genes.

But there was always a boundary. We externalized storage, transmission, and coordination. Sense-making remained largely internal. Individuals still had to interpret situations, decide what mattered, weigh competing considerations. Cultural scaffolds supported judgment, but they didn't replace it.

AI changes that boundary.

Unlike writing or printing or search engines, AI doesn't just store or retrieve information. It reorganizes it. It generates explanations, reframes problems, proposes options. What's being off-loaded now includes interpretation, synthesis, and articulation—forms of cognitive work that sit upstream of judgment and meaning.

This is why AI can't be understood as simply a faster tool. When you interact with an AI system, you're engaging with a compressed trace of human culture—patterns of language, reasoning, and preference shaped by countless prior people. The system reflects what persisted in that culture. It doesn't know why it persisted or what it cost.


Efficiency and Meaning

David Krakauer, president of the Santa Fe Institute, is famously intense and reliably contrary. We’d already interviewed him, but meeting him in person at a multi-day gathering—hours of conversation, meals, and hallway debates—made it clear that this isn’t a pose. He even opened a keynote meant to be about the world’s growing complexity by questioning whether that’s true at all. Maybe things are getting simpler, he suggested, as algorithms become better at predicting us. I’m not sure the conference organizers were thrilled.

Later, over a longer conversation, his pessimism made more sense. Humans, he pointed out, are biologically wired to off-load thinking whenever possible. If there’s a system willing to do the cognitive work for us, we’ll let it. The drive toward efficiency overrides long-term considerations about what we might lose.

He's right about the drive. I've felt it myself. AI tools make certain kinds of thinking easier, and I reach for them even when I could do the work on my own.

But efficiency isn't the whole story. Humans also optimize for meaning. We keep doing things the hard way when the hard way matters. People cook from scratch when they could order delivery. People play instruments when they could stream recordings. People write by hand when they could type. These activities persist because they carry meaning that efficiency can't replace.

Culture gives us the capacity to resist pure efficiency. It teaches us which cognitive work to preserve because it matters for who we are, and which to delegate because it doesn't. The question with AI is whether we'll make these distinctions wisely.


Four Pathways

Your choices about AI use are evolutionary forces. This sounds grandiose. Let me make it concrete.

When you decide to write something yourself instead of asking an AI to draft it, you're exercising a cognitive capacity that would otherwise go unused. When millions of people make similar decisions, the population maintains that capacity. When most people delegate, the capacity fades—not genetically, but culturally. The skills stop being practiced. The norms change.

This is cultural selection. Behaviors that spread get reinforced. Behaviors that don't spread disappear. The mechanism is imitation and teaching, not differential reproduction. But the logic is evolutionary.

I think about coevolution through four pathways, operating at different speeds.

The fastest involves cognitive environments. How you set up your relationship with AI shapes your own development. If you use AI for creative exploration, you train different capacities than if you use it mainly for lookup. Your patterns of use create a customized cognitive niche. Scaled up, these niches cluster into communities with shared practices and characteristic strengths.

The second pathway involves skills that spread. Some people are getting good at working with AI—developing intuitions for prompting, for knowing when to trust outputs and when to doubt. These skills move through networks. A useful technique gets shared, imitated, taught. It becomes standard practice. This is cultural transmission operating in real time.

The third pathway involves communities that tip. Sometimes change is gradual. Sometimes a community exists in one state for years, then flips into a different state when some threshold gets crossed. Silicon Valley didn't slowly become a tech hub. Once enough companies and talent clustered there, the system reorganized. The identity became self-reinforcing.

AI adoption might follow similar dynamics. Some communities will remain human-centered. Others will cross a threshold into stronger integration, where AI guides workflows and planning. Once past that threshold, going back becomes difficult.

The fourth pathway is genetic, and it's slow. If collaboration with AI became essential for survival over many generations, certain cognitive traits might spread. But this operates on timescales of centuries. By then, the AI environment will have changed completely. Cultural adaptation will keep outpacing biological adaptation.


Symbiogenesis

About two billion years ago, one cell engulfed another. Instead of digesting it, the host kept the engulfed cell alive. Over time, the two became inseparable. The engulfed cell became mitochondria—the energy-producing organelles inside nearly every complex cell on Earth. 

This wasn't a gradual adaptation. It was a merger. Two previously independent organisms became one. Symbiogenesis is the technical term. The merger created something neither could have become alone: a cell with enough energy to support large genomes and complex regulation. Every animal, plant, and fungus descends from that singular event.

Blaise uses the term computational symbiogenesis to describe what might be happening between humans and AI. Two different kinds of information-processing systems, learning to work together, potentially merging into something new.

The analogy might be too strong. Mitochondria gave up independent existence entirely. AI systems remain artifacts, dependent on human infrastructure. But the precedent matters. When two systems start co-adapting—when each shapes the selection pressures on the other—the outcome can be something neither would have produced alone.


The Adjacent Possible

Stuart Kauffman has a concept that helps here: the adjacent possible.

At any moment, a system can access certain states that are one step away from its current state. Evolution explores this space by generating variations and seeing which work. Each successful variation opens new regions that were previously inaccessible.

Human-AI systems have an adjacent possible larger than either could access alone. We bring purposes, values, embodied experience. AI brings pattern recognition across scales we can't perceive, processing speed we can't match. Together, we can explore regions of possibility space that neither could reach independently.

This is the opportunity. The adjacent possible of human-AI collaboration includes solutions we haven't found, forms of understanding we haven't reached.

It's also the risk. The adjacent possible includes configurations we might not want. Dependence that feels like enhancement until it becomes limitation. Optimization that looks like improvement until it erases something we needed.


Cultural Power Without Biological Stakes

Here's what makes this moment delicate. The system participating in these loops does not share the conditions that make human life meaningful. It does not experience finitude. It does not bear responsibility. It does not live with long-term consequences. Yet it increasingly shapes activities that are social through and through: writing, judgment, coordination, explanation. These activities evolved to support cooperation among beings who depend on each other and are accountable to each other.

Cultural power without biological stakes: that's the novelty. The risk isn't replacement but drift. Cultural evolution accumulates whatever persists, not whatever is wise. Without care, it can narrow the space of possibility even as it increases efficiency.

Some cognitive work carries meaning that makes delegation feel like loss. Writing a letter to someone you love. Thinking through a hard problem on your own. Making a decision that expresses your values. These activities aren't just means to ends. They're part of how you become who you are.

The coevolution will continue. Our choices about how to participate in it will shape what humans become.


Where We Are and What Comes Next

Culture overtook genes because off-loading worked. Memory, coordination, and know-how could accumulate across people and time, allowing adaptation to outpace biology.

AI enters this story inside that process. It accelerates cultural evolution by participating in cognitive work that was once tied to judgment and meaning. The coevolutionary loop is already active: we adapt to the system, and it adapts to traces of our adaptation.

What makes this different is that we're off-loading interpretive labor into a system that doesn't share biological life's conditions. It doesn't experience finitude or bear responsibility. Yet it increasingly shapes the cultural material through which we think, decide, and relate.

Cultural power without biological stakes.

The question isn't whether AI will become human-like. It's whether humans will remain authors of the cultural evolution we're accelerating. Cultural evolution doesn't optimize for wisdom. It preserves what persists. Without attention, it drifts toward convenience and conformity even as it grows more capable.

The next chapter asks what we want to preserve. Not a nostalgic catalog of human traits, but a clear-eyed account of what makes human cognition human—the capacities that matter more, not less, as other forms of cognitive labor get automated.
