Creativity, AI, and Why Creatives Lead What Comes Next
Creativity is our lifeblood. AI can recombine the past at speed. It cannot generate the new. From slop and squeeze to sovereignty—how we see it.
Dave and I recently led a discussion at Caldera, a nonprofit in Oregon that runs art and environmental programs for underserved youth from Portland and Central Oregon. Caldera's programs are free. They start in sixth grade and continue into young adulthood. Kids work with professional artists at Caldera's Blue Lake facility in the Cascades, and the place is gorgeous. If you don't know Caldera, go look them up—and please consider supporting their work.
What follows is an expanded version of that evening. Dave's introduction is included here as he delivered it—his framing of the Final Cut Pro story and the Walter Murch / Green Brothers parallel is his, and it sets up the argument I build on. I've extended my own remarks well beyond what we had time for on the night, pulling in research and ideas I've been developing over the past year. The discussion also included Heather Crank, an award-winning motion designer, visual artist, and generative AI practitioner based here in Bend. Heather's work has been shown at the Guggenheim, Meow Wolf, and the SUPERNOVA Animation Festival, and she brought the perspective of a working creative professional navigating the reality of AI in her practice every day. Several of her observations have made their way into this piece, and I've noted where.
Creativity is the lifeblood. We are creative because we are alive. Every living system is a sensemaking system—building models of the world, updating them, seeking novelty, making meaning from mess. This is deep. This is wired in. This is what we are.
I think about creativity very broadly. Scientists working on hard problems are doing some of the most creative work being done right now. Engineers, strategists, mathematicians—creativity runs through all of it. But here I want to focus on creativity as most people think of it. The creative arts. Visual art, music, writing, design, film. Because something is happening to creative professionals right now that matters to all of us, and I want to tell you what we're seeing.
At the Artificiality Institute, we study the human experience of AI. We've gathered over two thousand stories that decode how people work with AI. And the most creative people in our data—even the ones using AI most heavily—are the most emphatic that creativity belongs to them. They love the tools. They use them constantly. And they will fight you if you suggest the tools are doing the creative work.
They're right. And I want to explain why.
OK so what is creativity, precisely? I have been heavily influenced by the work of cognitive scientist Margaret Boden, who died last year at the age of 88. She spent decades on this and identified three kinds. Combinational—mashing existing ideas together in new ways. AI is eating this for lunch. That's what large language models do. That's what image generators do. They recombine at scale and speed that no human can match, and a lot of the output is impressive. Exploratory—pushing the edges of a style or form, seeing how far it stretches. AI is getting better at this too, though it needs a human to tell it which edges matter. And transformational creativity—where you break the rules of a space entirely. You redraw the whole thing. AI cannot do this. AI has no space to break the rules of. It has statistical patterns derived from the past. You can't break rules you don't understand as rules.
So when people say AI is creative, they're partly right. It's very good at the first kind (combinational). It's passable at the second (exploratory). And it's nowhere on the third (transformational). Not yet, and I'll explain why I think that holds even as AI gets more powerful. But the third kind is where fields change. That's where culture actually moves forward. And that's the one that requires the deepest foundations.
Picasso trained classically. He could draw and paint with technical precision that took years to develop. That mastery is what made cubism reachable. He could see what the rules of representation were doing, and he could see how to break them, because he'd spent years working within them. Someone who hadn't done that foundational work couldn't get there. They wouldn't know what they were breaking or why it mattered.
OK so how do new things come into being? There's a key concept in complexity science that is essential here. It's called the adjacent possible. It is mostly used in the context of biological evolution but it is just as relevant in cultural evolution and in creativity. It says that, right now, given everything you know and everything you've made, there's a set of things that are newly reachable for you. Possibilities that exist specifically because of the work you've already done. And every time you reach one of them—every time you build a new skill or make something or solve a problem—the set of what's reachable expands. New possibilities emerge that literally weren't available before. This is cumulative. And it's specific to you, because it depends on your particular accumulation of knowledge and experience and context.
Dave and I have used hip-hop as an example of the adjacent possible for years, because the mechanism is so visible you can trace each step.1
Kool Herc was a DJ in the Bronx in 1973. He had two turntables, crates of funk and soul vinyl, and thousands of hours of listening behind him. He noticed something very specific: dancers responded most intensely during the drum break—that isolated section where the melody drops out and it's just rhythm. This observation was only available to someone standing in that place watching those dancers with that depth of knowledge about those records.
So he used two copies of the same record. One on each turntable. When the break ended on the first, he switched to the same break on the second. He could loop an eight-second break indefinitely. He called it the merry-go-round.
That technique was reachable specifically because of his situation. He had two turntables, two copies, the musical knowledge to identify which breaks hit, and direct observation of what dancers did. Nobody designed turntables for that. So he perceived an affordance the manufacturer never imagined.
Here's how the sequential unlocking works. Grandmaster Flash was in the same Bronx scene, watching what Herc did. Because Herc had normalized putting your hands on the record—something no DJ had done before—Flash could build on that. He developed his quick-mix theory, methods for punching in and out of specific segments with precision. Those techniques were reachable because of what Herc had made available.
Then Grand Wizzard Theodore comes into the story. Theodore was a kid, twelve or thirteen, learning from Flash. He was already in the habit of physically handling records during playback—a practice that existed because of Herc and was refined by Flash. One day he was playing records too loud in his bedroom and his mother started banging on the door. He held the record to pause it. His hand moved the vinyl back and forth under the needle. And he heard something. He spent days experimenting with that sound. That was scratching.
Scratching was only reachable because of everything that came before it. If Herc hadn't normalized manipulating vinyl, if Flash hadn't developed that into precise physical technique, Theodore's hand on that record is just a mistake. Instead he perceived an affordance in an accident because his hands were already trained to be on the record.
And once scratching exists, the turntable is no longer a playback device. It's an instrument. And once the turntable is an instrument, sampling becomes conceptually available—the idea that any recorded sound is raw material. And once sampling is established, production itself becomes composition.
Herc, Flash, Theodore. Each one situated in a specific context with specific knowledge. Each one perceiving affordances the previous generation made available. Each one expanding what was reachable for the person who came after. That's the adjacent possible in human creativity. A sequential expansion of what's reachable, driven by people with the foundation to perceive what was in front of them.
And this is why foundations matter so much. The more work you've done, the more you can perceive. A trained eye sees affordances that an untrained eye walks right past.
Here's why AI can't operate in this landscape on its own. AI learns from training data. Training data is the past. Every pattern, every image, every sentence in those datasets—humans already made it.
The astrobiologist Caleb Scharf describes all of human culture—every book, song, equation, recipe, every piece of graffiti—as a living body of information. He calls it the dataome. And Scharf says something that will change how you think about our information systems and, by extension, AI. He says humans are the mitochondria of the dataome. We're the engine. We're what keeps the whole thing alive and growing.
Now I know—being compared to a cellular organelle is not the most flattering thing. But stay with this for a second because it's one of the most important framings I've encountered. It means every genuinely new idea, every transformational creative act, every time someone reaches an adjacent possible and expands what's reachable—that's new information entering the cultural record. That's the dataome growing. Without us, it stops. It's that simple.
AI feeds on the dataome. It recombines it at extraordinary speed and scale. And the recombination can be stunning. But recombination from the past is recombination from the past. New information enters the dataome from beings who are situated. Who have context. Who are standing somewhere specific with a particular history, perceiving affordances.
Cut off the source of genuinely new information and the dataome starts eating itself. Fill it with material that contains no new information—and I'll come back to this—and you're watching the cultural record degrade.
But the affordances that expand the adjacent possible are not limited to the physical world. A kid who grew up with a screen in their hand perceives affordances in digital environments that I will never perceive. The way a twenty-year-old moves through software—grabbing a function built for one purpose and repurposing it entirely—that's the same kind of perception those Bronx DJs had with turntables. The substrate is digital. The creativity is real. The dataome grows.
And AI has its own affordances that humans can't access directly. AI can hold millions of data points simultaneously and detect structural patterns across that volume. No human can do that. We already see this in mathematics, where AI recently detected a hidden structure—hypercubes—inside permutation groups that mathematicians have studied for fifty years. The structure was always there but no human could perceive it because no human can hold the relevant objects at that scale. The AI operated at that scale, generated output, and the mathematicians recognized what it meant. The AI extended what was perceivable while the mathematicians did the conceiving.
We see early versions of this in our data too—co-authors working with AI affordances right now. The AI surfaces a pattern the human couldn't have perceived at that scale. The human recognizes it means something. The human makes meaning from it.
That's a genuine expansion of the dataome. New information entering the cultural record, coming from situated humans who are reaching further than any previous generation could because the tools are extending their perceptual range. This, I believe, is the wonder of working with AI: it can extend what's perceivable, while we extend what's conceivable.
So this is what we study. How humans actually work with AI. And we've found distinct patterns. We describe them across three dimensions. We call them blending, bonding, and bending.
Blending is how much your thinking merges with the AI's output. How much you absorb its patterns, its framings, its suggestions.
Bonding is deeper. That's how much the collaboration changes your sense of identity. How much it reshapes who you are as a thinker, as a maker.
Bending is the big one. Your willingness to be genuinely changed by the collaboration. To have your sensemaking frameworks evolve. To come out of the work thinking differently than when you went in.
The people doing the strongest work with AI are high on all three. We call them co-authors. Their reasoning and their process are deeply integrated with AI. A lot of them are coders, strategists, knowledge workers. They're producing extraordinary work. And many of them are enjoying it enormously—the satisfaction of working with a powerful tool that extends what you can do is real and it's legitimate.
Two groups break the pattern. Scientists and creatives. In our data—and this is a small sample that skews toward early adopters—scientists tend to keep AI at arm's length because they don't trust it to be accurate. The hallucination problem is real. That said, this is already changing. Mathematics is moving fast toward genuine co-authorship with AI, and other sciences will follow as verification tools improve. The direction is clear even if our data captured an early snapshot.
Creatives keep AI at arm's length for a different reason. They don't trust it with their identity.
They're right. Our data shows that deep collaboration with AI does reshape how you think, what you reach for, and what you consider yours. And identity is everything in this conversation. The muscle—the skill, the practice, the years of work—that's essential. But the muscle serves the identity. Your creative identity is your accumulated sense of what matters. What's worth making. What you're trying to say. It's what determines which affordances you perceive and which ones you walk past. Herc had a specific sense of what a party should feel like, and that's why he heard raw material in a drum break that every other DJ played straight through. Picasso had a point of view about what representation was failing to do.
If AI reshapes what you reach for, if it starts to determine what feels worth making, then your adjacent possibles become its adjacent possibles. Your transformational creativity disappears—because you've lost the perspective that would tell you which rules to break and why. The muscle is still there. But it's serving someone else's sense of what matters. Or worse, no one's.
The co-authors in our data have figured this out. They've built a creative identity strong enough that they can work deeply with AI and still know what's theirs. The collaboration extends their range. The identity came first. The tool came second.
Here's what else we see. Creative professionals raised concerns about younger generations. Nobody asked them; they volunteered it. They said things like: "I worry that younger kids will lose their imaginations because AI can do it all for them."
Across disciplines—visual artists, writers, musicians, designers—they independently used the same words. Crutch. Dependency. Atrophy. Wither. One said what we all instinctively know: "Creativity is a muscle, and if I don't exercise it, it will wither."
The people who know this technology best, who use it every day, who genuinely benefit from it, are telling us that creative skills have to be built first.
OK. Now let me talk about what's actually happening because I refuse to stand up here and pretend this is going well.
First, slop.

People who have no creative foundation are using AI to produce work and settling for good enough. And there is a flood of it. The dataome is filling up with recombined material that contains no new information. Adjacent possibles are contracting because nobody is doing the foundational work that would expand them.
And slop is worse than you think. Go back to what I said at the start—we evolved to seek novelty. That drive is deep and it's wired in. When you encounter something genuinely new, your brain responds. Your attention focuses. Your engagement rises. This response evolved because novelty signals something worth investigating. It might be something dangerous—is that movement in the grass a lion or an antelope? But it also might be something that could expand what's available to you.
Slop hacks that response. AI-generated content is superficially novel but it’s informationally empty. It triggers our evolved sensitivity to the new without delivering anything actually new. It’s like the way processed food hacks our evolved appetite for sugar and fat by delivering a concentration that never existed in nature. Slop delivers a concentration of surface novelty with no informational nutrition underneath.
That's why encountering it feels like a violation. Because it is one. It's exploiting something fundamental about how we're built as sensemaking creatures. Our novelty drive exists to pull us toward genuine new information—toward adjacent possibles that expand what's available. Slop hijacks that drive and delivers nothing. The dataome doesn't grow. Our own cognitive wiring gets played. This is worse than pollution. Pollution degrades the environment but slop degrades us.
Second, squeeze.

AI companies are moving directly into creative tools. Anthropic is aiming to replace Figma. The effect on working designers is brutal. A designer who used to do design is now expected to be copyeditor, graphic designer, brand strategist, all at once. More work. Less money. Smaller and smaller jobs. And the business development still has to happen, for diminishing returns. People are hurting. Real careers. Real livelihoods. This is what happens when the market treats creativity as output—as units of production—and forgets that creativity is a human capacity.2
Look, a lot of people are using these tools and delighting in them. They're making things. They're enjoying it. That's real. The psychologist Mihaly Csikszentmihalyi wrote about this—you can find something creative because it's new to you. That experience of personal novelty is genuine. It feels like creativity because at a personal level it is.
But Csikszentmihalyi also said you have to send your work out to be judged. The domain has to weigh in. Your peers have to look at it. Personal novelty and domain-level creativity are different scales of the same phenomenon. One of them expands the dataome. The other feels wonderful and leaves the dataome exactly where it was.
So what does being trained actually mean?
The ability to see genuine novelty. To know when something is actually new and when it's recombination wearing a new surface. That takes deep knowledge of what already exists in your field. The DJ who heard raw material in a drum break could do that because he knew thousands of records. An untrained ear hears a good beat. A trained ear perceives an affordance.
Aesthetic judgment. Is this any good? That question requires years of looking, making, failing, refining. You build that capacity through practice. There is no shortcut.
And the ability to work with contested success. Creativity is perhaps the one domain where there is no single right answer. A trained creative uses empathy, theory of mind, deep understanding of audience—the whole tradition of process art—to shape the response they want in the people experiencing their work. They are making the object and they are making the experience of the object. That's craft, and it takes years of practice to build.
People settling for good enough from AI are skipping all of this. They can't see genuine novelty because they don't know the field. They can't exercise aesthetic judgment because they haven't built it. And they can't navigate contested success because they've never had to hold an audience's experience and work with it.
Third, sovereignty.

That is what is going to come next.
People make technology their own. Dave tells a story about being on the founding team of Final Cut Pro.
In 1999, Apple released Final Cut Pro, bringing professional editing out of the expensive post-production suite and onto the Mac on your desk. Four years later, Walter Murch edited Cold Mountain on Final Cut Pro and received an Academy Award nomination. Perhaps the most respected editor alive validated this new technology.
But Murch was 60 when he did that. He'd been editing for over 30 years. He brought Final Cut Pro into his world—the world of feature film post-production. He used a new tool to do the thing he already knew how to do.
The real revolution in digital video came from people who were much younger—people like Hank and John Green, who were only 19 and 22 when Final Cut Pro shipped. For people like them, cheap cameras, desktop software, and internet distribution didn't relate to what existed before—they were just there. And instead of using them to make films or TV shows, the Green Brothers invented something entirely new: video-based communities and educational channels that reach hundreds of millions of people.
Murch used the new technology to do the old thing differently. The Green Brothers used it to invent something no one had imagined.
That's the pattern. Professionals stabilize the tool. The next wave makes art with it. We're in the stabilization phase right now with AI. The co-authors in our data are figuring out the workflow, getting the efficiency gains.
The younger generation will lead what comes next. Their adjacent possibles are different from ours. They grew up with digital affordances as native infrastructure. They pick up a tool and find out what it can become, the way Herc picked up turntables. And they're going to discover that AI has affordances of its own, and that a human who can make meaning from what AI surfaces is doing something no previous generation could do.
Creatives will lead this reclamation. They're the ones with the identity. The aesthetic judgment. The capacity to navigate contested success. They're the ones who can bend without breaking because they built the muscle first. They're the ones who can look at what AI surfaces and say: that's genuinely new. Or: that's recombination undercover. Or: that's interesting, and here's what I can make from it that will mean something to other humans.
And that last part—that will mean something to other humans—is everything.
Our species has a capacity no other animal has: shared intentionality. We don't just act in the world. We convey our intentions to each other. We coordinate and build things together that none of us could build alone. Language, culture, science, art—all of it rests on this capacity to share what we mean.
Creativity is social at its root. Making something is only half the act. The other half is conveying it. Showing up with your work and saying: here's what I see. Does this mean something to you? Every creative act is an act of communication between humans. That's how we're built.
Here is my biggest concern right now. The vision of the future where every individual has infinite personalized AI-generated content—that Wall-E future where we're all consuming material made just for us, alone—that is the collapse of everything I've been talking about. Because creativity is social. Meaning is social. If we stop showing up for each other, if we stop sending our work out to be judged, if we stop holding other people's experience and working with it—then the whole thing falls apart. The dataome doesn't just stop growing, it loses its function. Information that isn't shared between humans is noise.
Sovereignty means: I decide what this tool does in my hands. My sensemaking. My adjacent possibles. My transformational leaps. The tool extends what I can perceive. What I conceive—what I build from that perception, what I choose to make and why—that's mine. And the work is always, ultimately, for other people. Because that's what creativity is. It's how we convey meaning to each other.
Creativity is our lifeblood. We evolved to seek the new, to make meaning from it, and to share that meaning with each other. The slop is real. The squeeze is real. The isolation is the one that should worry us most.
And sovereignty—creative sovereignty—is how we come through it.
The hopeful version of this story—and I believe it—is that the way to a better future is with creatives in the lead while the rest of us follow.
1, 2 Heather Crank: thank you for being so raw and honest about the realities of the creative profession right now, while also conveying the enormous opportunity you believe AI will open up for creative professionals.