
Imitate, then Innovate

“Imitate, then Innovate” is my motto for improving at any skill. 

It’s counterintuitive, but the more we imitate others, the faster we can discover our unique style. In the entertainment world, there’s a long lineage of comedians who tried to copy each other, failed, and became great themselves: Johnny Carson tried to copy Jack Benny, but failed and won six Emmy awards. Then, David Letterman tried to copy Johnny Carson, but failed and became one of America’s great television hosts.

Reflecting on his own influences, Conan O’Brien said: “It is our failure to become our perceived ideal that ultimately defines us and makes us unique.” 

Modern creators do the opposite, though. They refuse to imitate others and stubbornly insist on originality, which they hold as their highest virtue — even when it comes at the expense of quality. They might deny their ambition toward originality when you talk to them, but their actions reveal it. In general, creators spend far less time imitating their heroes than they do trying to make something new. I call it the Originality Disease — a pervasive plague that makes creators scared to imitate other people’s styles.  

The problem may be worst among writers, who speak about their craft with levels of mystery that are usually reserved for the numinous. Writers would be smart to learn from other fields, though. 


Quentin Tarantino

Hollywood film directors come to mind because they’re seen as the essence of what creative professionalism looks like. When people look at Quentin Tarantino, they see a mad creative with a singular talent for making original movies. But Tarantino’s originality begins with imitation. He’s famous for replicating and building upon scenes from other movies, and he once said: “I steal from every single movie ever made.” 

Looking at Tarantino’s work, I revel in the paradox that imitation and innovation are not opposed, but operate in tandem. 

I don’t know about you, but I’m a “sit back, grab some popcorn, and enjoy the movie” kinda guy. Movies are pure entertainment for me. A chance to escape the world of responsibilities and enter the trance of a captivating story. I thought everybody was like this until I watched a film with a director who was the total opposite. He was attentive to all the tiny details, from the way the musical score enhanced the film’s emotional journey, to the way light moved across the actors’ faces, to the way camera movements foreshadowed upcoming plot developments.

Listening to him reflect on the film, I had to ask: “Did we even watch the same movie?” I felt like I was stuck in flatland, while he lived in four-dimensional space.1

1

A director friend tells me that in Joker, the musical score helps us empathize with the Arthur Fleck who would eventually become the Joker. The composer Hildur Guðnadóttir used the cello in the opening scenes to create empathy with the protagonist. But then, as the Joker’s dark side and inner turmoil were revealed, the orchestra got louder and louder. The angrier he became, the bigger the orchestra grew. Through it all, the music shaped the audience’s perception of him: simple, naive, and uncool.

From him, I learned that creators consume art differently than consumers. They’re far more intentional in what they consume. Consuming art is productive work for them. Directors watch movies not just to be entertained, but also to see how they’re made. Consciously or not, they’re developing their own mental Pinterest board of ideas to borrow and build upon in their own work. 

George Lucas comes to mind here. To create Star Wars, he went back to the teachings of Joseph Campbell, who spent his career studying mythology and religion. Through his writings, Campbell laid out a theory of the archetypal hero that shows up in all kinds of stories throughout history. Today, it’s known as “The Hero’s Journey.” Chances are, you’re familiar with it. While writing Star Wars, Lucas drew inspiration from Campbell’s most famous book, The Hero with a Thousand Faces. Lucas felt the triad of mythology, folklore, and fairy tales had disappeared in the West — and he wanted his new film to revive it. So he re-wrote his draft of Star Wars to align it with the classical motifs that’d reverberated through so many human cultures. 

Lucas’ artistic originality was enhanced by an imitative respect for Campbell’s work and the recurring themes he discovered. Had Lucas suffered from the Originality Disease that plagues so many contemporary writers, Star Wars wouldn’t be what it is today. 


Originality Disease

Where does this Originality Disease come from? 

I have three explanations. 

The first is pretty clear: misunderstanding inspiration. Some of the juiciest inspiration comes from admiring (and maybe even reverse-engineering) other people’s work. But many people think inspiration needs to strike out of thin air, like a bolt of lightning. They fear the Muses of novelty won’t visit them if their minds are contaminated with what’s been done before, so in blind pursuit of originality, they avoid studying anything that came before them. Rather than standing on the shoulders of those who preceded them, they look within themselves for a breakthrough idea.

The second is more subtle: fetishizing originality. I think this part of the disease comes from academia, where people do study those who’ve come before them—but only so they can do something different. Since scholarly journals insist on original contributions, academics are incentivized to study things nobody else is studying. The challenge, though, is that originality and usefulness are not the same thing. I worry that academics are so focused on checking the “nobody’s ever written about this before” box that they sometimes forget to make useful contributions to human knowledge. 

The third is pure conjecture: self-obsession. Perhaps our Originality Disease has its roots in Freud’s work, which still underpins our model of human psychology. To the extent that ideas like the ego and the subconscious seem trivial, it’s only because they’ve been so influential. Freud’s ideas basically went viral, and as they did, they made their way to Salvador Dalí, who led Europe’s surrealist painting movement. Instead of trying to capture reality like the Realists or interpret it like the Impressionists, the Surrealists went inwards and painted the landscape of their own consciousness. They rejected logic and reason in favor of dream-inspired visions. 

Then, the psychedelic movement of the 1960s may have entrenched us even deeper inside the internal world. As the twin movements of self-love and free expression grew, people rejected the authoritative declarations of Christianity and, in the terrifying wake of World War II, anything that smelled like orthodoxy. Instead of looking outward for answers, people turned inwards. This trend has only accelerated with the rise of therapy and meditation—fields in which many advocates insist that the self knows best and the answers are within us. We think of ourselves as tiny little islands. This is a uniquely contemporary mindset. As the pastor at a local church said to me recently, the early Christians believed that our lives were porous. Rather than buffering themselves from exterior influences, they welcomed the divine in their lives and let it grow in them like a lush springtime flower.

It is said that art reflects the spirit of the times. Perhaps the thinking of our secular age has infiltrated our creative mindset too. In the world of art, paintings are increasingly abstract in order to reflect the subjective experience of the artists who made them. So often, they aim to capture attention with shock value instead of quality. This narcissistic self-worship has people rejecting the canon. As we’ve turned inwards for inspiration, we’ve turned originality into our cardinal virtue. 

The alternative is a pursuit of truth. In the words of C.S. Lewis, who is famous for the vivid imagination he presented in stories like the Chronicles of Narnia: “No man who cares about originality will ever be original. It’s the man who’s only thinking about doing a good job or telling the truth who becomes really original—and doesn’t notice it.”

Lewis’ words align with the premise of Roger Scruton’s excellent documentary Why Beauty Matters.2 He opens it with the idea that before the 20th century, if you asked anybody about the purpose of creating art, they would have said: “To make something beautiful.” But beginning around the time of Duchamp’s urinal in 1917, the primary purpose of art shifted from beauty to shocking originality. To capture attention, artists disturbed their viewers and violated taboos. Rather than pursuing the lofty goal of objective beauty, we’ve turned inward, and the results have been terribly ugly. Scruton insists that our language, music, and manners have become increasingly offensive and self-centered too. He blames the decline in beauty on the self-centeredness of modernity and a nonsensical break from tradition.

Marcel Duchamp, Fountain (1917; replica 1964). Tate: http://www.tate.org.uk/art/work/T07573

Don’t think I’m advocating for stasis though. I like watching humanity innovate, and valuing tradition too much can limit progress. Here, the artists of ancient Egypt come to mind. They trained to work in a rigid style, and their apprenticeships were only complete once they could cut images and symbols clearly in stone, according to the established rules. As Ernst Gombrich wrote in The Story of Art: “No one wanted anything different, no one asked him to be ‘original’. On the contrary, he was probably considered the best artist who could make his statues most like the admired monuments of the past.” Driven by a philosophy of imitation without innovation, Egyptian art stagnated. A thousand years on, new works that closely resembled the ancestral ones were praised just as highly. There was little change, little progress, and little development.

Doing the opposite of the Egyptians doesn’t work either. Innovation without imitation is a fool’s strategy. Just look at the historical examples. Einstein’s paradigm-shifting theory of general relativity was enabled by decades of studying classical physicists, whose ideas he later built upon. Many of the most original musicians have spent hours practicing scales and imitating the techniques of those they admire. In the world of writing, Hunter S. Thompson once retyped every word of The Great Gatsby so he could feel what it was like to write a great novel. Robert Louis Stevenson meticulously copied paragraphs he enjoyed, and once he was familiar with them, he threw the books across the room to force himself to rewrite the paragraphs from memory.

These days, writers are hesitant to promote these imitative strategies. Perhaps their aversion comes from how gung-ho schools are about the dreaded P-Word: plagiarism. The fear is injected into us at school, where we’re taught to shun anything that smells like imitation. Plagiarism was punished so heavily that I imagined it like an electric chair that promised instant expulsion. 

Of course, plagiarism is wrong. The problem is that our tormented fear of plagiarism has clenched its claws around things that are actually good for us. Out of excessive trepidation, we’ve lost touch with the subtle but important distinction between stealing other people’s work without giving them credit (which is obviously bad) and mirroring the style or values of a writer you admire (which should be praised and promoted).


What Happened to Imitation? 

The etymology of the word “imitate” is one of my favorites. In Shakespeare’s time, the word “ape” meant both “primate” and “imitate.” Perhaps that double meaning indicates that imitation is core to who we are.3

3

That’s one reason why Christianity, the world’s largest religion, tells us to imitate another human being: Jesus Christ.

Throughout human history, most imitative learning happened through apprenticeships. Leonardo da Vinci comes to mind. He had almost no formal schooling, but at the age of 14 he secured his first apprenticeship, in the studio of the sculptor Andrea del Verrocchio. Time there led him to study math, anatomy, drawing techniques, and the beauty of geometry. In true “Imitate, then Innovate” fashion, there were obvious similarities between what da Vinci observed as an apprentice and what he’d later produce on his own. One of Verrocchio’s most famous sculptures was of the young warrior David standing in triumph over Goliath. (Some scholars think the master-apprentice relationship was so close that da Vinci posed for Verrocchio’s David.) Given that, we shouldn’t be surprised that it resembled the David da Vinci painted later in his career, or that it likely seeded his fascination with Michelangelo’s statue of David.

Head from Verrocchio’s statue of David

Today, things have changed. We’ve dropped apprenticeships in the name of efficiency. Instead of apprenticing, wannabe da Vincis train at professional art schools. With the decline of apprenticeships came the decline of imitative learning. The twin rise of the printing press and, later, mass schooling led us to disproportionately value knowledge that could be codified in textbooks. What humanity gained in its ability to scale the transmission of facts, it lost in the transmission of technique: tacit knowledge was lost in translation.

To see what I mean, look at writing education. English class teaches you how to be a good editor because it’s the easiest thing to systematize and communicate in textbooks. That’s why we learn grammar and syntax. Though they’re useful, they don’t have much to say about the deeper but indescribable aspects of writing: idea generation, how to recognize the shape of an interesting argument, honing your style, overcoming the fear of criticism, and fighting writer’s block. Common pieces of advice like “get rid of the passive voice” are shallow, low-leverage points of instruction compared to the real, but hard-to-describe benefits of creating your own vocabulary or becoming a more observant person — both of which are better learned through imitation. 

What kinds of skills are best trained through imitation? 

The harder it is to put the core knowledge into words, the more a skill should be developed through imitative learning. Often, these skills have a bunch of subtleties that are best learned in conversation with a master, or by watching them do their work. Cooking is the ultimate example. Though I’ve bought a few cookbooks from my favorite steakhouses, the char on the outside of my ribeyes is never quite as perfect and the meat is never quite as juicy. Following a recipe can make you a great dinner host, but it won’t turn you into Gordon Ramsay. Though every chef should know the basic science of food preparation — such as the four elements of good cooking (salt, fat, acid, heat) and the five basic tastes (sweet, sour, salty, bitter, and umami) — the highest levels of cooking are driven by supple rules of thumb that are best acquired through the apprenticeship model that every high-end chef goes through. Knowing the science of cooking can take chefs to the top 10% of their craft, but knowing the art takes them to the top 1%. This is because rigid frameworks are too strict for the contoured nature of reality in the kitchen. 

Creative work is similar. The difference, though, is that you can imitate the end product of creative work in a way you can’t with cooking. This is why reading a lot of good writing is among the best ways to become a good writer. Even if the principles of effective writing are hard to communicate, reading a lot hones your intuition for what quality writing feels like. 

For knowing when to embrace imitative learning, I’ll throw one more principle into the pot: the more you’re drawn to learning the skill on YouTube, the more it’ll benefit from imitative learning. “YouTube skills” tend to be more bodily than intellectual. They’re hard to describe in words. Nobody learns to dance by reading a textbook. Instead they watch videos they can copy, emulate the people they see moving on their screen, and if they’re serious, work with a coach who can provide instant feedback. Likewise, people often say that books about writing are categorically bad — and they’re right. It’s not because the writers are bad, but because when you try to distill the deepest parts of writing into text, the ideas become so reductive that they stop being useful. 

Improving creative education begins with retrieving the benefits of apprenticeships. When you imitate somebody’s work, you’re forced to think about why they made the decisions they made. Through consumption and creation, you weave the threads of other people’s work into a tapestry of your own. 


Lessons from Painting

They say that social situations will reveal who you are because personality is relative and you don’t have anybody to compare yourself to when you’re alone. I like how the poet John O’Donohue put it when he wrote: “In the presence of the other, you begin to see who you are in how they reflect you back to yourself.” 

Imitation helps us discover our creative personalities because it reveals our taste and which parts of the creative process come most naturally to us. This is what writers mean when they say they’re trying to find their “voice.” 

Through imitation, you can create your own apprenticeship. I know a painting coach who tells her students to listen for resistance in the imitation process. She says that your authentic artistic voice shines in the delta between your own style and the style of the painter you want to emulate. 

One of my favorite parts of visiting art museums is watching up-and-coming artists sketch in the gallery rooms. I always try to talk to a few of them because the act of imitation makes them think so deeply about the art before their eyes. Through conversation, I can pick up their observations and integrate them into my own memory bank. During a trip to the British Museum, I once met an aspiring painter named Finley. He was sketching some ancient Greek statues, aiming to magnify the sense of movement in them. With each black line and each smudge of shadow, he was forced to consider why the sculptor shaped the marble the way he did. Though these statues were millennia old, Finley was engaged in a spirited conversation with the ghost of a bygone craftsman.  

It’s no coincidence that many professional writers are trained in painting. Learning to see as a painter is among the best things you can do to become a more articulate writer. Both skills require a keen sense of observation. Where painters aim to illuminate the world with color and shadow, writers aim to illuminate it with words and metaphors. Both activities are acts of composition and, sometimes, the selective withholding of information. Writing and painting share essential properties that manifest in wildly different ways — like the electromagnetic spectrum, where visible light, X-rays, ultraviolet, and infrared waves are simply different ways of capturing the same reality. 

The most surprising part of sketching is seeing all the little details in a scene that only occur to you after you’ve been observing it for a few hours. To write and to paint is to learn how to see. Maybe that’s why David McCullough, a trained painter and arguably America’s greatest biographer, once said: “Insight comes, more often than not, from looking at what’s been on the table all along, in front of everybody, rather than discovering something new… That’s Dickens’ great admonition to all writers, ‘Make me see.’” 

If painters get so much out of imitating work that resonates with them, maybe writers should take a page out of their playbook and do the same. 


The Two Kinds of Imitation

There are two kinds of imitation: near imitation and far imitation. 

When most people think of imitation, they think of near imitation. This is when you imitate people who do similar work to you. It’s what Hunter S. Thompson was doing when he retyped every word of The Great Gatsby, it’s what musicians do when they practice their scales, and it’s what Kobe Bryant was doing when he studied and adopted the moves of history’s greatest basketball players. In fact, Kobe once said: “I seriously have stolen all my moves from the greatest players.” 

But far imitations—transferring ideas from one domain to another—can be just as useful.4 Sigmund Freud’s The Interpretation of Dreams is known as one of the most original works of psychology ever created, but most people aren’t aware of how much he pulled from Nietzsche. Concepts like repression, instinctual drives, the unconscious mind, and the symbolism of dreams are rooted in Nietzsche’s work.

4

The discipline of philosophy has an ethos of studying the canon. Just about every serious philosopher spends a decade reading the core works of philosophy, and many of them spend years interpreting the work of a single scholar they admire.

The premise of my essay about Peter Thiel is that his investment approach is really the practical application of Rene Girard’s philosophy. You can do something similar. Like Thiel, you can be innovative in what you choose to imitate. As for where you should look, here’s a clue: Much of the future originates in art before it becomes our reality. This is why the world of technology holds a close ear to the world of science fiction. Steve Jobs famously pulled from Star Trek to design the iPad, and early concepts for FaceTime appeared in 2001: A Space Odyssey. 

Sometimes it isn’t so simple though. As a general rule, the closer you are to the frontier, the more your intellectual breakthroughs will come from far imitations. The best intellectual breakthroughs I’ve had for the cutting edge of education have come from people outside my industry. I’ve looked as far as live electronic music, the history of Christianity, and Alcoholics Anonymous. And it isn’t just me. One of the people I admire most in the education space spent years studying cults to codify how they create such a strong sense of mission and fraternity. Likewise, while many experts doubted the possibility of manned heavier-than-air flight, the Wright brothers studied a book called Animal Locomotion and Etienne-Jules Marey’s Bird in Flight image. Unlike their contemporaries, they weren’t so exclusively focused on what’d been published by people who were working on similar projects.5

5

Until then, photographs captured only a single moment in time. To solve the problem of capturing movement, Marey designed a process for taking multiple photos in succession and putting them all on a single page. With the photos, people could appreciate motion in ways that were invisible to the naked eye. In accordance with our Imitate, then Innovate theme, the photo was original because it solved a pressing problem — not because Marey set out to create an original image.

Etienne-Jules Marey’s Bird in Flight: multiple photos taken in succession, composed on a single page.

Here, I’d like to emphasize an important point. Imitation doesn’t mean you should become just like everybody (or even somebody) else. When people mistake copying for imitation, we end up with a homogeneity of style that robs society of dynamic individualism. These days, patterns of blind imitation are most evident in the design industry, where it seems like every Fortune 500 company is using the same bland caricature drawings.6 

6

A friend tells me that these drawings are popular because they allow anybody from any demographic to see themselves in the avatar. Since they’re abstract, it’s easier for people to see themselves in these avatars than in a traditional model photo.

These cartoon drawings are everywhere.

The problem gets worse in tightly networked environments, where people make art to impress their peers. When every up-and-comer in an industry goes through the same insular professional studies program, they tend to adopt the same general style. What we gain in scalable education, we lose in unique output. In a piece called How the MFA Swallowed Literature, Erik Hoel argues that contemporary novels share the same beliefs about how society should function and the same minimalist style. Contrast them with history’s best writers, many of whom weren’t classically trained: William Faulkner didn’t finish high school, Dostoevsky had an engineering degree, and Virginia Woolf was mostly homeschooled.

The fields of design and writing point towards the same truth: if we want to be innovative, we should be more original in who we choose to imitate. 

With social media, the effects of imitation are even more pernicious. The incentives of every platform create a homogenous creative style. Twitter rewards short sentences and bold, aggressive statements. YouTube rewards fast-paced and high-energy videos that are nothing like your average movie. Off the Internet, the abundance of audio and video content is reducing the diversity of regional accents. One study of Texans found that as recently as the 1980s, 80% of people interviewed spoke with a Texas twang. Today, that number’s fallen to a third. The authors blame our media environment: “The uniquely Texas manner of speech is being displaced and modified by General American English, the generic, Midwestern dialect often heard on television.”

An easy way to improve who you imitate is to escape the Never-Ending Now by diversifying your inputs, escaping your industry, and reading more about history and less about the present. 

The innovators I’ve learned the most from combine near imitations with far ones. Patrick Collison comes to mind. I once asked him why, if Stripe is so innovative, they haven’t experimented with a radically different kind of corporate structure. He told me that companies shouldn’t be innovative for the sake of being innovative. Sometimes, it’s best to take the standard route so you can learn from a lineage of resources whenever you run into a problem. But imitating best practices for the majority of what you do doesn’t mean you can’t innovate when you need to. Stripe has a famously transparent way of sharing information internally; it has launched a publishing company and a climate initiative, and it was one of the first big tech companies to go all-in on remote work.7

7

Beyond Stripe, on the science front, Patrick told me that he and the other Arc cofounders were comfortable launching Arc Institute only once they’d studied the major biomedical institutions — LMB, Broad, HHMI, Allen, CZB — and major research universities.

In summation, Patrick said: “Stripe’s orientation is probably a combination of Chesterton’s fence instincts, a disinclination from being original absent strong arguments, and a curiosity about past solutions.”

Silicon Valley is a fairly ahistorical place. For all their fluency with current events and global history, the techies are remarkably unaware of the history of their own industry. It’s as if they’re so focused on the future that they ignore the past. Patrick bucks that trend with his interest in early Silicon Valley institutions like ARPA and Xerox PARC, and industry pioneers like Alan Kay, Stewart Brand, and Douglas Engelbart. Speaking to a Stanford class about Engelbart, Patrick said: “So much of this early work is better than what we have today… Sure, we have solved all kinds of scaling issues, deployments, technology — but the core problems of helping people work together were thought about more back then.” 

Outside of the technology industry, he’s studied the dynamics of hyper-productive intellectual cultures: the Scottish Enlightenment, which birthed a cadre of fierce scholars like Adam Smith and David Hume, and the Renaissance, much of which was centered in a single city and funded with the rough equivalent of $500 million in today’s dollars. In the grand scheme of things, that’s not a lot of money. It’s less than 10% of Stanford’s annual operating budget and roughly 5% of San Francisco’s. 

Deconstructing the dynamics of these world-changing cultures initiates the “Imitate, then Innovate” method. Weirdly enough, Stripe is an original company because its founder doesn’t insist so much on being original. 


How to Pursue Originality

Observing Stripe has reminded me that originality is only useful insofar as it serves a higher end. In business, a lack of originality hints that you don’t understand the problem you’re trying to solve well enough. Once you understand a problem deeply enough, the solution becomes fairly obvious. The same industries that look like neat and organized machinery from the outside are actually duct-taped together. They’re riddled with all kinds of issues that are waiting for an entrepreneur to swoop in like Spider-Man and solve. 

Writing is the same way. Through teaching, I’ve discovered that the surest sign of an amateur writer is somebody who values originality as their ultimate goal — when they should value quality, beauty, or clear communication instead.

As it turns out, catching the Originality Disease isn’t a good way to become original. Rarely is a creator so stuck as when they feel like their ideas aren’t original enough. The mere presence of this thought can send them into a downward spiral, because pursuing originality too directly leads to its exact opposite. Those who hold originality as their highest virtue are bound to either get stuck or create nothing of substance. 

Better to imitate, then innovate instead.


Thanks to Ellen Fishbein, Patrick Collison, Pranav Mutatkar, Johnathan Bi, Tiago Forte, Michael Dean, Tyler Cowen, and Will Mannon for the conversations that shaped this essay.