
How Philosophers Think

Philosophers are the most rigorous thinkers I know.

Like intellectual boxers, they come to understand ideas by making them fight with each other. Their style of analysis is effective because it’s so bloody. One friend calls his style “violent thinking.” He talks about thinking the way a soldier talks about interrogation. He subjects ideas to ruthless torture, shaking them and grabbing them by the throat until they can no longer breathe and, eventually, reveal their true nature.

The way he dissects ideas reminds me of something the smartest kid in my middle school class used to do. On weekends, he’d take computers apart and put them back together so he could understand how they worked. He rarely reconstructed them the same way he’d dismantled them, though. For the joy of play and the pursuit of efficiency gains, he searched for new ways to reconfigure the machines. Every now and then, he’d find a performance improvement that even the designers hadn’t considered. But usually, his risks didn’t pan out. Even when he reached a dead end, he always learned something about why computers are made the way they are.

Good philosophers are like my friend from middle school. But instead of playing with computers, they play with ideas. Writing takes them a long time not because they’re finger-happy keyboard warriors, but because they rip ideas apart until they’re left with only the atomic elements. Once an idea has been sufficiently deconstructed, they put it back together, usually in new ways.

That thinking process happens through writing, where we navigate the hazy labyrinth of consciousness. Most roads lead to a dead end. But every now and then, the compass of intuition leads to a revelation that the top-down planning mind would never have discovered. To that end, most of the time a philosopher spends writing doesn’t involve typing. Rather, it’s a form of intellectual exploration—following embryonic ideas and running into roadblock after roadblock on the way to an idea’s mature form.

The point is, you can read all the Wikipedia summaries you want, but they won’t give you a holistic understanding of an idea. That only happens once you have a layered, three-dimensional perspective, which writing helps you achieve. 

Charlie Munger calls this the difference between “real knowledge” and “chauffeur knowledge.” He tells an apocryphal story about Max Planck, who went around Germany giving the same lecture on quantum mechanics after he won the Nobel Prize. His chauffeur, who drove him from talk to talk, heard the speech so many times that he asked Planck if he could give the next lecture himself. Planck said, “Sure.” At first, the lecture went well. But afterwards, a physics professor in the audience asked a follow-up question that stumped the chauffeur. Only Max Planck, who had the background knowledge to support the ideas in the talk, could answer it.

From the chauffeur’s story, we learn that you understand an idea not when you’ve memorized it, but when you know why its specific form was chosen over all the alternatives. Only once you’ve traveled the roads that were earnestly explored but ultimately rejected can you grasp an idea firmly and see it clearly, with all the context that supports it. 

The more pressure people feel to have an opinion on every subject, the more chauffeur knowledge there will be. In that state of intellectual insecurity, people rush to judgment. When they do, they abandon the philosophical mode of thinking. In turn, they become slaves to fashionable ideas and blind to unconscious assumptions. 


Fashionable Ideas 

Since ideas are invisible, people underestimate the extent to which they go in and out of fashion. But ideas are like clothing: they change with the times and reveal how much the actions of others influence our decision-making. And it’s not just the conformists; even counter-cultural styles have consistent tropes. This is how culture acts as an operating system for how we perceive the world: it’s the default we revert to when we don’t think for ourselves. We laugh at the things people wore in the ’70s, but if we could see ideas in photographs, we’d laugh at our own thinking the same way.

The faster you jump to conclusions, the more likely you are to default to fashionable thinking. People who don’t have the tools to reason independently make up their minds by adopting the opinions of prestigious people. When they do, they favor socially rewarded positions over objective accounts of reality.[1] A Harvard anthropologist named Joseph Henrich laid the empirical groundwork for this idea in his book, The Secret of Our Success. In it, he showed that evolution doesn’t prioritize independent thinking. Humanity has succeeded not because of the intelligence of atomic individuals, but because we’ve learned to outsource knowledge to the tribe.

[1] One study found that the people who feel the most authentic are, in fact, the most likely to betray their true nature and conform to socially approved qualities.

In one example, researchers found little difference between chimpanzees and two-and-a-half-year-old children on various tests of mental ability, such as working memory and information processing. Social learning is the glaring exception: humans are such prolific imitators that they copy even the stylistic movements of people they admire, including movements that serve no obvious purpose. Most of this happens outside of conscious awareness. And people don’t just copy the actions of successful individuals; they copy their opinions, too. Henrich calls this the “conformist transmission” of information. All of this suggests that social learning is humanity’s primary advantage over other primates and, in Henrich’s words, “the secret of our success.”

But sometimes, that conformity spirals out of control. Our ideas become as ridiculous as the fashion trends of a bygone era. I suspect the Internet has accelerated the rate at which new ideas become trendy, compounding this risk. Given that, our culture needs people who can reason independently and stand like sturdy steel beams in the winds of social change. They serve as a counterweight to those who default to socially rewarded positions, which often look appetizing on the menu of potential perspectives.

Independence comes at a cost, as multiple religious traditions show. Though Judaism and Islam both allow people to publicly deny their faith during times of persecution, they expect believers to maintain that faith as much as possible in private. But as Duke University professor Timur Kuran has shown, people who practice a religion in secret for long enough usually abandon it. Psychologically, the burden of falsifying your beliefs in public is too heavy to shoulder. That’s when the magnet of culture pulls us in and kidnaps our beliefs.

Along those lines, if there’s anything I’ve learned about marketing, it’s that repetition is indistinguishable from truth. The more people are exposed to an idea, the more likely they are to believe it. The more fashionable it is, the more exposure it’ll receive. But the popularity of an idea doesn’t make it correct. Like the secret menu at In-N-Out Burger, the best options aren’t always advertised. 


Unconscious Assumptions

Philosophers are trained to look for these unconscious assumptions and make them explicit.

By definition, we’re blind to what we can’t see. When looking for answers, we’re like the proverbial drunk who only looks for their keys where the light is shining. This streetlight effect distorts our thinking and limits the ideas we can discover. That’s why philosophers spend as much time studying the spotlight as the ground itself.

Jumping to conclusions limits your ability to discover the truth, because the only conclusions you can jump to are the ones already inside the spotlight. Philosophers know that every idea comes packaged in an implicit frame. Practically speaking, writers like Noam Chomsky argue that the modern limits on speech are implicit, not explicit. By law, you can say just about anything. But that’s not the case in practice. There’s a frame around the range of acceptable opinions, which allows for lively debate only within that range. That’s how thought control happens. The assumptions of a culture determine the aperture of mainstream thinking. Knowing that axioms mold the ultimate shape of an idea, good philosophers tend to critique the premise—the frame—rather than the conclusion.

My essay “Why You’re Christian” demonstrates the process I use to interrogate my ideas. For years, I thought my belief in human rights had nothing to do with religion. That was my unconscious assumption. But when I studied the intellectual underpinnings of the human rights concept, I realized that I’d inherited the idea from the Bible. Only then did I see how the spotlight of my assumptions had warped my worldview. And only through research and conversation did I see how those assumptions acted like an invisible cognitive prison. It wasn’t until I realized I was locked up that I could escape the walls of dogma—or start to. The scary thing was that, like a teenager who wears the same clothes as all the other kids at school, I had unconsciously accepted the intellectual assumptions of my social environment. Only by spending time with orthodox Christians, whom I mostly disagreed with, did I see the myopia of my worldview.

Luckily, I learned an important lesson: when you restrict yourself to one side of the intellectual spectrum, you limit your capacity to find truth.

F. Scott Fitzgerald, who wrote The Great Gatsby, once said: “The test of a first-rate intelligence is the ability to hold two opposed ideas in the mind at the same time, and still retain the ability to function.” That’s what philosophers do so well. By oscillating between radical extremes, they put ideas at war with each other in the name of truth, stretching each one to its logical conclusion. Even if you hold passionate opinions (and the best thinkers I know do), you’re better off spending time with people of intellectual grace, who aren’t afraid to challenge your thinking and reveal your unconscious assumptions.

Intellectual grace is a necessary ingredient because it’s the gateway to the cognitive frontier. You can’t really be honest with people who will think you’re a bad person if you disagree with them. Fear of callous retaliation, however subtle, limits your capacity for truthful conversation. Like the writer who stumbles into all kinds of dead ends, you can’t experiment with new ideas unless you’re willing to take intellectual risks. That’s where intellectual grace comes in. Only when others are charitable and give you the benefit of the doubt can you reason openly toward truth and explore interesting ideas that may turn out to be flawed.

Philosophy, like regular life, is best experienced with an attitude of intellectual grace. “What can this person teach me?” is a much more productive question than “How is this person wrong?” 


Sitting with the Question

It took me two years to unearth the unconscious assumptions I had about human rights. Had I stayed attached to my beliefs, I wouldn’t have discovered the truth. Given how long that discovery took, I have a name for the process: “Sitting with the question.”

These days, people leap to conclusions because they feel pressure to have an instant opinion on every topic. Whenever a new controversy bubbles up in the news, they jump on the intellectual bandwagons of people they want to be affiliated with. They favor group loyalty over independent reasoning. It’s the epitome of tribal thinking. One day, they’ve never thought about the conflict of the day; the next, they’re posting passionate screeds on Instagram as if they’d earned a PhD in the subject. Worse, their flaming infernos of polluted rage are louder than a flushing airplane toilet. Meanwhile, when I talk to actual scholars about their beliefs, they start with the premises of my question instead of the conclusions of their answer. They’re upfront about tradeoffs, too. Ultimately, they say something like: “Well, it’s complicated.”

I’m not saying that you shouldn’t have strong opinions. After all, the world progresses when people with conviction take action, often against the tide of consensus. Skepticism comes at a price, too: I’ve met a number of philosophers who are so skeptical of all claims that they’ve effectively paralyzed themselves. The point is that strong opinions are something you have to earn. You don’t earn them by accumulating every credential you can snag like a shopper on Black Friday, but through rigorous writing and sustained dialogue. You don’t need decades of experience to take a stand on a complicated issue, but you sure as hell need more than 24 hours.

The less time you give yourself to think, the more you’ll settle on socially rewarded points of view. The early research on this idea goes back to Robert Trivers, an evolutionary biologist who did most of his work in the 1970s. He argued that the human brain was designed to deceive itself: we distort information to make ourselves appear better than we actually are. In his book The Folly of Fools, Trivers argues that a contradiction lies at the heart of human intelligence. Our brains are simultaneously designed to seek out information and to destroy it after we acquire it. Our minds evolved to make sense of the world not in ways that are true, but in ways that help us survive. So once information enters our minds, we ignore the critical parts and believe self-serving falsehoods. Often, the more we distort information, the more rational we think we are. And so we applaud ourselves for clear thinking even as we see the world through a prism tainted by self-serving beliefs. Summarizing his work, Trivers once said: “We deceive ourselves the better to deceive others.”

With Trivers here and Henrich before, we see that self-deception is a structural part of human nature. Given the penalties for blasphemy, people are more honest in private forums than in public ones. That’s not necessarily a bad thing. Sometimes, a little bit of lying maintains social cohesion.

Only by understanding our biology can we upgrade our thinking at a societal level. As a culture, we will always condemn radical thinkers, even though our history books celebrate ones like Copernicus and Martin Luther, who both attacked the orthodoxies of their day. The history of Western thought is littered with cautionary tales about the dangers of thinking like a philosopher.


The Dangers of Thinking Like a Philosopher

On warm evenings, when I have some time to wander, I like to walk through the University of Texas campus in Austin. Like many other colleges, it’s built around a magnificent bell tower that looms large in the campus skyline. Inscribed on this one are the words: “Ye shall know the truth and the truth shall make you free.” 

Though it’s an inspiring message, it warrants a caveat: there are consequences to pursuing the truth, especially in public. Since people don’t like it when their ideas are attacked, there are social costs to thinking outside the cultural spotlight. This is one of the oldest lessons in Western philosophy. Socrates was one of the first philosophers to get cancelled. He expressed his views openly, even though it led to accusations of impiety and of corrupting the youth of Athens. Eventually, he was sentenced to death. Even though Plato called him “the wisest and most just of all men,” Socrates couldn’t escape the social punishment for questioning cultural dogmas. Today, philosophers mythologize his death as a way of striving toward a culture that encourages free thinking.

Sorry, but history predicts that many of your foundational beliefs are wrong and will eventually be proven so. Though it’s fun to mock the ideas of people who lived before us, there’s no reason to think our grandchildren won’t laugh at us in turn. Even as history sprints unceasingly toward the future, human nature doesn’t change; only the laws of physics are more predictable. What’s happened in the past will happen in the future—again, again, and again. Given the chance, most people would ignore the lessons of history and punish the next Socrates.

As appealing as it sounds in theory, people are scared to think like a philosopher in practice. Social media has turned so many people into public relations professionals who pursue likeability instead of truth. That’s why people speak along pre-vetted party lines and silence their edgy ideas. First to others, later to themselves. When anything you say online can be instantly surfaced by a Google search, the costs of independent thinking aren’t worth the benefits for most people.

Sometimes, I wonder if the fear of offending others contributes to the popularity of abstract art. These days, it seems like every office building, conference center, and apartment complex has the same abstract art on the walls. None of it says anything. It reflects a culture of spinelessness in which people are afraid to take a stand, fearing the repercussions of saying something bold in public. Driven by this new calculus, some artists conclude that it’s best to speak without saying anything at all. In the words of one person on Twitter: “Anything with form has meaning, and therefore could invite controversy. The world we are bringing into being will exalt the talentless, the spineless, the shapeless, the meaningless.”

If history is any indicator, the social consensus has settled on all kinds of wrongheaded ideas that people are too scared to critique, especially in public. The pressure to have an opinion on every important topic has incentivized lazy thinking, the consequences of which we feel every day. Only by expanding our intellectual aperture and attacking our own ideas can we wage war on the narcotic of cheap, bumper-sticker arguments. As we do, we can identify the intellectual cancers that plague our culture and constrict our worldview. Through the twin principles of reason and rationality, philosophers risk their social credit scores in the short term to improve civilization in the long run.


Acknowledgements

Thanks to Johnathan Bi for inspiring this essay and to Ellen Fishbein for helping me shape the ideas.