Excerpts from Tim Urban's book
As the times get better, they also get more dangerous
When we learn a technology lesson, we tend not to forget it. But wisdom lessons don’t always seem to stick. Unlike technological growth, wisdom seems to oscillate up and down, leading societies to repeat age-old mistakes.
Why do we believe what we believe? Our beliefs make up our perception of reality, drive our behavior, and shape our life stories.
Trust, when assigned wisely, is an efficient knowledge-acquisition trick. If you can trust a person who actually speaks the truth, you can take the knowledge that person worked hard for—either through primary research or indirectly, using their own diligent trust criteria—and “photocopy” it into your own brain. This magical intellectual corner-cutting tool has allowed humanity to accumulate so much collective knowledge over the past 10,000 years that a species of primates can now understand the origins of the universe. But trust assigned wrongly has the opposite effect: when people trust information that isn’t true, they end up with the illusion of knowledge—which is worse than having no knowledge at all.
Confirmation bias is the invisible hand of the Primitive Mind that tries to push you toward confirming your existing beliefs and pull you away from changing your mind. You still gather information, but you may cherry-pick sources.
Living simultaneously in multiple cultures is part of what makes being a human tricky.
People are meant to be respected; ideas are meant to be batted around and picked apart.
The rule “Everyone can do whatever they want, if they have the power to do so” gets turned into a compromise that goes something like this: “Everyone can do whatever they want, as long as it doesn’t harm anyone else.”
If the country is a car, progressives are in charge of the gas pedal and conservatives manage the brakes.
I’ve never been sure if those are objectively the best four Disney movies or if everyone just loves whichever Disney movies came out when they were between the ages of 7 and 12.
There’s also the “inoculation effect,” a term coined by social psychologist William McGuire in 1961. The trick of many of our vaccines is to expose a person’s immune system to a weak version of a dangerous virus. After the body defeats the weak version of the virus, it develops an immunity against all versions of the virus, including the strong ones. McGuire found that people’s beliefs worked in a similar way: being repeatedly exposed to weak arguments for a particular position makes people dismissive of all arguments for that position.
Small individual bias can lead to large collective bias
Depicting both sides as equal when they’re not (aka “bothsidesism”) is not neutral, but is biased toward “presenting the sides as equal.”
Bigotry is at its most dangerous when it goes unrecognized.
As the old saying goes, “autres temps, autres moeurs”: other times, other customs.
While it’s certainly admirable to have been ahead of your time on a moral issue, punishing or disgracing someone for saying or doing something in the past that was prevalent at that time but considered taboo today makes little sense.
Throughout human history, clever opportunists have discovered that if you could control what people say, you could write the story people believed.
If we want to help repair our societies and restore their defenses, we have to fully understand the problem.
Marcus Aurelius once wrote, “If it is not right, do not do it. If it is not true, do not say it.”