The Death of Expertise: The Campaign Against Established Knowledge and Why It Matters
by Tom Nichols
Tom Nichols’ The Death of Expertise traces how this rejection of experts has come about: the openness of the internet, the emergence of a customer-satisfaction model in higher education, and the transformation of the news industry into a 24-hour entertainment machine, among other factors. Paradoxically, the increasingly democratic dissemination of information, rather than producing an educated public, has instead created an army of ill-informed and angry citizens who denounce intellectual achievement. When ordinary citizens believe that no one knows more than anyone else, democratic institutions themselves are in danger of falling either to populism or to technocracy or, in the worst case, a combination of both. [edit: book blurb added]
The great irony of The Death of Expertise is that its author lacks some of the necessary expertise. He gets some things right, some things wrong, and some things horribly wrong. Co-authoring with an expert in cognitive psychology would have helped Nichols avoid some of these mistakes. Still, there is much good in this book.
To make the review more manageable, it will be split into three separate posts, with Part I below and the other two to follow as time and other reviews permit. The three posts are:
- The Good: This first part is about the things Nichols gets correct.
- The Bad: This second part is where he commits the same cognitive errors that he just warned us against committing.
- The Ugly: This third part is where he makes those cognitive errors and also omits crucial context from events, making them appear completely different from what actually happened, the better to suit an ideological narrative. These sections are so egregiously wrong they veer into Bill O’Reilly alternate-history territory.
Part I: The Good
Books on the “death of expertise” or the “war on science” all touch upon the cognitive biases that allow humans to fool themselves into believing that which ain’t so, or into disbelieving things that are ideologically inconvenient. For example, if you think people behave more strangely at the full moon, you’ve fallen into the confirmation bias trap.
Another “must” in these types of books is referencing the famous 1999 paper by Justin Kruger and David Dunning, “Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments” (pdf). Basically, “ignorant and unaware of it”. When you can’t recognize your own level of competence, you can’t recognize true competence in experts. This leads people to dismiss an expert’s views in favour of their own profoundly ignorant ones. Google the Dunning-Kruger effect. Wikipedia has a good overview, or if you’re inclined towards a video explanation, “Illusion of Ignorance” is a slightly cheeky 7-minute summary.
That’s not to say “ignorance” is always a pejorative. Nichols references another Dunning article, entitled We Are All Confident Idiots, in which Dunning writes, “Because it’s so easy to judge the idiocy of others, it may be sorely tempting to think this doesn’t apply to you. But the problem of unrecognized ignorance is one that visits us all”.
The problem of unrecognized ignorance afflicts even those who should know better, such as Nobel Prize winners. So many Nobel Prize winners have ventured into pseudoscience that Dr. David Gorski coined a term for it: the Nobel Disease. We need to know when we are moving outside our area of expertise and then behave with the utmost humility. Just because I can correct Nichols when he’s ventured slightly into my area does not mean I can also correct him in his own area of expertise (Soviet-US relations). Even if I “did my research”, I’d be a fool to contradict him. As Nichols says,
Few words in a discussion with a layperson can make an expert’s heart sink like hearing “I’ve done some research.”
Every instance I can recall of someone telling me “I’ve done my research…” was followed by an incorrect statement at the most basic, sometimes grade-school, level. I try to remember how that makes me feel and then try not to inflict the same on someone else.
To illustrate his point that expertise is too easily dismissed by the non-expert, Nichols tells the story of physicist Robert Jastrow and a student who argued that Jastrow was wrong to support Reagan’s space-based missile defence program (nicknamed Star Wars). After a while, the student said, “Your guesses are as good as mine.” Jastrow responded, “No, no, no, my guesses are much, much better than yours.”
Thanks to the hindsight of history, we know Jastrow was wrong about technological feasibility, and the student may have been right. [i] However, Nichols writes,
“An uninformed judgment, even when right, is often less useful than a reasoned view, even when wrong, that can then be dissected, examined, and corrected”.
Experts may not always be right, but they are less likely to be wrong than non-experts. And they’re often wrong in interesting and enlightening ways. The race, he says, may not always be to the swift, but that’s the way to bet.
Another excellent point Nichols makes is how people tend to conflate a simple question with the complexities of its long-term outcomes.
“Will Bashar Assad of Syria use chemical weapons at some point in 2013” is an even bet, like putting a chip on one color in roulette. It’s a yes-or-no question…It’s not the same question as “Why would Bashar Assad use chemical weapons?” and it is light-years away from the dilemma of “What should America do if Bashar Assad uses chemical weapons?” The Internet, however, conflates all three of these questions, and it turns every complicated issue into a poll with a one-click radio button offering a quick solution. [emphasis added]
Keep the above example in mind next time you’re discussing complex issues with friends or online. Are you disagreeing about the yes-no, or one of the related complexities?
At the end of the book, Nichols seems pessimistic that we can actually reverse this celebration of ignorance as a virtue. Still, he offers some advice.
He recommends humility: assume the people writing the stories know more about the subject than you do. I’d add a caveat. In my science field, a journalist may know more about the details of their story than I do, but because they lack the science background, their conclusions or emphases are often skewed. From experience I know to check the journalist’s primary sources to see what the authors actually say, rather than what the journalist claims they say.
Another recommendation is to be ecumenical, that is, to consume media from multiple sources. As a young teen I listened to shortwave radio to pick up Radio Moscow and other stations for their versions of daily events. Now I listen to broadcast stations from around the world and read newspapers from multiple countries. There is beauty and understanding to be found in discovering views different from your own, especially on a global level. My Facebook friends (most of whom I’ve met in real life) are like a United Nations and world-religions (and non-religions) conference.
Nichols is skeptical that his recommendations will help, although he claims to still have faith in the American system. He thinks it will take a tragedy to cure the ignorance, narcissism, and intellectual malaise of the US.
Tragically, I suspect that a possible resolution will lie in a disaster as yet unforeseen. It may be a war or an economic collapse. (Here, I mean a major war that touches America even more deeply than the far-away conflicts fought by brave volunteers, or a real depression, rather than the recession of the early twenty-first century.) It may be in the emergence of an ignorant demagoguery, a process already underway in the United States and Europe, or the rise to power of a technocracy that finally runs out of patience and thus dispenses with voting as anything other than a formality. [Emphasis added]
He points out that Americans shrugged off self-absorption and isolation in 1941, in the trials of Vietnam and Watergate, and again after 9/11, so they can do it again. Each time afterwards, though, they slide back into the hole, sometimes deeper. At some point, they might no longer see daylight, Nichols writes.
Maybe the pandemic is the disaster Nichols predicts will make them shrug off self-absorption. COVID-19 is hitting the US harder than other countries precisely because its leadership dismissed expert advice and fell back on comfortable lies instead. Other countries are joking that their COVID-19 response plan is “the opposite of what the US did”. Several articles have been written on how America will once again recognize the necessity of experts, but lately we’ve seen mass protests of the type that can only happen if the participants dismiss all expert advice. Instead of shrugging off self-absorption to look out for the good of society as a whole, it seems Americans are doubling down on their narcissism.
Nichols seems to be pessimistically hopeful. Within a year or two we’ll see if the hopeful part is warranted. I hope he’s right. I fear he is not.
[i] Even in 1983, physicists were saying Reagan’s Star Wars program would not work because ye cannae change the laws of physics (they may not have worded it exactly that way). Over a thousand signed a letter saying they would not accept funding for a technologically impossible project. Angry at his colleagues, Jastrow and two others formed an ideological think-tank to push the missile defence initiative. As it became apparent the project wasn’t feasible, the think-tank switched priorities and went on to challenge other scientifically accepted findings, such as the dangers of second-hand tobacco smoke, the ozone hole, and climate change. None of the men were experts in any of those fields, yet they dismissed the evidence of the world’s experts because their ideology required it. Their “arguments” against the scientific consensus on these topics often were—and still are—wrong at a high-school level. Being an expert does not make you immune to cognitive pitfalls, and sometimes it is the expert who helps slay expertise.