The Knowledge Illusion: Why We Never Think Alone by Steven Sloman and Philip Fernbach

The authors of this book tackle three central questions: why people are ignorant, why people think they are knowledgeable when they are actually ignorant, and why it matters to recognize that knowledge is communal. They use a combination of scientific studies, philosophy, and real-world examples to address these issues.

The modern world is extremely complex, filled with technologies that make life very convenient. But how much does the average person really understand about these technologies? Oh sure, you know how to push the button on your Keurig machine and brew a quick cup of coffee. But if you opened up the Keurig and dissected it, would you know what each and every part does? Do you really understand how it works? For that matter, do you know how your toilet works? Yes, you can flush the toilet and use it for its practical purpose. But what happens once you push that lever? The average person probably could not give a detailed causal explanation of what happens inside the toilet after a flush, how each component works, and the scientific principles behind it.

People often overestimate their knowledge of how the world works. They think they know how the toilet works, how the Keurig works, or even all the nuances behind complex political policies and social problems. As many studies have shown, people experience what psychologists call the illusion of explanatory depth. One way psychologists have studied this phenomenon is to have people rate, on a scale from 1 to 7, how well they think they understand how something works (a zipper, a toilet, a computer), then ask those same people to describe in detail every step of how the object works, and finally ask them to rate their knowledge a second time on the same scale. Typically, people rate their knowledge lower after being forced to explain how something works and realizing they can't.

The illusion of explanatory depth goes beyond technical knowledge. Similar studies have examined people's positions on controversial policy issues. Participants rated how well they thought they understood a particular political policy and were then asked to generate causal explanations, spelling out step by step how the policy would actually work. When many people realized they couldn't do this, their positions became less extreme. The authors also mention that some studies included a control condition using the same basic procedure, except that instead of explaining in detail how a policy worked, participants were simply asked to give their reasons for holding their position. In those cases, attitudes did not shift.

Humans are causal reasoners. As dual-process theory outlines, we evolved two types of reasoning: fast versus slow, intuitive versus reflective, shallow versus deep. The two types can lead to different conclusions. Asking people to reflect on how something works by producing a detailed causal explanation seems to activate the more reflective type of thinking and forces them to confront their lack of knowledge. Tests such as the Cognitive Reflection Test have shown that some people are naturally more deliberative, reflective thinkers and are less prone to illusions of explanatory depth.

Often it's the very complexity of our knowledge and technology that fuels the problem. The internet is a wonderful resource, giving us access to a great deal of the world's knowledge, but it can also create the illusion that we possess that knowledge ourselves. People often confuse the knowledge in their own heads with the knowledge that exists outside of them; they mistake knowing where to find information for actually possessing it.

Experts are less prone to these mistakes, especially within their own fields. That is, after all, what makes them experts: they have invested enormous amounts of time learning a field's information and its methods of knowing, and they specialize in particular sub-areas. However, the authors note that even experts and scientists are prone to illusions of knowledge, which can lead to catastrophic blunders. One example is the Castle Bravo nuclear test, where scientists underestimated the power of the nuclear reactions involved. Likewise, individual academics, and sometimes whole groups, will fail to accept new ideas that don't conform to previous conceptions. It can take a long time for those new ideas, if they have validity and evidence behind them, to replace the old ones. It is also important to recognize that the financial security of many scientists, archaeologists, historians, theologians, and other academics depends on convincing their fields and the larger world of their importance. In other words, we always need to keep in mind that academics of various stripes aren't completely unbiased, neutral observers. They, too, get a paycheck, usually tied to their expertise. That engineer has reasons to convince you the bridge needs repair; that chemist working for a pharmaceutical company has additional reasons for you to buy the new medicine.

Even while acknowledging these facts, however, the expert remains our best route to accurate knowledge. Experts belong to an entire field, a whole community of other experts who will challenge and test their ideas, sift through their results, and perform their own studies to compare against previous findings. Likewise, the individual expert is not only a master of a particular sub-domain but also understands what is not yet known about that area, or even about other areas within the field.

This leads to the most important idea of the book: knowledge is inherently communal. We can't be experts in everything; there is simply too much to learn in a modern society. Very few people can master even one field, let alone several. Therefore, we need to accept the inevitability of ignorance, not only other people's ignorance but our own. As the authors point out, the issue isn't ignorance itself, which can't be avoided; it's our failure and unwillingness to recognize it. But we don't have to know everything, because we belong to a community of knowledge. There are experts who can answer our questions when they arise, and there are areas in which we ourselves are the experts. We each have a part to play in the community of knowledge.

This has important implications for education. Most people think of education as serving to create independent thinkers or to impart key facts from various important fields of knowledge. Instead, we should become experts in the areas that interest us, develop general critical thinking skills for evaluating areas we don't know much about, and learn how to locate information when we do need it.
