Our mind is the most advanced computer we know about. It can perform tremendous feats. Yet, it is fooling us a lot more than most of us would care to admit. The reason for this is that the mind takes shortcuts to save energy and speed up our thinking.
In this article I will present how science currently believes the brain works, the problems this causes, and some suggestions about what to do about them.
Our Incredible Mind
Imagine you are riding a bicycle into an intersection. Cars, motorcycles, mopeds and other bikes are coming from all directions. Your brain takes in the whole scene and makes instantaneous decisions about what route to take. You communicate both consciously and unconsciously with the other drivers and you cross the intersection as if it was nothing.
This is an example of what our incredible mind can do. But, in order to do this it takes shortcuts and these shortcuts are not always appropriate. The rest of this article will discuss the problems that occur when the shortcuts are not to our advantage.
What is belief? Why do we believe the things we do? What do we truly know? When we start to really analyze our beliefs we often realize that we don't know why we believe in something, we just do. And, we may also know that something is not correct but still act as if it is.
Can you get a cold from being cold?
No! The only way to get a cold is by being exposed to a cold virus. If you catch a cold after being cold, it is mere coincidence. Yet, many of us tell our children to dress warmly to avoid getting a cold!
We think that our perception is infallible. We think we see what is real! This is not the case: our senses are easily fooled, and they are affected by what we expect to experience.
Which one of A and B is lighter?
It is a trick question: they are both the same color, as we see in this picture. Yet, even when we know this, it is impossible to see!
Can you see anything in this image? Can you see the dalmatian?
If we draw the contour of the dalmatian, it becomes obvious. Now look at the first picture again: can you not see the dalmatian? Our perception is influenced by what we expect to see.
Watch this film and try to count the passes made by the white-dressed basketball players.
Did you get the count right? Did you see the gorilla? In the original study about half of the people that were shown this film didn't see the gorilla! Being focused on one thing can make us completely miss another.
This happens to us all the time in real life. People look at the same situation and interpret it completely differently.
We believe that we remember things as they actually were, but in reality our memories are reconstructed every time we remember something. We fill in new details.
Source and Truth Amnesia
We have a tendency to forget both the source of the facts we know and whether they are true. We remember the facts, but not where they came from or whether they were ever verified!
We may have heard about a correlation between vaccines and autism. But we forgot the minor detail that there is, in fact, no correlation between them at all. Hence, we refuse to vaccinate our kids because we don't want them to become autistic.
Vivid memories, memories involving strong feelings, make us remember things more strongly and make us more confident that our memories are correct. But just because a memory is stronger does not mean it is more accurate. We simply believe in it more.
Memories also fuse together to form new composite memories that may not resemble what really happened at all. Do you remember your tenth birthday, or do you remember what your mom told you and what you have seen in pictures?
We cannot tell if our memories are fake or if they really happened. Everything we remember seems real to us!
Humans are also very good at pattern recognition. This allows us to detect and categorize people, animals, and things. But our pattern recognition also shows us things that are not there. Was there really a dalmatian in the spotted picture above?
Agent detection is an inclination for humans and animals to detect an intelligent agent in situations that may or may not involve one. We see and hear things that aren't there.
We detect a bush blowing in the wind as a person hiding. We see a rope lying on the trail as a dangerous snake.
Our mind processes our perceptions and memories and weaves them into a coherent story: our reality. The story need not be correct; it must only be consistent. To keep the story consistent, our mind makes up whatever details it needs.
In a study of split-brain patients, whose condition prevents the two halves of the brain from communicating with each other, patients were shown one image in each visual field, so that each hemisphere saw only one image.
In the depicted example, one side was shown a chicken foot and the other a snowy landscape. The patient then had to pick related images from a number of other pictures, one with each hand. The patient in the study picked a hen and a snow shovel.
When asked why he picked those images, the verbal side of his brain answered: "I picked the hen because I saw a chicken's foot, and I picked the shovel because I need a shovel to clean out the hen house."
His mind made up a story consistent with the fact that he had a shovel in his other hand.
Our mind can make things up to make our life story consistent!
A bias is a prejudice. A cognitive bias is a type of error in thinking that occurs when we are processing and interpreting information in the world around us.
Cognitive biases are often a result of our attempt to simplify information processing. They are rules of thumb that help us make sense of the world and reach decisions with relative speed.
Unfortunately, these biases sometimes trip us up, leading to poor decisions and bad judgments.
The anchoring effect describes the human tendency to rely too heavily on the first piece of information offered, the anchor, when making decisions.
If I ask a group of people "Do more or less than 20 percent of mammals have four legs?" and then ask the same group to guess the actual percentage of mammals that have four legs, I commonly get a lower estimate than if I had initially asked "Do more or less than 80 percent of mammals have four legs?".
We anchor to the number presented to us. This is the same technique that is used by salesmen when they offer you a good deal of only 20 thousand dollars for the second-hand Volvo.
What percentage of the population do you think is allergic to gluten? How do you go about making such an estimate? What I often do is think about the people around me. How many of them are allergic to gluten? It seems like quite a lot; I would guess about 10 percent of the people I know are allergic, so that is my reply.
This is the availability heuristic at work. Why should my tiny circle of acquaintances be representative of the rest of the world's population? But this information is readily available to me, and it is easier to guess from it than to think the problem through thoroughly.
Fundamental Attribution Error
Say you are walking down the street and you stumble and fall. The common reaction is to make up an excuse for why we fell: a hole in the pavement, etc. It is not my fault; there was a hole in the pavement. Perhaps we even get angry: someone should really fix that!
If someone else stumbles and falls in the same spot, we readily label that person as being clumsy or careless.
We attribute our own mistakes to external causes and others' mistakes to their character. We also give ourselves credit for the good things we do, while attributing other people's good deeds to luck or coincidence. This is the fundamental attribution error.
Hindsight bias is also known as the "I-knew-it-all-along" effect. It is the tendency to see past events as being predictable at the time those events happened. (This picture does not really convey this bias, as the outcome can probably be predicted beforehand :)
An example of this is the 9/11 attacks: once the event had happened, it was easy to find clues that seemed to foretell the coming attack. But clues like these exist all the time for things that never happen; we don't focus on those because they never become relevant.
This is the mother of all biases, a bias that all of us fall into every day! It is the tendency to search for or interpret information in a way that confirms our beliefs, or to notice events that confirm our beliefs while ignoring events that disconfirm them.
Do I put the seat down when I have been on the toilet? All the time, I say. Never, my wife says. How can this be? How can I and my wife come to completely different conclusions from the same data?
The reason is that I notice the times when I remember to put the seat down, since I have to think about it and therefore remember it. The times I forget don't register with me at all.
For my wife it is the exact opposite: she only notices when I forget and doesn't notice when I remember.
When we read an article that we agree with, it is easy to think, "Yes, that is the way it is!" and move on. If we read an article that we don't agree with, we can go to great lengths to examine the "erroneous" arguments to disconfirm them.
The human mind is really bad at working with large numbers and probability.
The gambler's fallacy is the tendency to think that future probabilities are altered by past events, when in reality they are unchanged.
Flip a coin ten times in a row and it comes up tails every time. How likely is it that the next flip is heads? The answer is, of course, 50%. In this scenario most of us know this is correct, but in many other scenarios we tend to think that the other outcome is "due" and hence judge it more likely to occur.
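We can check the coin's indifference to history with a quick simulation. This is just an illustrative Monte Carlo sketch (three tails instead of ten, so the streaks occur often enough to measure), not part of any formal study:

```python
import random

random.seed(42)

# Simulate many fair coin flips; True = heads, False = tails.
flips = [random.random() < 0.5 for _ in range(1_000_000)]

# After a streak of three tails, how often is the NEXT flip heads?
# Independence says it should still be about 50%.
after_streak = [flips[i] for i in range(3, len(flips))
                if not flips[i - 3] and not flips[i - 2] and not flips[i - 1]]

rate = sum(after_streak) / len(after_streak)
print(f"Heads after three tails: {rate:.3f}")  # close to 0.500
```

The streak has no memory: the conditional frequency stays at one half no matter how long a run of tails precedes it.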
What are the odds of one particular person winning a lottery? Not very high, maybe one in a billion, depending on the lottery. But this is often not the right question to ask. We should often ask instead: what are the odds of anyone winning the lottery? It turns out those odds are usually pretty good.
Imagine you dream that someone dies, and when you wake up the next day it has really happened. What are the odds of this happening to you? It must be a miracle! No. The correct question is: what are the odds of this happening to anyone?
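The "anyone" calculation is a one-liner. In this sketch the single-ticket probability and the number of tickets are made-up illustrative values, not figures for any real lottery:

```python
# If each of n independent tickets wins with probability p, then
# P(at least one winner) = 1 - P(nobody wins) = 1 - (1 - p)**n.
p = 1e-9           # one-in-a-billion chance per ticket (assumed)
n = 500_000_000    # tickets in play (assumed)

p_anyone = 1 - (1 - p) ** n
print(f"P(at least one winner) = {p_anyone:.2f}")  # about 0.39
```

A one-in-a-billion event per person becomes an even-money bet once hundreds of millions of people are playing. The same arithmetic applies to the death-dream: multiply a tiny per-person probability by billions of dreamers and nightly "miracles" become a statistical certainty.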
Base Rate Neglect
John is a man who wears Gothic inspired clothing, has long black hair, and listens to death metal. How likely is it that he is a Christian and how likely is it that he is a Satanist?
We have a tendency to answer that it is more likely that he is a Satanist. But, this ignores the base rate. The fact that there are 2 billion Christians and only, maybe, 2 million Satanists. With that base rate in place, it is much more likely that John is a Christian who likes wearing Gothic clothing, has long black hair and listens to death metal.
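To see how strongly the base rate dominates, here is a back-of-the-envelope count in Python. Only the 2 billion / 2 million base rates come from the text; the two "how common is the goth look in each group" percentages are pure assumptions for illustration, deliberately tilted in the Satanists' favor:

```python
# Base rates from the text.
christians = 2_000_000_000
satanists = 2_000_000

# Assumed likelihoods of the goth/death-metal look (illustrative only).
p_look_given_christian = 0.001  # 0.1% of Christians
p_look_given_satanist = 0.50    # 50% of Satanists

goth_christians = christians * p_look_given_christian  # 2,000,000 people
goth_satanists = satanists * p_look_given_satanist     # 1,000,000 people

p_christian = goth_christians / (goth_christians + goth_satanists)
print(f"P(Christian | goth look) = {p_christian:.2f}")  # about 0.67
```

Even when the look is 500 times more common among Satanists, the sheer number of Christians means a random goth-looking John is still more likely to be a Christian.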
The clustering illusion is the tendency to overestimate the importance of small runs, streaks, or clusters in large samples of random data.
The clustering illusion explains the "hot-hand" in basketball. The hot-hand is the belief that a player who has made a few baskets is more likely to make the next basket since he is on-a-roll.
Imagine a disease that 1% of the population has, and assume there is a test that is 99% accurate: 1% false positives and 1% false negatives.
What is the probability that you have the disease if after taking the test it shows positive?
Our natural inclination is to answer, "Bloody sure I have it!". But in reality the probability of having the disease is only 50%. Google it if you don't believe it!
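You don't even have to Google it; counting heads is enough. A sketch in Python, using a hypothetical population of one million people and the rates from the text:

```python
# A disease with 1% prevalence and a test with 1% false positives
# and 1% false negatives, applied to a hypothetical million people.
population = 1_000_000
sick = population * 0.01          # 10,000 sick people
healthy = population - sick       # 990,000 healthy people

true_positives = sick * 0.99      # 9,900 sick people test positive
false_positives = healthy * 0.01  # 9,900 healthy people test positive

p_sick = true_positives / (true_positives + false_positives)
print(f"P(sick | positive test) = {p_sick:.2f}")  # 0.50
```

Because the disease is rare, the small error rate applied to the huge healthy majority produces exactly as many false positives as there are true positives, so a positive result is a coin flip.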
So, we believe things, but we don't know why. Our perception is severely influenced by what we already believe. Our memories are flawed. We see patterns and agents that don't exist. So what? This doesn't apply to us anyway, right?
It turns out that it does. Smarter people are merely better at rationalizing their beliefs than others are. We still make the same mistakes; we are just better at coming up with credible explanations for why our reasoning is not a rationalization.
"I doubt it!" is not only a proper response to what other people say. It is also an appropriate response to our own thought and ideas.
Scientific skepticism holds that science is the best way to find out things about the world and ourselves. Scientific skeptics don't trust claims made by people who reject science or who don't think that science is the best way to learn about the world.
Scientific skeptics don't say that all extraordinary claims are false. A claim isn't false just because it hasn't been proven true.
It's possible pigs can fly, but until we see the evidence we shouldn't give up what science has taught us about pigs and flying.
Thinking about thinking! When you learn new facts, be aware of all the fallacies and biases mentioned in this article. This will help prevent you from making some mistakes.
Bias Blind Spot
The bias blind spot is the cognitive bias of failing to compensate for one's own cognitive biases. Even if we know everything I've written about here, we have a tendency to underestimate our potential for self-deception. To see ourselves as rational beings is the greatest self-deception of all.
The first principle is that you must not fool yourself - and you are the easiest person to fool. -- Richard Feynman
- Wikipedia's list of cognitive biases
- Thinking, Fast and Slow, Daniel Kahneman
- How We Know What Isn't So, Thomas Gilovich
- Fooled by Randomness, Nassim Taleb
- The Drunkard's Walk, Leonard Mlodinow
- Naked Statistics, Charles Wheelan
- The Five Elements of Effective Thinking, Edward B. Burger and Michael Starbird