What follows is a book review for a book I haven’t read yet (I just ordered it). I’m reprinting it here not only to point to what seems like a great read, but also because the review itself highlights some of my all-time favorite cognitive biases and thinking errors. While neither the book nor the review highlights how these thinking glitches affect people in the spiritual and psychological growth game (those playing the game and those who think they’re running the game or making up the rules), I’m sure you’ll be able to make the connections yourself.

In fact, as you read about these biases, notice if you think, “I don’t do that one.”

If you have that thought, let me suggest the following: “Yes, actually, you do.”

And if you get offended by *that*, remember, I don’t know who you are! It’s not personal 😉 Oh, and also remember: Self-righteous indignation is usually (like, always) a sign of guilt 😉

I know you do these things because we ALL do these things. You, me, and all the other 112 billion people who’ve ever been on the planet. And one of my favorite cognitive biases is the thought that while every other human being who has ever lived does it… *I* don’t because *I* am special.

So, with that, may I present:

The Matrix of the Brain

a book review by David Ludden
originally published in the eSkeptic Newsletter

In the 1999 science-fantasy film The Matrix, people have been plugged into a giant computer that creates a virtual reality that is both pleasing and plausible. A few renegade humans have unplugged themselves from the Matrix, only to wake up to a miserable underground existence below a war-scorched Earth. In a similar fashion, our brains generate a comforting version of reality that protects us from the desolation and despair of the real world.

We implicitly trust the products of our brains — our perceptions, our memories, our judgments, our sense of self. We say, “I know what I saw,” and we ask, “How could I forget?” After all, if you cannot trust your own brain, who can you trust? But all is not as it seems. More than half a century of cognitive and social psychology research has shown that much of what we see, remember, and think is an illusion. In her new book A Mind of its Own, Cordelia Fine lays out in a highly entertaining fashion the myriad ways in which our vain, immoral, pig-headed brains are constantly deceiving us.

Although we like to think of ourselves as rational beings, our brains covertly strive to create for us a view of the world and of ourselves that is self-serving but not necessarily consistent with reality. Beliefs and opinions are formed quickly and become part of how we define ourselves, so the brain selectively perceives and recalls evidence that supports cherished beliefs while disregarding or forgetting evidence that contradicts them. Fine calls this “motivated skepticism.” We are naturally skeptical of anything that challenges our beliefs, but accepting of anything that bolsters our beliefs, and hence our egos. For example, it is easy for us to mock the tenets of other religions — “How could they possibly believe that?” — while swallowing whole the equally far-fetched teachings of our own church.

Motivated skepticism can even lead to belief polarization, a process whereby counterevidence only strengthens our convictions. The counterevidence is strenuously scrutinized for any weakness, which is then used to diminish the validity of the evidence for our opponent’s point of view. Our selective perceptions are further bolstered by illusory correlations, which are caused largely by selective memory: we remember supporting examples but not counterexamples. For instance, if you already believe the stereotype that all Asians are shy, you will recall only experiences that support this stereotype. When confronted with an assertive Asian, the reaction is likely to be: “Yes, but she grew up in America.” In such a fashion, counterexamples are simply dismissed as aberrations.

Our brains also trick us into believing we have more control over situations than we really do. We blow on dice and perform other rituals to influence events. We also feel safer driving than flying because we think we are in control behind the steering wheel. This is especially true when things turn out in our favor. For instance, we take credit for picking a winning lottery ticket, but blame a losing ticket on bad luck. It would seem that going through life deluded by our own brains would not be a good thing, but that is not necessarily the case. Some people have markedly more balanced self-perceptions than normal people do — they know clearly what their limitations are and how little control they actually have over their lives. They are also clinically depressed; seeing reality for what it is, they become overwhelmed and lose the desire to go on living. So it seems that our brains delude us to keep us happy, healthy, and ready to face life’s challenges. In fact, people who are generally optimistic tend to live longer.

Emotional arousal also plays an important role in cognitive functions. Brain damage can create a mismatch between emotion and rational thought. People who cannot experience arousal during the decision-making process, for example, become incapable of making decisions or consistently make poor choices. It seems that the gut feeling we get when faced with a choice is more important than any rational decision-making process.

The experience of emotion is also integral to our sense of self. In a condition known as Capgras syndrome, patients no longer feel any sort of arousal in the presence of family members, and so they become convinced that their loved ones have been replaced by impostors. Others lose the ability to feel emotion altogether; they also feel detached from themselves and lose all interest in life. Even healthy individuals experience this depersonalization sometimes, particularly during traumatic experiences. Afterward, people report a feeling of detachment from the events around them and even from themselves. This seems to be a coping mechanism of the brain to keep it from becoming overwhelmed.

By three-quarters of the way through the book, the reader is yearning for a return to blissful ignorance, as there seems to be no escape from what Fine calls “our innate lack of scientific rigor.” But still there is hope. Fine advises that we “[t]reat with the greatest suspicion the proof of [our] own eyes.” In other words, we need to trust in the scientific method to lead us out of the tangle of deceptions our brains weave around us. As with any other behavior, modes of thinking can be practiced until they become automatic, and so Fine is hopeful that practice in critical thinking can help guard us against the extravagances of our own brains.

One of the strongest points of this book is the way Fine deftly describes how research is done in psychology. She does not just tell us what is known about how the brain deceives; she explains how we know it. In friendly terms, she presents hypotheses to test, clearly describes how experiments are set up, and shows us how reasonable conclusions are drawn from the data. Thus, she demonstrates how the scientific method can be used to overcome our false beliefs and misconceptions.

Life is pleasant inside the virtual reality of our minds. So what if we think we are more intelligent or virtuous than others and believe we are more in control than we really are? Such minor self-deceptions are, for the most part, harmless, and they may help us to get through the day. But we are not necessarily prisoners of our minds. When the deceptions become harmful to ourselves and others, there is a way out. Science gives us a way to unplug ourselves from the Matrix of our brains.

Click here to order A Mind of its Own: How Your Brain Distorts and Deceives

Comments

2 responses to “We *are* in the Matrix”

  1. how was the book? 🙂

  2. It was great… funny, insightful, and a great overview of the research.

    One of my favorite parts was when Cordelia described a study showing how we are willing to believe things that are 100% completely false… and don’t drop the belief when we’re shown that it’s false!

    BUT, once the people in the study were told that it’s just human nature to hold onto a false belief… THEN, they seemed to drop it!

    So, meta-level understanding is necessary for change… merely arguing an opposing point does nothing.

    Hmmmmmm….