Kluge: The Haphazard Construction of the Human Mind

Kluge is a readable book written by psychologist Gary Marcus for a popular audience.  The way it is written makes it a good choice for outlining the biases and other shortcomings of our minds. Of course, a kluge is old computer slang for a patch, something inelegant that still works.  Possibly, I should contrast Kluge with In Pursuit of Elegance by Matthew May.  Later.  Marcus starts with memory.

We have what Marcus calls a contextual memory, as opposed to a postal-code memory like a hard drive’s.  We pull things out of memory using contextual cues.  We give hints to our brain and see what it comes up with.  Some of those cues are tied to the environment where the memory was created, so if you are trying to remember the genus and species of a plant that you looked up in the garden, then the garden is the place to go to remember it.
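
To make the hard-drive analogy concrete, here is a minimal Python sketch of my own (the names and data are invented, not Marcus’s): a dictionary keyed by location stands in for postal-code storage, while the recall function returns whichever stored trace best matches the cues currently on hand, which is why being back in the garden helps and sitting at a desk does not.

```python
# A toy sketch of the contrast Marcus draws (my own example, not from the book).

# "Postal code" memory: every item sits at a fixed address, like a file on a
# hard drive, so retrieval by that address always works.
postal_memory = {"sector_42": "Rosmarinus officinalis"}
print(postal_memory["sector_42"])  # -> Rosmarinus officinalis, every time

# Contextual memory: each trace is stored with the cues present when it formed,
# and recall depends on how well the current context overlaps those cues.
contextual_memory = [
    {"cues": {"garden", "herb", "fragrant"}, "fact": "Rosmarinus officinalis"},
    {"cues": {"kitchen", "herb", "soup"}, "fact": "Ocimum basilicum"},
]

def recall(current_cues):
    """Return the stored fact whose cues best overlap the current context."""
    best = max(contextual_memory, key=lambda m: len(m["cues"] & current_cues))
    return best["fact"] if best["cues"] & current_cues else None

print(recall({"garden", "fragrant"}))   # back in the garden, the name comes back
print(recall({"office", "deadline"}))   # wrong context, nothing surfaces -> None
```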

Context-dependent memory does have its strengths.  It prioritizes by bringing most quickly to mind things that are common, things that we’ve needed recently, and things that have previously been relevant in situations that are similar to our current circumstances.  It also seems to work in parallel, searching several threads at a time, to compensate for those rather slow neurons.  The weakness of context-dependent memory is reliability.

Anchoring our memories in terms of context and cues, rather than specific pre-identified locations, leads to another problem: our memories often blur together.  This can lead to false memories and makes us vulnerable to being led by the most recent context.  Remembering the gist rather than the detail is bad enough, but we often think we remember the detail.  This is not good in the courtroom.

Every time we access a memory, it becomes “labile,” subject to change, and this seems to be true even for memories that seem especially important and firmly established.

There are things that can be done to improve our poor memories. Take, for example, the ancient “method of loci”.  If you have a long list of words to remember, you can associate each one with a specific room in a familiar large building: the first word with the vestibule, the second word with the living room, the third word with the dining room, the fourth with the kitchen, and so forth. There are also rhymes and repetitive rote memorization.  Still, most of us recognize the weaknesses of our memories and will even admit to them, while we are unlikely to admit any weakness in our judgment.

Marcus includes a quotation from Steven Pinker: “To a very great extent, our memories are ourselves.”  This might be a good reason to have not-so-good memories. Nevertheless, I agree with Marcus’s conclusion:  “To build a truly reliable memory, fit for the requirements of human deliberate reasoning, evolution would have had to start over. And, despite its power and elegance, that’s the one thing evolution just can’t do.”

Marcus moves on to belief. He notes that as a species, we are only too ready to be fooled. All told, though the systems that underlie our capacity for belief are powerful, they are also subject to superstition, manipulation, and fallacy.

If we have positive feelings toward a given person in one respect, we tend to automatically generalize that positive regard to other traits, an illustration of what is known in psychology as the “halo effect.” The opposite also applies.  This is, as noted by Malcolm Gladwell, probably why Warren Harding was elected president. Another phenomenon, called the “focusing illusion,” points out that we can manipulate people simply by directing their attention to one bit of information or another before asking them a question. A similar phenomenon has come to be known as “anchoring and adjustment.”  This is the one where you add 400 to the last three digits of your cell phone number and then answer: in what year did Attila the Hun’s rampage come to an end?  On average, you will guess a later year if your cell phone number is larger.

We also like the familiar.  For example, people tend to prefer social policies that are already in place to those that are not, even if no well-rounded data prove that the current policies are working. Rather than calculate costs and benefits, people often use this simple heuristic: “If it’s in place, it must be working.”

Marcus notes that our thinking can be divided into two streams, one that is fast, automatic, and largely unconscious, and another that is slow, deliberate, and judicious. The deliberative system is much newer and more or less added on to the older reflexive system.  As evolution layered reason on top of contextually driven memory, it left us with the illusion of objectivity. We feel as if our beliefs are based on cold, hard facts, but often they are based on whatever that old system left us with.  Haidt’s elephant with a rider might sum it up better.

No matter what we humans think about, we tend to pay more attention to stuff that fits in with our beliefs than stuff that might challenge them. Psychologists call this “confirmation bias.” When we have embraced a theory, large or small, we tend to be better at noticing evidence that supports it than evidence that might run counter to it. Marcus suggests that confirmation bias might be an inevitable consequence of contextually driven memory. Because we retrieve memory not by systematically searching for all relevant data (as computers do) but by finding things that match, we can’t help but be better at noticing things that confirm the notions we begin with. While confirmation bias is an automatic tendency to notice data that fit with our beliefs, motivated reasoning is the complementary tendency to scrutinize ideas more carefully if we don’t like them than if we do.
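
A small, hypothetical Python sketch (my construction, not the author’s) may help show why retrieval by matching “can’t help” favoring confirming cases: querying memory with the belief’s own cues returns only the episodes that fit it, and the disconfirming quiet full-moon night never comes up unless something enumerates every record.

```python
# A hypothetical illustration (mine, not Marcus's) of why matching-based retrieval
# leans toward confirmation: suppose someone believes full moons make the ER busy.
observations = [
    {"tags": {"full_moon", "busy_ER"}, "note": "hectic full-moon shift"},
    {"tags": {"full_moon", "quiet_ER"}, "note": "quiet full-moon shift"},
    {"tags": {"new_moon", "busy_ER"}, "note": "hectic new-moon shift"},
    {"tags": {"new_moon", "quiet_ER"}, "note": "quiet new-moon shift"},
]

def systematic_search(records):
    """What a computer would do: count every relevant case, for and against."""
    supports = sum(1 for r in records if {"full_moon", "busy_ER"} <= r["tags"])
    contradicts = sum(1 for r in records if {"full_moon", "quiet_ER"} <= r["tags"])
    return {"supports": supports, "contradicts": contradicts}

def cue_driven_recall(records, belief_cues):
    """What cue-driven memory does: surface only the episodes matching the belief."""
    return [r["note"] for r in records if belief_cues <= r["tags"]]

print(systematic_search(observations))                            # balanced: 1 for, 1 against
print(cue_driven_recall(observations, {"full_moon", "busy_ER"}))  # only the confirming hit
```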

While all normal human beings acquire language, the ability to use formal logic to acquire and reason about beliefs may be more of a cultural product than an evolutionary one, something made possible by evolution but not guaranteed by it.  Rational man, if he existed, would only believe what is true. Irrational man, kluged product of evolution that he is, frequently moves in the opposite direction, starting with a conclusion and seeking reasons to believe it.  Marcus suggests that belief is stitched together out of three fundamental components: a capacity for memory, a capacity for inference, and a capacity for perception. The ideas we encounter are, other things being equal, automatically believed – unless and until there is a chance to properly evaluate them.

Marcus concludes the belief chapter by saying that, evolved from creatures that were often forced to act rather than think, Homo sapiens simply never developed a proper system for keeping track of what we know and how we’ve come to know it, uncontaminated by what we simply wish were so.

Next comes choice. Framing has the power that it does because choice, like belief, is mediated by memory. And, as we have already seen, the memory that evolution equipped us with is inherently and inevitably shaped by momentary contextual details. Change the context (here, the actual words used), and you often change the choice. “Death tax” summons thoughts of death, a fate that we all fear, whereas “inheritance tax” may make us think only of the truly wealthy, suggesting a tax scarcely relevant to the average taxpayer.

Other shortcomings in our choices come from the power of emotions to sway them.  Hunger and lust are telling examples.  Then we have our steep discounting of the future: by most measures, we vastly overvalue the present. None of these makes our choices better.  Marcus enjoys dwelling on the haphazard nature of the human mind. I think he probably overdoes it.  But he concludes his book with thirteen suggestions.

  1.  Whenever possible, consider alternative hypotheses.
  2.  Reframe the question.
  3.  Always remember that correlation does not entail causation.
  4.  Never forget the size of your sample.
  5.  Anticipate your own impulsivity and pre-commit.
  6.  Don’t just set goals. Make contingency plans.
  7.  Whenever possible, don’t make important decisions when you are tired or have other things on your mind.
  8.  Always weigh benefits against costs.
  9.  Imagine that your decisions may be spot-checked.
  10.  Distance yourself.
  11.  Beware the vivid, the personal, and the anecdotal.
  12.  Pick your spots. Reserve your most careful decision making for the choices that matter most.
  13.  Try to be rational.

Marcus, G. (2008). Kluge: The Haphazard Construction of the Human Mind. New York: Houghton Mifflin.

 
