Gerd Gigerenzer and Reinhard Selten provide a good summary of this topic in “Rethinking Rationality,” the introductory chapter to Bounded Rationality, which they edited. To connect the rational and the psychological, bounded rationality includes emotions, social norms, imitation, and other cultural tools.

The first modern definition of rationality was created around 1654 by Blaise Pascal. That definition was simply to maximize expected value. Pascal saw it as creating a new form of morality as well: replace faith with moral expectation. That definition ran into trouble with the St. Petersburg paradox, so Daniel Bernoulli redefined rationality as maximizing expected utility. Expected utility, in turn, ran into trouble with the Allais and Ellsberg paradoxes.

Bounded rationality was created by Herb Simon in the 1950s, and optimization was not part of this new rationality. Simon’s metaphor for bounded rationality was a pair of scissors: one blade is “cognitive limitations” and the other is the “structure of the environment.” In his words, written in 1956: “a great deal can be learned about rational decision making…by taking account of the fact that the environments to which it must adapt possess properties that permit further simplification of its choice mechanisms.”

Gigerenzer argues that simple and robust heuristics can match a specific optimizing strategy. Different researchers see “bounded rationality” differently. For Gigerenzer, it is not optimization, not irrationality, and not optimization under constraints, which allows limited search but then requires a cost/benefit analysis of further search. Gigerenzer’s model of bounded rationality instead uses fast and frugal stopping rules (when to stop search) that do not involve optimization. Kahneman and others have pointed out humans falling short of rationality in such things as the base rate fallacy and the conjunction fallacy.
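The contrast between Pascal’s and Bernoulli’s criteria shows up concretely in the St. Petersburg game, where the pot doubles each time a fair coin lands heads. A minimal Python sketch (the function name is mine) of why expected value breaks down while Bernoulli’s logarithmic utility does not:

```python
import math

def st_petersburg_terms(n_rounds):
    """Payoff 2**k with probability 2**-k when the coin first lands
    heads on toss k (k = 1, 2, ...)."""
    for k in range(1, n_rounds + 1):
        yield 2.0 ** -k, 2.0 ** k   # (probability, payoff)

# Pascal's criterion: expected value. Every term contributes
# prob * payoff = 1, so the sum grows without bound.
ev = sum(p * x for p, x in st_petersburg_terms(40))
print(ev)  # 40.0 -- one unit per possible round, diverging as rounds grow

# Bernoulli's criterion: expected log utility, which converges.
eu = sum(p * math.log(x) for p, x in st_petersburg_terms(40))
print(round(eu, 4))  # 1.3863, i.e. about 2*ln(2)
```

Under expected value no finite stake is too much to pay for the game; under log utility the sum converges to roughly 2 ln 2, a small finite certainty equivalent, which is Bernoulli’s resolution of the paradox.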
Gigerenzer & Selten contend that these fallacies are “based on norms that have been put forth without analyzing the structure of the environments… Moreover, when information is presented in natural frequencies rather than probabilities, base rate neglect is perfectly rational.”
Kluge is a readable book by psychologist Gary Marcus, aimed at a popular audience. The way it is written makes it a good choice for outlining the biases and other shortcomings of our minds. Of course, a kluge is old computer slang for a patch, or something that is not elegant but still works. Possibly I should contrast Kluge with In Pursuit of Elegance by Matthew May. Later. Marcus starts with memory.
Richerson and Boyd have been leading spokesmen for the view that culture transformed human evolution. Their book Not by Genes Alone is the source of the quotations below. The “evolutionary properties of symbolically marked subgroups (tribes, religions, universities, etc) give rise to many problems and conflicts in complex societies.” However, without the tribal social instincts, we cannot explain why our societies are so different from those of other primates. The institutions of our complex societies are built on ancient tribal instincts and carry imperfections arising from cultural evolutionary processes.
Valerie Reyna has brought development, as humans move from early childhood to adulthood and on to advanced age, into the study of decision making. In fuzzy-trace theory, there is gist-based intuition and verbatim-based analysis, and they do not develop in the same way. In fact, there are developmental reversals in which biases increase with age. During childhood, verbatim-based calculation skill increases continuously, but gist-based bottom-line meaning in many situations increases more quickly. Verbatim and gist representations are encoded and processed in parallel. Which controls task performance depends on which is more accessible and on the constraints of the task. In general, gist is more accessible and more useful, especially when informed by age and experience.
Reyna finds that the argument that adolescence is a time of increased irrationality is flawed. Preference for risk exhibits a fairly stable downward trend across age. She concludes that precise calculation of risks and rewards promotes risk taking among adolescents, whereas simple all-or-none gist based intuition protects adults against unhealthy risk taking. Reyna also sees inhibition as a third factor which increases with age during adolescence and young adulthood.
Reyna concludes that mature adults differ from adolescents in more than the ability to rein in responses to tempting rewards. If offered a million dollars to play Russian roulette, an adult quickly refuses. An adolescent probably does a quick cost-benefit analysis. Determining if the reward is worth the risk sounds smart and it fits a traditional definition of rationality. To an adult, it is categorically crazy, a gist based, all-or-none intuition.
Peters et al discuss age-related impacts on cancer decision making. In familiar situations, older adults benefit from their reliance on gist-based intuition. However, when complex or changing rules must be learned, older adults may make poorer decisions, and their understanding of numeric information is likely to be poorer as well. Thus, clinicians should emphasize affective meaning, using verbal labels such as “excellent” and “fair” to interpret numeric risk information.
Betsch and Glockner point to the need to do research on how children intuitively integrate information. The limited research shows promise. Ebersbach demonstrated that kindergarten children are capable of spontaneously integrating three orthogonally varying stimulus dimensions in their judgments of volume.
Betsch, T., & Glockner, A. (2010). Intuition in judgment and decision making: Extensive thinking without effort. Psychological Inquiry, 21, 279-294.
Peters, E., Diefenbach, M., Hess, T., & Västfjäll, D. (2008). Age differences in dual information-processing modes: Implications for cancer decision making. Cancer, 113(12 Suppl), 3556-3567.
Reyna, V. (2012). A new intuitionism: Meaning, memory, and development in fuzzy-trace theory. Judgment and Decision Making, 7(3), 332-359.
Intuition is slippery to define. In Educating Intuition, Robin Hogarth described intuitive thoughts as those that “are reached with little apparent effort, and typically without conscious awareness.” Gigerenzer and his associates call humans “homo heuristicus” and emphasize effort reduction and selective information processing. For instance, the lexicographic heuristic has us begin by comparing alternatives on the most important dimension; if there is a difference, we choose the best and seek no more information.
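The lexicographic rule is simple enough to sketch in a few lines of Python. The cue names and car data below are made up for illustration, and I assume a higher cue value is always better:

```python
def lexicographic_choice(alternatives, cues):
    """Compare alternatives cue by cue, most important cue first, and
    stop at the first cue that discriminates. No weighting, no summing.
    `alternatives` maps a name to a dict of cue values; `cues` is
    ordered from most to least important."""
    for cue in cues:
        values = {name: attrs[cue] for name, attrs in alternatives.items()}
        best = max(values.values())
        leaders = [name for name, v in values.items() if v == best]
        if len(leaders) == 1:      # this cue discriminates: stop searching
            return leaders[0]
    return leaders[0]              # tied on every cue: pick among the leaders

cars = {
    "A": {"safety": 5, "price": 2, "mileage": 4},
    "B": {"safety": 4, "price": 5, "mileage": 5},
}
print(lexicographic_choice(cars, ["safety", "price", "mileage"]))  # A
```

Even though B beats A on two of three cues, the rule chooses A on safety alone and never looks at price or mileage, which is exactly the effort reduction and selective information processing Gigerenzer describes.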
I find the discussion by Tilmann Betsch and Andreas Glockner in “Intuition in Judgment and Decision Making: Extensive Thinking Without Effort” appealing. They see heuristics as just a part of intuition. Heuristics largely simplify analytic thought by leaving out effortful information processes or by reducing the amount of information considered. Betsch and Glockner claim that “intuition is capable of dealing with complex tasks through extensive information processing without noticeable effort.”
On March 12, 2013, The Wall Street Journal featured “The New Power of Memory” in the Personal Journal. Shirley S. Wang reports that research published in the journal Cerebral Cortex suggests that we use the vivid memory of past experiences to imagine and anticipate the future. Without a subscription, I was unable to access “Imagine All the People: How the Brain Creates and Uses Personality Models to Predict Behavior,” but I did find “The Future of Memory: Remembering, Imagining, and the Brain” by the authors mentioned, Daniel Schacter and Demis Hassabis, in the November 21, 2012, issue of Neuron. The research included brain imaging that showed “that when people are asked to imagine the future as they recall past experiences, many of the same regions of the brain–the hippocampus and the medial prefrontal cortex–show increased activity.” As reported by Ms. Wang, the new research scanned the brains of 19 young adults asked to imagine four short scenes involving imaginary people. Prior to this, the participants were given made-up profiles of four people that included photos and statements. Each profile had a different level of extroversion and friendliness. The participants studied the profiles and then were asked to imagine the different personae in situations. The researchers found that the different personality traits activated distinct brain regions. Thus, the researchers were able to tell which of the four profiles a participant was thinking about just by looking at the part of the brain activated in the scan.
This is interesting, but it is not particularly surprising. Meanwhile, the secondary headline in the WSJ article proclaims: “Sharp Recall Skills Prove Key to Future Success; Some Excel at ‘Mental Time Travel’.” Not so much. The research does not show that. In fact, one theory of memory and cognition, fuzzy-trace theory, distinguishes between meaning-based memory (the so-called gist) and more superficial verbatim memory. The theory posits that these memories are encoded separately, and it predicts that unconscious gist-based intuition often produces better results than reliance on detailed verbatim memory. In other words, sharp memory is often trumped by hazy memory. This is completely at odds with the WSJ graphic.
In future posts, I will dig more into fuzzy trace theory and its primary proponent, Dr. Valerie Reyna.
Ken Hammond created the JDM metatheory dichotomy of coherence and correspondence. Coherence tests decisions for rationality, while correspondence tests decisions for empirical accuracy. Coherence advocates start with the mind of the decision maker. In examining the rationality of judgment, the main criterion is consistency, and Bayes’ theorem is the model for the mathematical coherence of decision making. Coherence focuses on justification and describes departures from how we “ought” to decide.
David Berreby in his book Us and Them: Understanding Your Tribal Mind illuminates our built-in facility for seeing human kinds. As Kathryn Schulz set out in Being Wrong, “believing things based on paltry evidence is the engine that drives the entire miraculous machinery of human cognition.” We do not care about what is rational, but only about what is probable. Dividing the world of humans into kinds may give us information that, at least initially, is correct more often than it is wrong, and thus might be adaptive. Human kinds are processes that are constantly changing; the categories at any point in time, however, seem permanent. Berreby notes that calling them processes does not change the reality of the human kinds that people believe in now. As he suggests, the science of human kinds will not undo politics. Berreby points out that in the course of a day, he can pass through many human kinds. He can feel like an American in a group of foreigners, or a New Yorker with a Texan, or a man when talking to a woman. Human kinds, like memories, are also changing, albeit subtly. Our next encounter with a New Yorker may change that stereotype/kind just a little. And every time someone makes a statement about a kind, it may be persuasive to me and to those who hear it.
Human kinds, Berreby suggests, are symbols that make it easier to turn people into objects, and we are capable of destroying symbols with ease. Berreby points out another problem: our willingness to see a human kind not only as an explanation but also as a cause.
Berreby decries the use of stigma as shame punishment by the legal system. He quotes Martha Nussbaum on stigma’s inappropriateness, since it acts on irrational, unconscious parts of the mind. Enforcement of the law should also avoid human kind judgments. Not only is that a terrible idea for society, it is also not the most effective strategy for enforcement.
Science is a better guide even though we cannot eliminate our concepts of human kinds. As Berreby reminds us, we still use the terms sunrise and sunset, even though we well know that we are not at the center. Similarly with medicine: the miasma theory of disease made some good predictions, but germ theory made better ones. The science of germ theory neither completely contradicted our instincts and traditions nor totally confirmed them. Just because races and ethnic groups are real at the level of society and culture does not mean they must be real at the level of molecules, genes, and organs.