Tag Archives: Gigerenzer

Medical Decisions–Risk Savvy

This post looks at the medical/health component of decision making as addressed in Gerd Gigerenzer’s new book, Risk Savvy: How to Make Good Decisions. First, Gigerenzer has contributed greatly to improving health decision making. This blog includes three consecutive posts on the Statistics of Health Decision Making based on Gigerenzer’s work.

He points out both the weaknesses of screening tests and the weaknesses in our understanding of their results. We have to overcome our tendency to see linear relationships where they are nonlinear. Doctors are no different. The classic problem is an imperfect screening test for a relatively rare disease. You cannot think in fractions or percentages; you must think in absolute frequencies. Breast cancer screening is one example. Generally, it catches about 90% of breast cancers, while about 9% of women who do not have breast cancer nevertheless test positive. So if you have a positive test, that means chances are you have breast cancer. No! You cannot let your intuition get involved, especially when the disease is rarer than the test’s mistakes. If we assume that 10 out of 1000 women have breast cancer, then 90%, or 9, will be detected, but about 90 of the 990 women who do not have the disease will also test positive. Thus only 9 of the 99 who test positive actually have breast cancer. I know this, but give me a new disease or a slightly different scenario, let a month pass, and I will still be tempted to shortcut the absolute frequencies and get it wrong.
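To make the arithmetic concrete, here is a minimal sketch of that calculation in Python, using the illustrative numbers above (the function and its name are mine, not Gigerenzer’s):

```python
# Natural-frequency sketch of the screening example above (illustrative numbers
# from the post: prevalence 10 in 1,000, ~90% detection, ~9% false positives).

def positive_predictive_value(population, prevalence, sensitivity, false_positive_rate):
    """Return (true positives, false positives, PPV) using absolute frequencies."""
    sick = population * prevalence                     # women who have the disease
    healthy = population - sick                        # women who do not
    true_positives = sick * sensitivity                # detected cases
    false_positives = healthy * false_positive_rate    # healthy women who test positive
    ppv = true_positives / (true_positives + false_positives)
    return true_positives, false_positives, ppv

tp, fp, ppv = positive_predictive_value(1000, 10 / 1000, 0.90, 0.09)
print(f"{tp:.0f} true positives, {fp:.0f} false positives, PPV of about {ppv:.0%}")
# -> 9 true positives, roughly 89 false positives, PPV of about 9%
```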

Continue reading

Checklists/Fast and Frugal Trees

The checklist is a heuristic. Gigerenzer explains that there needs to be something between mere intuition and complex calculations, and those somethings might often be called rules of thumb. Although a checklist can be many things, it also fits between mere intuition and full analytic reasoning. The best checklists are like Gigerenzer’s fast and frugal trees, where you answer a sequence of yes-or-no questions, starting with the most important one, and work your way down the tree until you reach a decision. Gigerenzer talks about “ecological rationality”–the match between the structure of a heuristic and the structure of an environment.
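To show how small a fast and frugal tree really is, here is a minimal sketch; the cues, their order, and the decisions are hypothetical placeholders, not an actual tree from the book:

```python
# A minimal fast-and-frugal tree sketch (cue names, order, and decisions are
# invented for illustration; the fall-through acts as the last cue's second exit).

def fast_and_frugal_tree(case, cues):
    """Ask yes/no questions in order of importance; each question can exit
    with a decision, otherwise the case falls through to the next cue."""
    for question, exit_on, decision in cues:
        if question(case) == exit_on:
            return decision
    return "no action"

# Hypothetical triage-style cues, most important first.
cues = [
    (lambda c: c["vital_sign_alarm"], True, "treat immediately"),
    (lambda c: c["high_risk_history"], False, "send home"),
    (lambda c: c["symptom_present"], True, "treat immediately"),
]

low_risk = {"vital_sign_alarm": False, "high_risk_history": False, "symptom_present": False}
print(fast_and_frugal_tree(low_risk, cues))  # -> "send home": the second question exits
```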

Continue reading

On the Use of Recognition in Inferential Decision Making


This post and the next post are based on Rüdiger Pohl’s article, “On the Use of Recognition in Inferential Decision Making,” which appeared in the journal Judgment and Decision Making in 2011. The journal devoted three issues to recognition. Pohl’s article is the last of the series and provides the best summary. I found that some of the articles in which two or three authors tried to come up with a joint summary apparently failed because there was so much disagreement among the authors.

“Intuition is nothing more or less than recognition.”  Daniel Kahneman delivers this line, crediting Herbert Simon, in Thinking, Fast and Slow. Pohl’s article does not address this statement, but it helps me address it. Maybe the statement is not making intuition simpler, but making recognition much more complicated.
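For reference, the recognition heuristic itself is almost trivially simple; the complications Pohl examines are about when people actually rely on it. A minimal sketch, with the recognition knowledge invented for illustration:

```python
# The recognition heuristic in its textbook form (this sketch is mine; Pohl's
# article is about whether and when people use the rule, not the rule itself).

def recognition_heuristic(a, b, recognized):
    """If exactly one of two objects is recognized, infer that it has the
    higher criterion value; otherwise the heuristic makes no inference."""
    if a in recognized and b not in recognized:
        return a
    if b in recognized and a not in recognized:
        return b
    return None  # both or neither recognized: recognition cannot decide

# City-size style example: a participant who has never heard of San Antonio.
recognized = {"San Diego"}
print(recognition_heuristic("San Diego", "San Antonio", recognized))
# -> "San Diego": the recognized city is inferred to be the larger one
```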

Continue reading

Cumulative Prospect Theory-Changing Parameters

This post looks at a paper by Andreas Glockner and Thorsten Pachur entitled “Cognitive models of risky choice: Parameter stability and predictive accuracy of prospect theory,” which appeared in Cognition in 2012. The paper looks at the changeable parameters in prospect theory and tries to determine their explanatory value, and also the extent to which individuals have stable parameters. It also tests a number of heuristics, along with expected value and expected utility theory, by studying the responses of 66 college students at the University of Bonn.
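For context, these are the standard Tversky-Kahneman functional forms whose parameters the paper fits to individuals; the sketch below uses commonly cited default values, not the estimates from the Glockner and Pachur study:

```python
# Sketch of the cumulative prospect theory building blocks whose parameters
# (alpha, lambda_, gamma) such studies estimate per person. The defaults below
# are commonly cited illustrative values, not results from this paper.

def value(x, alpha=0.88, lambda_=2.25):
    """Value function: concave for gains, convex and steeper for losses."""
    return x ** alpha if x >= 0 else -lambda_ * (-x) ** alpha

def weight(p, gamma=0.61):
    """Probability weighting: overweights small p, underweights large p."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

# Subjective value of a mixed two-outcome gamble: 50% chance of 100, else lose 50.
cpt_value = weight(0.5) * value(100) + weight(0.5) * value(-50)
print(round(cpt_value, 1))  # slightly negative: losses loom larger than gains
```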

Continue reading

The Statistics of Health Decision Making-Therapy

Gigerenzer says that we must teach risk literacy in medical school and statistical literacy to everyone in primary school. He and his colleagues go into considerable detail about how this should be done. Teaching statistics early is not sufficient. It is also essential to represent probabilistic information in forms that the human mind can grasp. To this end, visual and hands-on material can enable a playful development of statistical thinking. For instance, tinker cubes are Lego-like units that first graders can use to represent simple events, combine to represent joint events, and count to determine conditional frequencies.
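A rough sketch of the counting idea behind the cubes, with an invented classroom example: combine cubes to represent joint events, and a conditional frequency is then just one count divided by another.

```python
# Counting sketch of the tinker-cube idea; the classroom data are made up.
from collections import Counter

# Each tuple is one child's stack of cubes: (wears glasses?, plays soccer?)
cubes = [("glasses", "soccer")] * 3 + [("glasses", "no soccer")] * 5 + \
        [("no glasses", "soccer")] * 10 + [("no glasses", "no soccer")] * 12

counts = Counter(cubes)
soccer_players = counts[("glasses", "soccer")] + counts[("no glasses", "soccer")]
share = counts[("glasses", "soccer")] / soccer_players
print(f"Of {soccer_players} soccer players, {counts[('glasses', 'soccer')]} wear glasses ({share:.0%})")
# -> Of 13 soccer players, 3 wear glasses (23%)
```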

Continue reading

The Statistics of Health Decision Making-Causes and Consequences of Illiteracy

Gerd Gigerenzer has produced much work that is relevant and easy for the layperson to understand. Gigerenzer has defended the effectiveness of heuristics and our ability to use them. He has also tirelessly promoted statistics education and the transparent presentation of statistical information. This post tries to summarize some unique and interesting material with respect to the causes and consequences of our statistical illiteracy.

Gerd Gigerenzer along with Wolfgang Gaissmaier, Elke Kurz-Milcke, Lisa M. Schwartz, and Steven Woloshin suggest that the long opposition to health statistics can be traced back to the struggle between three 19th-century visions of the physician: artist, determinist, or statistician.  They argue that these professional ideals go hand in hand with patients’ corresponding ideals, which even today determine the mixture of feelings about health statistics: The artist embodies paternalism and requests blind trust from the patient, the determinist strives for perfect knowledge of causes and invites the illusion of certainty in patients, and the statistician relies on facts rather than medical charisma, paving the way for shared decision making. The first two ideals effectively deter interest in health statistics.

Continue reading

The Statistics of Health Decision Making-Statistical Illiteracy

This will be the first post of a three-part series. They will all be based on a 2008 monograph that appeared in Psychological Science in the Public Interest, entitled “Helping Doctors and Patients Make Sense of Health Statistics.” Gerd Gigerenzer was the primary author. I have started posts on numeracy, especially health numeracy, based on journal articles, but they tended to focus on the public’s generally poor numeracy skills. This always stopped me in my tracks, because it left out the doctors, whose skills are much like the public’s, and it left out the researchers, screening test providers, drug makers, and device makers, who tend to have excellent skills. They sometimes use those skills to take advantage of the doctors, patients, and the public (for my purposes, journalists are just part of the public). This monograph puts the pieces together.

Continue reading

Simple Heuristics at Work in the World

This post is based on Chapter 10 of Bounded Rationality and on Gut Feelings, both authored primarily by Gerd Gigerenzer. It is largely a shopping list of heuristics or rules of thumb. Gigerenzer makes the important point that more information and more choice are not always better. Less is more under certain conditions:

  1. A beneficial degree of ignorance–This seems more like a coincidence to me, but Gigerenzer includes it.
  2. Unconscious motor skills–Over-deliberation can mess up skills.
  3. Cognitive limitations–Our brains can benefit from such things as forgetting, so that we do not carry too much information. (Gary Marcus, in Kluge, would not agree, or at least would not be so proud of our forgetting.)
  4. Freedom of choice paradox–At some point more options create conflicts that make it more difficult to compare options. Betsch and Glockner point out, as I have discussed in Intuition in J/DM, that even intuition is slowed by conflicts in the data.
  5. The benefits of simplicity–In a complex world, simple rules can be better than complex rules.
  6. Information costs–Extracting too much information can hurt trust.

Continue reading

Role of Culture in Bounded Rationality

This post examines Chapter 19 of Bounded Rationality.
That chapter looks at three ways that cultural processes produce boundedly rational heuristics/algorithms. They are:

  1. Simple imitation and learning heuristics
  2. Over a cultural evolutionary time scale, the aids in number one have given rise to complex motivations, rules, cues, etc.
  3. From these first two, group processes that distribute cognition, knowledge, skill, and labor have arisen.

Continue reading

Bounded Rationality: The Adaptive Toolbox

Gerd Gigerenzer and Reinhard Selten provide a good summary of this topic in “Rethinking Rationality,” the introductory chapter to Bounded Rationality, which they edited. To connect the rational and the psychological, bounded rationality includes emotions, social norms, imitation, and other cultural tools.

The initial modern definition of rationality was created in about 1654 by Blaise Pascal. That definition was simply to maximize expected value. Pascal saw it as creating a new form of morality as well–replacing faith with moral expectation. That definition ran into trouble with the St. Petersburg paradox, so Daniel Bernoulli redefined rationality as maximizing expected utility; the Allais and Ellsberg paradoxes later created similar trouble for expected utility itself.

Bounded rationality was created by Herb Simon in the 1950s. Optimization was not part of this new rationality. The metaphor for bounded rationality was a pair of scissors: one blade is “cognitive limitations” and the other is the “structure of the environment.” In Herb Simon’s words, written in 1956: “a great deal can be learned about rational decision making…by taking account of the fact that the environments to which it must adapt possess properties that permit further simplification of its choice mechanisms”.

Gigerenzer argues that simple and robust heuristics can match a specific optimizing strategy. Different researchers see “bounded rationality” differently. Gigerenzer says it is neither optimization nor irrationality, nor is it optimization under constraints, which allows limited search but then requires a cost-benefit analysis of further search. Gigerenzer’s model of bounded rationality uses fast and frugal stopping rules (when to stop the search) that do not involve optimization. Kahneman and others have pointed out that humans fall short of rationality in such things as the base rate fallacy and the conjunction fallacy. Gigerenzer & Selten contend that these fallacies are “based on norms that have been put forth without analyzing the structure of the environments… Moreover, when information is presented in natural frequencies rather than probabilities, base rate neglect is perfectly rational.”
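The St. Petersburg paradox makes the trouble with expected value concrete: a gamble that pays 2^k if the first head appears on toss k has an unbounded expected value, yet nobody will pay much to play it, while Bernoulli’s logarithmic utility gives it only a modest worth. A small sketch of that arithmetic (the payoff convention and truncation points are just one common choice):

```python
import math

# St. Petersburg gamble: first head on toss k pays 2**k (probability 2**-k).
# Expected value grows without bound; expected log-utility (Bernoulli) converges.
def expected_value(max_tosses):
    return sum(2 ** -k * 2 ** k for k in range(1, max_tosses + 1))  # adds 1 per toss

def expected_log_utility(max_tosses):
    return sum(2 ** -k * math.log(2 ** k) for k in range(1, max_tosses + 1))

for n in (10, 20, 40):
    print(n, expected_value(n), round(expected_log_utility(n), 3))
# Expected value keeps climbing (10, 20, 40, ...), while expected log utility
# levels off near 2*ln(2), about 1.386, i.e., a certainty equivalent of about 4 units.
```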

Continue reading