Category Archives: Uncertainty

A Nice Surprise

This post is based on a paper written by Andy Clark, author of Surfing Uncertainty (see the post Paper Predictive Processing for a fuller treatment): “A nice surprise? Predictive processing and the active pursuit of novelty,” which appeared in Phenomenology and the Cognitive Sciences, pp. 1–14, DOI: 10.1007/s11097-017-9525-z. For me this is a chance to learn how Andy Clark has polished up his arguments since his book. It also strikes me as connected to my recent posts on Curiosity and Creativity.

Clark and Friston (see the post The Prediction Machine) depict human brains as devices that minimize prediction error signals: signals that encode the difference between actual and expected sensory stimulation. But we know that we are attracted to the unexpected. We humans often seem to actively seek out surprising events, deliberately pursuing novel and exciting streams of sensory stimulation. So how does that square with the idea of minimizing prediction error?
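To make the basic mechanism concrete, here is a minimal sketch in Python, assuming a single sensory channel and a simple delta-rule update (my own toy illustration, not Clark's or Friston's actual model): the prediction error is just the gap between what was expected and what arrived, and reducing it means updating the prediction.

```python
import random

# Toy sketch (my own illustration, far simpler than hierarchical predictive
# processing): an agent holds a prediction of a sensory signal and nudges it
# toward each noisy observation, shrinking the prediction error over time.

def update_prediction(prediction, observation, learning_rate=0.1):
    """Move the prediction a fraction of the way toward the observation."""
    error = observation - prediction          # the prediction error signal
    return prediction + learning_rate * error

prediction = 0.0
hidden_signal = 5.0                           # the regularity out in the world
for _ in range(100):
    observation = hidden_signal + random.gauss(0, 0.5)   # noisy sensory sample
    prediction = update_prediction(prediction, observation)

print(f"final prediction: {prediction:.2f} (true signal: {hidden_signal})")
```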

Continue reading

Denver Bullet Study

This post is largely a continuation of the Kenneth R. Hammond post, but one prompted by current events. My opinion on gun control is probably readily apparent. But if it is not, let me say that I go crazy when mental health is bandied about as the reason for our school shootings, or when we hear that arming teachers is a solution to anything. However, going crazy or questioning the sincerity of the people with whom you are arguing is not a good idea. Dan Kahan (see my posts Cultural Cognition or Curiosity, or his blog Cultural Cognition) has some great ideas on this, but Ken Hammond had actual accomplishments, and they could help guide all of us today. I should note also that I was unable to quickly find the original sources, so I am relying completely on “Kenneth R. Hammond’s contributions to the study of judgment and decision making,” written by Mandeep K. Dhami and Jeryl L. Mumpower, which appeared in Judgment and Decision Making, Vol. 13, No. 1, January 2018, pp. 1–22.

Continue reading

Dark Room Problem – Minimizing Surprise

This post is based on the paper “Free-energy minimization and the dark-room problem,” written by Karl Friston, Christopher Thornton and Andy Clark, that appeared in Frontiers in Psychology in May 2012. Recent years have seen the emergence of an important new fundamental theory of brain function (see the posts Embodied Prediction and Prediction Error Minimization). This theory brings information-theoretic, Bayesian, neuroscientific, and machine learning approaches into a single framework whose overarching principle is the minimization of surprise (or, equivalently, the maximization of expectation). A puzzle raised by critics of these models is that biological systems do not seem to avoid surprises. People do not simply seek out a dark, unchanging chamber and stay there. This is the “Dark-Room Problem.”
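To see why the puzzle bites, here is a small Python sketch (my own illustration of the information-theoretic notion of surprise, not code from the paper): surprise is just the negative log-probability of what you observe, so a perfectly predictable room scores very low.

```python
import math

# Toy illustration of the dark-room worry (my own example, not from the paper):
# surprise is the negative log-probability of an observation, so an agent that
# only cared about low surprise could get it by parking itself where every
# observation is maximally predictable.

def surprise(probability):
    """Surprisal, in nats, of an observation that has this probability."""
    return -math.log(probability)

p_in_dark_room = 0.99   # in a dark, unchanging room nearly everything is expected
p_in_the_world = 0.05   # out in the world, many observations are unexpected

print(f"dark room:  {surprise(p_in_dark_room):.3f} nats")
print(f"the world:  {surprise(p_in_the_world):.3f} nats")
```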

Continue reading

Aging and Decisions from Experience

This post is based on the paper: “The role of cognitive abilities in decisions from experience: Age differences emerge as a function of choice set size,” by Renato Frey, Rui Mata, and Ralph Hertwig that appeared in Cognition 142 (2015) 60–80.

People seldom enjoy access to summarized information about risky options before making a decision, except for things like weather forecasts that explicitly state a probability. Instead, they may search for information and learn from the environment—thus making decisions from experience. Many consequential decisions—including health care choices, finances, and everyday risks (e.g., driving in bad weather; crossing a busy street)—are made without full knowledge of the possible outcomes and their probabilities, so we must make decisions from experience. According to the authors, the mind’s most notable transformation across the life span is a substantial decline in processing speed, working memory, and short-term memory capacity—all components potentially involved in search and learning processes.
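As a rough illustration of what making a decision from experience looks like, here is a small Python sketch (my own simplification with made-up options, not the authors' choice task): the chooser never sees the probabilities, only a handful of sampled outcomes, which is exactly where search effort and memory capacity come into play.

```python
import random

# Hedged sketch of a decision from experience (my own toy example, not the
# authors' experimental task): with no stated probabilities, the decision
# maker samples each risky option a few times and picks the better
# experienced mean.

def experienced_mean(outcomes_and_probs, n_samples=10):
    """Sample outcomes from a risky option and return their average."""
    outcomes, probs = zip(*outcomes_and_probs)
    draws = random.choices(outcomes, weights=probs, k=n_samples)
    return sum(draws) / len(draws)

option_a = [(4, 0.8), (0, 0.2)]     # modest payoff, likely
option_b = [(32, 0.1), (0, 0.9)]    # large payoff, rare

mean_a = experienced_mean(option_a)
mean_b = experienced_mean(option_b)
print(f"experienced means: A={mean_a:.1f}, B={mean_b:.1f}",
      "-> choose A" if mean_a >= mean_b else "-> choose B")
```

Note that with only a few samples the rare large payoff is easily missed altogether, which is one reason the amount of search a person can sustain matters.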

Continue reading

Risk Reward Heuristic

This post is based on a paper, “Ecologically Rational Choice and the Structure of the Environment,” that appeared in the Journal of Experimental Psychology: General, 2014, Vol. 143, No. 5. The authors are Timothy J. Pleskac and Ralph Hertwig. The paper starts from the observation that decision making theory has largely ignored the possibility that risk and reward are tied together, with payoff magnitudes signaling their probabilities.

How people should and do deal with uncertainty is one of the most vexing problems in theorizing about choice. The researchers suggest a process that is inferential in nature and rests on the notion that probabilities can be approximated from statistical regularities that govern real-world gambles. In the environment there are typically multiple fallible indicators to guide your way. When some cues become unreliable or unavailable, the organism can exploit this redundancy by substituting or alternating between different cues. This is possible because of what Brunswik called the mutual substitutability or vicarious functioning of cues. It is these properties of intercue relationships and substitutability that Pleskac and Hertwig suggest offer a new perspective on how people make decisions under uncertainty.

Under uncertainty, cues such as the payoffs associated with different courses of action may be accessible, whereas other cues—in this case, the probability with which those payoffs occur—are not. This missing probability information has been problematic for choice theories, as typically both payoffs and probabilities are used in determining the value of options and in choosing. However, if payoffs and probabilities are interrelated, then this ecological property can permit the decision maker to infer hidden or unknown probability distributions from the payoffs themselves, thus easing the problem of making decisions under uncertainty.
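As a concrete, hedged sketch of that idea in Python (my own simplification, not the authors' formal model): if payoffs and probabilities are inversely related in the environment, a decision maker can back the hidden probability out of the payoff alone.

```python
# Minimal sketch of the inference Pleskac and Hertwig describe, under a strong
# simplifying assumption of my own: if real-world gambles tend to be roughly
# fair (expected value close to the stake), a missing winning probability can
# be read off the payoff itself.

def infer_win_probability(stake, payoff):
    """Infer p(win) from the payoff, assuming stake is about p(win) * payoff."""
    return min(1.0, stake / payoff)

for payoff in (2, 10, 100):
    p = infer_win_probability(stake=1.0, payoff=payoff)
    print(f"win ${payoff} on a $1 stake -> inferred p(win) of about {p:.2f}")
```

The larger the promised payoff, the smaller the inferred probability, which is the risk-reward relationship the paper argues people exploit.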

Continue reading

Recognition heuristic and cognitive effort

Having the good fortune to be lost in Venice, I was reminded of the nuances of the recognition heuristic. My wife found the perfect antique store, which was unsurprisingly closed for lunch. We went on, but a couple of hours later we tried to retrace our route. For much of the journey, we did well, having only to recognize that we had seen a particular store or archway or bridge before. Unfortunately, that broke down when we realized we were retracing our own steps from five minutes earlier. We still remembered that walk to the restaurant the first night, but it was unfortunately not differentiated in our minds. This was certainly an example of less knowledge being more. Eventually, using a combination of GPS and maps, we found our way back to our hotel, but we never did find that antique store. And I was trying.

This post is based on a paper by Benjamin Hilbig, Martha Michalkiewicz, Marta Castela, Rudiger Pohl and Edgar Erdfelder, “Whatever the cost? Information integration in memory-based inferences depends on cognitive effort,” which was scheduled to appear in Memory and Cognition in 2014. Fundamental uncertainty is a common situation in our decision making. There is an ongoing argument between the fast and frugal heuristics toolbox approach and the single-tool approaches of evidence accumulation and parallel constraint satisfaction. However, which approach fits best depends on the particular task, the type of task, and on and on. I am still waiting for a giant table that puts all those things together.

Continue reading

Medical Decisions–Risk Savvy

This post looks at the medical/health component of decision making as addressed in Gerd Gigerenzer’s new book, Risk Savvy: How to Make Good Decisions. First, Gigerenzer has contributed greatly to improving health decision making. This blog includes three consecutive posts on the Statistics of Health Decision Making based on Gigerenzer’s work.

He points out both the weaknesses of screening tests and the weaknesses in our understanding of their results. We have to overcome our tendency to see linear relationships when they are nonlinear. Doctors are no different. The classic problem is an imperfect screening test for a relatively rare disease. You cannot think in fractions or percentages. You must think in absolute frequencies. Breast cancer screening is one example. Generally, it catches about 90% of breast cancers, while about 9% of women who do not have breast cancer still test positive. So if you have a positive test, that means chances are you have breast cancer. No! You cannot let your intuition get involved, especially when the disease is more rare than the test’s mistakes. If we assume that 10 out of 1000 women have breast cancer, then 90% of them, or 9, will be detected, but about 90 of the 990 women who do not have the disease will also test positive. Thus only 9 of the 99 who test positive actually have breast cancer. I know this, but give me a new disease or a slightly different scenario, let a month pass, and I will still be tempted to shortcut the absolute frequencies and get it wrong.
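The frequency arithmetic is easy to get wrong, so here is the same calculation written out as a short Python sketch. The numbers come straight from the paragraph above (“about 9%” false positives is treated as exactly 9%); they are illustrative, not clinical data.

```python
# Gigerenzer-style natural-frequency reasoning, using the figures in the text.

population = 1000
with_disease = 10                          # 10 of 1000 women have breast cancer
sensitivity = 0.90                         # the test catches 90% of true cases
false_positive_rate = 0.09                 # ~9% of healthy women test positive

true_positives = with_disease * sensitivity                           # 9 women
false_positives = (population - with_disease) * false_positive_rate  # ~89 women

chance_disease_given_positive = true_positives / (true_positives + false_positives)
print(f"of about {true_positives + false_positives:.0f} positive tests, "
      f"only {true_positives:.0f} are real cases: roughly "
      f"{chance_disease_given_positive:.0%} chance of disease")
```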

Continue reading

Muddle

I first read Lindblom’s paper, “The Science of Muddling Through,” in a comparative systems political science class. It appealed to me then, and after more than forty years, it still does. He did a good job of exposing the logical extreme of the rational model as ridiculous, at least in government. At the same time, he used terminology for his incremental model that made it difficult to publicly embrace. As a city planner, I could decry “disjointed incrementalism” to try to get elected officials to look a bit further into the future, but had he called it coherent accumulation, I would not have ever had a chance. After fifty-five years, many of his examples still seem quite relevant.

Continue reading

Strategic Decision Models for Public Decision Making

This post is based on a July 2009 paper, “Strategic Decision Making Paradigms: a Primer for Senior Leaders,” that was written by Col. Charles D. Allen and Dr. Breena E. Coates, both of the Army War College. Although much in the paper has been touched upon in prior posts, it summarizes several models for strategic decision making, and includes some with which I am unfamiliar. It is public sector oriented and includes some good examples related to the defense of the nation. It also sets the stage for another post on muddling through. Strategic decisions entail ill-structured, “messy,” or “wicked” problems that do not have quick, easy solutions. They often end in the so-called “error of the third kind,” where a complex problem is addressed with a correct solution to the wrong problem.

Continue reading

Decision Delay

This post is a continuation of the theme of when decisions are made and how we delay or wait or decide not to decide. This post is based on a 2014 paper by Teichert, Ferrera, and Grinband, “Humans Optimize Decision-Making by Delaying Decision Onset,” in PLoS ONE. Again, this paper is beyond my understanding, at least as to the details. It has some excellent figures and graphics that are pretty, but I do not think that I really understand them. These are my shortcomings. What interests me about this is the contrast with my previous post Deciding not to Decide. This paper examines decision onset and nondecision time, while “Deciding not to Decide” suggested an explicit decision to inhibit the decision. I find making an inhibitory decision a more satisfying explanation than delaying decision onset, although they could be the same thing, or the situations may be so different that there is no real comparison.

Continue reading