I discovered a few months ago that I have celiac disease, and accordingly I am on a gluten-free diet. Compared to most conditions discovered in one's late sixties, celiac disease seems almost inconsequential. However, it fits into the idea of prediction error minimization. In effect, the environment has changed and I need to change my predictions. Bread and beer are now bad. My automatic, intuitive prediction machine has not been getting it right. It is disorienting. I can no longer "See food, eat food." I can change the environment at home, but in the wider world I need to be aware. My brain needs to dedicate perpetual, and at least for now conscious, effort to this cause. It is almost as if I instantly became even dumber. It makes me more self-absorbed in social settings that involve food. Not known for my social skills, I have at least been a good listener, but now not so much. On my Dad's 94th birthday, I ate a big piece of German chocolate cake, enjoyed it thoroughly, and only then remembered that it was not allowed. In my particular case, I do not get sick or nauseated when I make such a mistake, so my commitment is always under threat. This demands an even larger share of my brain to stay compliant. My main incentive to comply is those photos of my scalloped small intestine. I note that I was diagnosed after years of trying to figure out my low ferritin levels. (It will be extremely disappointing if I find that my ferritin is still low.) Continue reading
This post is largely a continuation of the Kenneth R. Hammond post, but one prompted by recent events. My opinion on gun control is probably readily apparent. But if it is not, let me say that I go crazy when mental health is bandied about as the reason for our school shootings, or when we hear that arming teachers is a solution to anything. However, going crazy or questioning the sincerity of the people with whom you are arguing is not a good idea. Dan Kahan (see my posts Cultural Cognition or Curiosity, or his blog Cultural Cognition) has some great ideas on this, but Ken Hammond actually had accomplishments, and they could help guide all of us today. I should note also that I was unable to quickly find the original sources, so I am relying completely on "Kenneth R. Hammond's contributions to the study of judgment and decision making," written by Mandeep K. Dhami and Jeryl L. Mumpower, which appeared in Judgment and Decision Making, Vol. 13, No. 1, January 2018, pp. 1–22.
Although I have much respect for Dan Kahan's work, I have had a little trouble with the Identity-protective Cognition Thesis (ICT). The portion in bold in the quote below, from "Motivated Numeracy and Enlightened Self-Government," has never rung true to me.
On matters like climate change, nuclear waste disposal, the financing of economic stimulus programs, and the like, an ordinary citizen pays no price for forming a perception of fact that is contrary to the best available empirical evidence: That individual's personal beliefs and related actions—as consumer, voter, or public discussant—are too inconsequential to affect the level of risk he or anyone else faces or the outcome of any public policy debate. However, if he gets the "wrong answer" in relation to the one that is expected of members of his affinity group, the impact could be devastating: the loss of trust among peers, stigmatization within his community, and even the loss of economic opportunities.
If the thesis were true, why would Thanksgiving be so painful? I do not even know what my friends think about these things. Granted, at some point issues like climate change become so politically tainted that you may avoid talking about them so as not to antagonize your friends, but that does not change my view. Now, however, Kahan has a better explanation.
This post is a look at the book by Philip E. Tetlock and Dan Gardner, Superforecasting: The Art and Science of Prediction. Phil Tetlock is also the author of Expert Political Judgment: How Good Is It? How Can We Know? In Superforecasting, Tetlock blends a discussion of the largely popular literature on decision making with his long-running scientific work on the ability of experts and others to predict future events.
In Expert Political Judgment: How Good Is It? How Can We Know?, Tetlock found that the average expert did little better than guessing, though some experts did better. In Superforecasting he discusses the study of those who did better and how they did it.
This post brings up the latest paper by Dan Kahan and his colleagues Erica Dawson, Ellen Peters, and Paul Slovic: "Motivated Numeracy and Enlightened Self-Government," Cultural Cognition Project, Working Paper No. 116. The paper strengthens their already strong arguments.
The experiment that was the subject of this paper was designed to test two opposing accounts of conflict over decision relevant science. The first—the Science Comprehension Thesis (“SCT”)—attributes such conflicts to the limited capacity of the public to understand the significance of valid empirical evidence. The second—the Identity-protective Cognition Thesis (“ICT”)—sees a particular recurring form of group conflict as disabling the capacities that individuals have to make sense of decision-relevant science: when policy-relevant facts become identified as symbols of membership in and loyalty to affinity groups that figure in important ways in individuals’ lives, they will be motivated to engage empirical evidence and other information in a manner that more reliably connects their beliefs to the positions that predominate in their particular groups than to the positions that are best supported by the evidence.
Dan Kahan has an article in the October 2013 issue of Science, "A Risky Science Communication Environment for Vaccines," with a specific example of the HPV vaccine issues. Kahan has written a good article, and one that may not have pleased several people. It fits together with my post Web 2.0 for Vaccination Decisions. Kahan makes the case for scientifically based risk communication strategies, which is something that Betsch, Reyna, et al. try to do in "Opportunities and challenges of Web 2.0 for vaccination decisions." Kahan may make an even bigger case for simply not being stupid and not ignoring everything we already know about risk communication. He says quite well, although indirectly, that vaccination is really not yet a cultural cognition issue, but we could make it one if we are not careful.
Cultural cognition has grown from the ideas of Mary Douglas and Aaron Wildavsky. The research, ideas, and other output of the Cultural Cognition Project at Yale Law School can be found at www.culturalcognition.net. Dan Kahan leads the project. The website is a great resource, and there is little reason to provide much here other than to try to get you to visit it. I will first present the basic concept as set out by Dan Kahan and Don Braman. Then I will provide quick summaries of the 2012 paper by Kahan, Peters, Wittlin, Slovic, Larrimore Ouellette, Braman & Mandel, "The polarizing impact of science literacy and numeracy on perceived climate change risks," and Kahan's 2012 paper, "Ideology, Motivated Reasoning, and Cognitive Reflection: An Experimental Study." They are rather remarkable.