This post is sooo… derivative, but I cannot help myself. Good judgment depends on good information. It has never been so obvious how much we rely on good referees to determine what is good information. Most persuasion is based on filtering information to the persuader’s advantage, but it has been rare in my lifetime to see the strategy of just hammering the lie.
It is easy to imagine that our paleo brains were rewarded for believing the chief. We both had skin in the game. So our still-tribal brains believe things that are repeated over and over, even lies. Unfortunately, our information sources have gotten further and further from us, so that our futures are no longer intertwined except in an existential way. Our information networks have expanded and, more critically, have expanded selectively.
Emre Soyer and Robin Hogarth have written a new book, The Myth of Experience: Why We Learn the Wrong Lessons, and Ways to Correct Them. This book is aimed at a general audience, although it has copious and detailed notes and an index that will allow for deeper looks. I have much respect for their past work, both individually and together.
The key idea that they have developed elsewhere is that some learning environments are kind, so that what you learn by experience is helpful (say, riding a bike), while other environments are wicked, and experience cannot be relied upon to make good decisions. Robin Hogarth’s Educating Intuition develops this idea (see posts: What has Brunswik’s Lens Model Taught?, Kind and Wicked Learning Environments).
Nick Chater is the author of The Mind is Flat: The Remarkable Shallowness of the Improvising Brain, Yale University Press, New Haven, 2019. He is a professor of behavioral science at the Warwick Business School. The book has two parts, and overall it is as ambitious as it is simple. The first part is the most convincing. He shows how misguided we are about our perceptions, emotions, and decision making. Our vision seems to provide us with a full-fledged model of our environment, when we can really only focus on a very small area, with our furtive eye movements providing the impression of a complete, detailed picture. Our emotions do not well up from deep inside, but are the results of in-the-moment interpretations based on the situation we are in and highly ambiguous evidence from our own bodily state. Chater sees our beliefs, desires, and hopes as just as much inventions as our favorite fictional characters. Introspection does not work, because there is nothing to look at. We are imaginative creatures with minds that pretty much do everything on the fly. We improvise, so our decision making is inconsistent, as are our preferences.
This post is based on the book Elastic: Flexible Thinking in a Time of Change by Leonard Mlodinow, Pantheon Books, New York, 2018. Mlodinow is a physicist and worked with Stephen Hawking. His previous book, Subliminal, evidently gave him considerable access to interesting people like Seth MacFarlane. He mentions that Stephen Hawking’s pace of communicating was at best six words a minute, with public presentations being prepared ahead of time. Mlodinow notes that this slowing of the pace of a conversation is actually quite helpful: it forces you to consider the words, as opposed to thinking of what you are going to say while the other person is talking so that you can have an instant response.
Every couple of years, I seem to go back and look at “decision making” books that have arrived in my local library. I clearly take a broad view of decision making. This time I came up with Farsighted, Elastic, and The Mind is Flat. The first two books were definitely written to be popular books, with the third less so. They share quite a bit. They all rely heavily on illustrations or questionnaires that show the peculiarities and shortcomings of our minds. They all rely on literature to explain their cases about how our minds work. Farsighted uses George Eliot and Middlemarch. Elastic uses Jonathan Franzen and mentions his book The Corrections. The Mind is Flat uses Leo Tolstoy and Anna Karenina.
This book, Nervous States: Democracy and the Decline of Reason, 2019, written by William Davies, tries to explain the state we are in. The end of truth, the domination of feelings, and the end of expertise all come to mind. People perceive that change is so fast that the slow knowledge developed by reason and learning is devalued, while instant knowledge that will be worthless tomorrow, like that used by commodity, bond, or stock trading networks, is highly valued. Davies builds on Hayek and says many things that ring true. In three posts, I will present the main points of Davies’ book, argue with some of the points, and present what Davies says we can do about it. Devaluing reason is a big deal for decision making.
This post is inspired by The Attention Merchants, Tim Wu, Vintage Books, New York, 2017. Decision making is not a front-line issue in the book, but it is also clear that we cannot control our decision making if we cannot control our attention. The book begins as a history of what has grabbed our attention, from newspapers and posters to radio to television to computers and video games, to the internet and its vehicles, including our present attention grabber, the cell phone. Of course, each attention platform has ultimately had to make money, and advertising has been the dominant path chosen. Advertising is the villain only to the extent that it puts able resources into effectively capturing our attention. But we do not check our email so often or play video games so long due to advertising. There is definitely some behavioral conditioning going on. Wu mentions that video games can even: “induce a ‘flow state’, that form of contentment, of optimal experience, described by the cognitive scientist Mihaly Csikszentmihalyi, in which people feel ‘strong, alert, in effortless control, unselfconscious, and at the peak of their abilities.’”
In Confidence, Part II, the authors conclude that confidence is computed continuously, online, throughout the decision-making process, thus lending support to models of the mind as a device that computes with probabilistic estimates and probability distributions.
The Embodied Mind
One such explanation is that of predictive processing/embodied mind. Andy Clark, Jakob Hohwy, and Karl Friston have all helped to weave together this concept. Our minds are blends of top-down and bottom-up processing, where error messages and the effort to fix those errors make it possible for us to engage the world. According to the embodied mind model, our minds do not just reside in our heads. Our bodies determine how we interact with the world and how we shape our world so that we can predict better. Our evolutionary limitations have much to do with how our minds work. One example provided by Andy Clark and Barbara Webb is a robot without any brain imitating human walking nearly perfectly (video, go to 2:40). Now how does this tie into confidence? Confidence at a conscious level is the extent of our belief that our decisions are correct. But the same thing is going on as a fundamental part of perception and action. Estimating the certainty of our own prediction error signals, and of our own mental states and processes, is, as Clark notes: “clearly a delicate and tricky business. For it is the prediction error signal that…gets to ‘carry the news’.”
Now the confidence heuristic is not the only thing Trump takes advantage of, but we will leave those for another time. I will also avoid the question of whether or not Trump is actually confident. So what is the relationship between confidence and decision making? Daniel Kahneman, in Thinking, Fast and Slow, on page 13 describes:
a puzzling limitation of our mind: our excessive confidence in what we believe we know, and our apparent inability to acknowledge the full extent of our ignorance and the uncertainty of the world we live in. We are prone to overestimate how much we understand about the world and to underestimate the role of chance in events. Overconfidence is fed by the illusory certainty of hindsight.