This post is based on the book Elastic: Flexible Thinking in a Time of Change by Leonard Mlodinow (Pantheon Books, New York, 2018). Mlodinow is a physicist who worked with Stephen Hawking. His previous book, Subliminal, evidently gave him considerable access to interesting people like Seth MacFarlane. He mentions that Hawking's pace of communicating was at best six words a minute, with public presentations prepared ahead of time. Mlodinow notes that this slowing of the pace of a conversation is actually quite helpful: it forces you to consider the words rather than composing an instant response while the other person is still talking.
Every couple of years, I seem to go back and look at the “decision making” books that have arrived in my local library. I clearly take a broad view of decision making. This time I came up with Farsighted, Elastic, and The Mind Is Flat. The first two were clearly written as popular books; the third less so. They share quite a bit. All three use illustrations or questionnaires to show the peculiarities and shortcomings of our minds, and all three draw on literature to explain how our minds work. Farsighted uses George Eliot and Middlemarch. Elastic uses Jonathan Franzen and mentions his novel The Corrections. The Mind Is Flat uses Leo Tolstoy and Anna Karenina.
This post is inspired by the book Rebooting AI: Building Artificial Intelligence We Can Trust, written by Gary Marcus and Ernest Davis (New York, 2019). Gary Marcus (see post Kluge) is a well known author and artificial intelligence entrepreneur, and Ernest Davis is a professor of computer science at New York University. To oversimplify, the authors emphasize that the successes of AI are narrow and tend to be greedy, opaque, and brittle. They recount a history of AI that has seemed on the verge of being ready for prime time decade after decade after decade. Self-driving cars are almost there, but they are not. Human frailties in driving result in a death about every 100,000,000 miles driven, but Marcus and Davis indicate that self-driving cars require human intervention about every 10,000 miles, which works out to 10,000 interventions in 100,000,000 miles. It may be a very long time before we are ready to sign off on self-driving cars, because the progress thus far has been the easy part.
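The disparity is easy to verify with back-of-envelope arithmetic; the figures are the ones quoted from Marcus and Davis, not independent data.

```python
# Back-of-envelope check of the figures quoted above: one human-driving
# fatality per 100,000,000 miles versus one self-driving intervention
# per 10,000 miles.
human_fatality_interval = 100_000_000   # miles per fatality (human drivers)
intervention_interval = 10_000          # miles per required human intervention

# How many interventions occur over the distance in which human driving
# produces a single fatality?
interventions = human_fatality_interval // intervention_interval
print(interventions)  # 10000
```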
I seem to have a continuing fascination with David Brooks. The New York Times columnist surprises me in both positive and negative ways; I only mention it when the surprise is negative. Below is an excerpt from his November 25, 2016, column.
And this is my problem with the cognitive sciences and the advice world generally. It’s built on the premise that we are chess masters who make decisions, for good or ill. But when it comes to the really major things we mostly follow our noses. What seems interesting, beautiful, curious and addicting?
Have you ever known anybody to turn away from anything they found compulsively engaging?
We don’t decide about life; we’re captured by life. In the major spheres, decision-making, when it happens at all, is downstream from curiosity and engagement. If we really want to understand and shape behavior, maybe we should look less at decision-making and more at curiosity. Why are you interested in the things you are interested in? Why are some people zealously seized, manically attentive and compulsively engaged?
Now that we know a bit more about decision-making, maybe the next frontier is desire. Maybe the next Kahneman and Tversky will help us understand what explains, fires and orders our loves.
I can imagine his frustration with the advice world and maybe with Kahneman and Tversky (see post Prospect Theory), but it appears that Brooks is looking only at the advice world. Brooks would benefit from the work of Ken Hammond. The post Cognitive Continuum examines some of Hammond's 1980 work. Hammond places the chess masters to whom Brooks refers at one extreme of the cognitive continuum. The post Intuition in J-DM looks at the work of Tilmann Betsch and Andreas Glöckner on what is called Parallel Constraint Satisfaction theory.
Betsch and Glöckner believe that information integration and output formation (choice, preference) are intuitive. Analysis involves directed search (looking for valid cues or asking an expert for advice), making sense of information, anticipating future events, and so on. Thus, they see a judgment as a collaboration of intuition and analysis. The depth of analysis varies, but intuition is always working, so preferences are formed even without intention. Limiting processing time and capacity constrains only input. Once information is in the system, intuition will use it irrespective of amount and capacity.
Curiosity might be considered the degree of dissonance we encounter in our automatic decision making that in effect tells us to analyze: find more information and examine it. We do mostly follow our noses, because doing so is adaptive. But it is also adaptive to recognize change that is persistent and must be responded to. One parameter of the parallel constraint satisfaction model is an individual's sensitivity to differences between cue validities. This implies that individuals respond differently to changing cue validities: some change quickly when they perceive differences, and others change at a glacial pace.
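A minimal sketch of what such a sensitivity parameter could look like, assuming a simple power-law transformation of cue validities into weights (an illustration, not the published parameterization):

```python
# Illustrative sketch: a single sensitivity parameter p turns cue
# validities (scaled so that 0.5 = chance) into weights. A high p
# exaggerates validity differences; a low p flattens them, so the same
# environmental change moves one decision maker a lot and another
# hardly at all. The validities below are invented.
def cue_weight(validity: float, p: float) -> float:
    return (validity - 0.5) ** p

validities = [0.9, 0.7, 0.6]
sensitive = [cue_weight(v, 2.0) for v in validities]    # high-sensitivity individual
insensitive = [cue_weight(v, 0.5) for v in validities]  # low-sensitivity individual

# Ratio of best to worst cue weight: large for the sensitive individual
# (roughly 16), modest for the insensitive one (roughly 2).
print(sensitive[0] / sensitive[2])
print(insensitive[0] / insensitive[2])
```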
The post Rationality Defined Again: RUN & JUMP looks at the work of Tilmann Betsch and Carsten Held. Brooks in his opinion piece seems to be suggesting that analytic processing is pretty worthless. Betsch and Held have seen this before. They note that research on non-analytic processing has led some authors to conclude that intuition is superior to analysis, or at least to promote it as such, the obvious example being Malcolm Gladwell in Blink. Such a notion, however, neglects the important role of decision context. The advantages and disadvantages of the different types of thought depend on the nature of the task. Moreover, the plea for a general superiority of intuition neglects the fact that analysis is capable of things that intuition is not. Consider, for example, the case of routine maintenance and deviation decisions. Routine decisions will lead to good results if prior experiences are representative of the task at hand. In a changing world, however, routines can become obsolete.
In the absence of analytic thought, adapting to changing contexts requires slow, repetitive learning. Upon encountering repeated failure, the individual's behavioral tendencies will change. The virtue of deliberate analysis, Brooks' chess mastering, lies in its power to adapt quickly to new situations without slow reinforcement learning. Whereas intuition is fast and holistic thanks to parallel processing, it is a slave to the pre-formed structure of knowledge as well as the representation of the decision problem. The relations among goals, situations, options, and outcomes that result from prior knowledge provide the structural constraints under which intuitive processes operate. Those processes can work very efficiently but, nevertheless, cannot change the constraints. The potential of analytic thought dwells in its power to change the structure of the representation of a decision problem.
I believe that Brooks realizes that analytic thought is one thing that distinguishes us from other creatures, even though it does not seem to inform much of our decision making. The post Embodied (Grounded) Prediction (Cognition) might also open a window for Brooks.
This post tries to do a little tying together on a familiar subject. I look at a couple of papers that provide more perspective than typical research papers do. First is the preliminary dissertation of Anke Söllner. She provides some educated synthesis, which my posts need but rarely get. Two of her papers that are also part of her dissertation are discussed in the posts Automatic Decision Making and Tool Box or Swiss Army Knife? I also look at a planned special issue of the Journal of Behavioral Decision Making addressing “Strategy Selection: A Theoretical and Methodological Challenge.”
Söllner's work is concerned with the question of which framework, multiple strategy or single strategy, best describes multi-attribute decision making. In multi-attribute decision making we have to choose among two or more options. Cues can be consulted, and each cue has some validity in reference to the decision criterion. If the criterion is an objective one (e.g., the quantity of oil), the task is referred to as probabilistic inference, whereas a subjective criterion (e.g., preference for a day trip) characterizes a preferential choice task. The multiple strategy framework is most notably the adaptive toolbox that includes fast and frugal heuristics as individual strategies. Single strategy frameworks assume that instead of selecting one from several distinct decision strategies, decision makers employ the same uniform decision making mechanism in every situation. The single strategy frameworks include the evidence accumulation model and the connectionist parallel constraint satisfaction model.
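To make the contrast concrete, here is a minimal sketch, with invented cue values and validities, of one heuristic from the adaptive toolbox (take-the-best) against a weighted-additive rule standing in for a uniform compensatory mechanism. On the same data they can disagree, because take-the-best is non-compensatory while the additive rule lets weaker cues outvote the best one.

```python
# Invented example: two options described by three binary cues,
# with cue validities ordered from most to least valid.
validities = [0.9, 0.8, 0.6]
option_a = [1, 0, 0]   # only the best cue favors A
option_b = [0, 1, 1]   # the two weaker cues favor B

def take_the_best(a, b):
    # Adaptive-toolbox heuristic: consult cues in order of validity
    # and decide on the first cue that discriminates.
    for ca, cb in zip(a, b):
        if ca != cb:
            return "A" if ca > cb else "B"
    return "tie"

def weighted_additive(a, b, w):
    # Uniform compensatory mechanism: integrate all cues at once.
    score_a = sum(c * wi for c, wi in zip(a, w))
    score_b = sum(c * wi for c, wi in zip(b, w))
    return "A" if score_a > score_b else ("B" if score_b > score_a else "tie")

print(take_the_best(option_a, option_b))                  # "A": best cue wins outright
print(weighted_additive(option_a, option_b, validities))  # "B": 0.8 + 0.6 > 0.9
```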
This post is based on the paper “Single-process versus multiple-strategy models of decision making: Evidence from an information intrusion paradigm,” written by Anke Söllner, Arndt Bröder, Andreas Glöckner, and Tilmann Betsch. It appeared in Acta Psychologica in January 2014. It is a well done overview of multi-attribute decision models (multi-attribute decision making deals with preferential choice, e.g., “Which dessert do you like better?”, and probabilistic inference, e.g., “Which dessert contains more calories?”), along with clever experiments. I am puzzled that the contrast is single process versus multiple strategy; I would have expected process versus process, or strategy versus strategy.
This appears to me to be another polite skirmish in the continuing battle between fast and frugal heuristics and compensatory connectionist models. Do we change strategies, or do we adjust decision thresholds or weights? However, the researchers have moved back to broader frameworks to get a different way to study and attack. This paper has an interesting group of authors. Söllner and Bröder wrote a paper last year that looked at similar issues but focused on the importance of looking separately at how information is acquired and how it is integrated. Glöckner and Betsch are prime proponents of parallel constraint satisfaction theory, a single process model that apparently is weak on information acquisition. I expect a Gigerenzer or Marewski countermove soon for the fast and frugal heuristics side. I should note that there seems to be much respect among those with differing views, and the idea that probably everyone is a little bit wrong and a little bit right seems to pervade.
The idea of bidirectional reasoning seems to have really gotten going with a 1999 paper entitled “Bidirectional Reasoning in Decision Making by Constraint Satisfaction,” written by Keith J. Holyoak and Dan Simon and published in the Journal of Experimental Psychology, 1999, Vol. 128, No. 1, pp. 3–31.
This post looks at the paper “Do people learn option or strategy routines in multi-attribute decisions? The answer depends on subtle factors,” authored by Arndt Bröder, Andreas Glöckner, Tilmann Betsch, Daniela Link, and Florence Ettlin (Acta Psychologica 143 (2013) 200–209). The researchers note that in their classic book on Einstellung effects, Luchins and Luchins (1959) demonstrated the robustness of maladaptive routinization in problem solving strategies. A specific strategy that had been successful over several trials continued to be used after changes in the environment made simpler solutions available. Routinization even prevented many participants from finding simple solutions to new problems in which the routinized strategy could not be used. Hence, routinization may be beneficial in a stable task environment, but it may become detrimental in a changing world. It has been demonstrated that even experts fall prey to Einstellung effects, although their magnitude is reduced for top experts.
This post discusses a paper entitled “Rational decision making: balancing RUN and JUMP modes of analysis,” which appeared in Mind & Society (2012) 11:69–80. It was written by Tilmann Betsch and Carsten Held. I enjoy Tilmann Betsch's work, and this paper combines several ideas in an understandable form. Betsch, along with Andreas Glöckner, has done much to promote the ability of intuition (Intuition in J/DM), but this paper defines rationality as the appropriate balance between the RUN and JUMP modes of analysis.
According to Glöckner and Betsch, deliberate constructions (DCs) are the opportunity for the deliberate/analytical system to provide input into decision making. The Parallel Constraint Satisfaction rule holistically considers the information contained in a network. The network consists of all pieces of information that comprise the decision problem (cues, goals, options, evaluations, etc.). In many mundane situations, constituting the network does not require any active information search: relevant features of the environment and currently activated memory entries provide the input. Glöckner and Betsch refer to the network installed spontaneously when encountering a decision situation as the primary network. Deliberate processes are activated if the consistency of the resulting mental representation falls below a threshold θ. I find this interesting because it is the intuitive/automatic system demanding coherence of itself. Typically, we think of the deliberate/analytical system as being rational and coherent, while the intuitive system requires correspondence. Here, the intuitive/automatic system finds answers that are inconsistent and incoherent, and thus seeks input from the deliberate/analytical system.
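The mechanism can be sketched in a few lines of code. This is a toy illustration of the general idea only: the network values, the update rule, the decay constant, and the threshold θ below are all invented for the example, not the published parameterization.

```python
# Toy Parallel Constraint Satisfaction sketch: node activations spread
# over a symmetric weighted network until they stabilize, and a
# consistency measure decides whether the settled representation is
# coherent enough, or whether deliberate processing should be triggered.
def settle(weights, n_iter=200, decay=0.05):
    n = len(weights)
    act = [0.1] * n                      # small initial activations
    for _ in range(n_iter):
        new = []
        for i in range(n):
            # Weighted input from all other nodes, synchronous update,
            # activations clipped to [-1, 1].
            inp = sum(weights[i][j] * act[j] for j in range(n) if j != i)
            new.append(max(-1.0, min(1.0, (1 - decay) * act[i] + inp)))
        act = new
    return act

def consistency(weights, act):
    # Sum of w_ij * a_i * a_j over pairs: high when connected nodes
    # agree in sign, low (or negative) when constraints are violated.
    n = len(weights)
    return sum(weights[i][j] * act[i] * act[j]
               for i in range(n) for j in range(n) if i < j)

# Two mildly competing options (nodes 0 and 1), each supported by one
# cue (nodes 2 and 3). All weights are illustrative.
W = [[0.0, -0.2, 0.3, 0.0],
     [-0.2, 0.0, 0.0, 0.3],
     [0.3, 0.0, 0.0, 0.0],
     [0.0, 0.3, 0.0, 0.0]]
act = settle(W)
theta = 0.1                              # assumed consistency threshold
deliberate = consistency(W, act) < theta # trigger analysis if incoherent
```

Here the settled network is coherent enough that `deliberate` comes out `False`; a network with stronger mutual inhibition or contradictory cues would fall below θ and hand the problem to the analytic system.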