Category Archives: Parallel constraint satisfaction

Parallel Constraint Satisfaction Model updated with Learning

This post is based on a draft dated July 10, 2015, “Learning in Dynamic Probabilistic Environments: A Parallel-constraint Satisfaction Network-model Approach,” written by Marc Jekel, Andreas Glöckner, & Arndt Bröder. The paper includes experiments that contrast Parallel Constraint Satisfaction with the Adaptive Toolbox Approach. I have chosen to look only at the update of the PCS model with learning. The authors develop an integrative model for decision making and learning by extending previous work on parallel constraint satisfaction networks with a backward error-propagation learning algorithm. The Parallel Constraint Satisfaction Theory for Decision Making and Learning (PCS-DM-L) conceptualizes decision making as a process of coherence structuring in which learning is achieved by adjusting network weights from one decision to the next. PCS-DM-L predicts that individuals adapt to the environment through gradual changes in cue weighting.
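To make the mechanics concrete, here is a minimal Python sketch of the general idea: a small constraint network settles into a coherent interpretation of the cues for one choice, and cue validities are nudged after feedback. The network layout, parameter values, and the simple delta-rule-style update (standing in for the paper’s backward error-propagation scheme) are my illustrative assumptions, not the authors’ specification.

```python
import numpy as np

def pcs_decide(cue_signs, validities, decay=0.05, inhibition=-0.2,
               floor=-1.0, ceiling=1.0, n_iter=200):
    """One simplified parallel-constraint-satisfaction run.

    cue_signs: (n_cues, n_options) array of +1 / -1 / 0, i.e. whether a cue
               speaks for, against, or is silent about each option.
    validities: (n_cues,) subjective cue validities.
    Returns the final activation of each option node (higher = more coherent).
    """
    n_cues, n_options = cue_signs.shape
    n = 1 + n_cues + n_options              # node 0 is a general "source" node
    w = np.zeros((n, n))

    def link(i, j, weight):                 # all links are bidirectional
        w[i, j] = w[j, i] = weight

    for c in range(n_cues):
        link(0, 1 + c, validities[c])                            # source -> cue
        for o in range(n_options):
            link(1 + c, 1 + n_cues + o, 0.01 * cue_signs[c, o])  # cue -> option
    for o1 in range(n_options):
        for o2 in range(o1 + 1, n_options):
            link(1 + n_cues + o1, 1 + n_cues + o2, inhibition)   # options compete

    a = np.zeros(n)
    a[0] = 1.0                                                   # source clamped on
    for _ in range(n_iter):
        net = w @ a
        a = np.where(net >= 0,
                     a * (1 - decay) + net * (ceiling - a),
                     a * (1 - decay) + net * (a - floor))
        a[0] = 1.0
        a = np.clip(a, floor, ceiling)
    return a[1 + n_cues:]

def learn_from_feedback(validities, cue_signs, correct, lr=0.05):
    """Toy learning step: cues that pointed at the option revealed to be
    correct gain validity, cues that pointed away lose some."""
    return np.clip(validities + lr * cue_signs[:, correct], 0.0, 1.0)

# Two cues, two options: cue 0 favors option A, cue 1 favors option B.
signs = np.array([[ 1, -1],
                  [-1,  1]])
v = np.array([0.8, 0.6])
print(pcs_decide(signs, v))                    # option A ends with the higher activation
v = learn_from_feedback(v, signs, correct=1)   # feedback: B was actually right
```

The point of the sketch is only the shape of the claim: deciding is the network settling into coherence, and learning is the slow drift of the cue weights from one settled decision to the next.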

Continue reading

David Brooks- Innocent of Cognitive Continuum

David Brooks seems to be a fascination of mine. The New York Times columnist surprises me both in positive and negative ways. I only mention it when the surprise is negative. Below is an excerpt from his November 25, 2016, column.

And this is my problem with the cognitive sciences and the advice world generally. It’s built on the premise that we are chess masters who make decisions, for good or ill. But when it comes to the really major things we mostly follow our noses. What seems interesting, beautiful, curious and addicting?

Have you ever known anybody to turn away from anything they found compulsively engaging?

We don’t decide about life; we’re captured by life. In the major spheres, decision-making, when it happens at all, is downstream from curiosity and engagement. If we really want to understand and shape behavior, maybe we should look less at decision-making and more at curiosity. Why are you interested in the things you are interested in? Why are some people zealously seized, manically attentive and compulsively engaged?

Now that we know a bit more about decision-making, maybe the next frontier is desire. Maybe the next Kahneman and Tversky will help us understand what explains, fires and orders our loves.

I can imagine his frustration with the advice world and maybe with Kahneman and Tversky (see post Prospect Theory), but it appears that Brooks is only looking at the advice world. Brooks would benefit from looking at the work of Ken Hammond. The post Cognitive Continuum examines some of Hammond’s 1980 work. Hammond places those chess masters to whom Brooks refers at one extreme of the cognitive continuum. The post Intuition in J-DM looks at the work of Tilmann Betsch and Andreas Glöckner in what is called Parallel Constraint Satisfaction theory.

Betsch and Glöckner believe that information integration and output formation (choice, preference) are intuitive. Analysis involves directed search (looking for valid cues or asking an expert for advice), making sense of information, anticipating future events, and so on. Thus, they see a judgment as a collaboration of intuition and analysis. The depth of analysis varies, but intuition is always working, so preferences are formed even without intention. Limiting processing time and capacity constrains only input. Thus, once information is in the system, intuition will use that information irrespective of amount and capacity.

Curiosity might be considered the degree of dissonance we encounter in our automatic decision making that in effect tells us to analyze: find more information and examine it. We do mostly follow our noses, because it is adaptive. But it is also adaptive to be able to recognize change that is persistent and must be responded to. A parameter of the parallel constraint satisfaction model is the individual's sensitivity to differences between cue validities. This implies that individuals respond differently to changing cue validities. Some change quickly when they perceive differences, and others change at a glacial pace.
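To make that sensitivity parameter concrete, here is a tiny Python sketch. The (validity − 0.5) ** p transformation is the form I have seen associated with the model's sensitivity parameter, but the exact formula and the numbers below should be read as illustrative assumptions rather than the published parameterization.

```python
def cue_weights(validities, p):
    """Turn cue validities into link weights.  Higher p exaggerates the
    differences between validities; lower p flattens them, so all valid
    cues end up treated roughly alike."""
    return [round((v - 0.5) ** p, 3) for v in validities]

validities = [0.9, 0.7, 0.6]
print(cue_weights(validities, p=1.0))   # [0.4, 0.2, 0.1]       differences preserved
print(cue_weights(validities, p=2.0))   # [0.16, 0.04, 0.01]    the best cue dominates
print(cue_weights(validities, p=0.5))   # [0.632, 0.447, 0.316] differences flattened
```

On this reading, the fast changers are the people whose effective weights amplify a shift in validities, while the glacial changers barely register it.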

The post Rationality Defined Again: RUN & JUMP looks at the work of Tilmann Betsch and Carsten Held. Brooks in his opinion piece seems to be suggesting that analytic processing is pretty worthless. Betsch and Held have seen this before. They note that research on non-analytic processing has led some authors to conclude that intuition is superior to analysis, or at least to promote it as such, with the obvious example being Malcolm Gladwell in Blink. Such a notion, however, neglects the important role of decision context. The advantages and disadvantages of the different types of thought depend on the nature of the task. Moreover, the plea for a general superiority of intuition neglects the fact that analysis is capable of things that intuition is not. Consider, for example, the case of routine maintenance and deviation decisions. Routine decisions will lead to good results if prior experiences are representative of the task at hand. In a changing world, however, routines can become obsolete.

In the absence of analytic thought, adapting to changing contexts requires slow, repetitive learning. Upon encountering repeated failure, the individual's behavioral tendencies will change. The virtue of deliberate analysis, Brooks' chess mastering, lies in its power to adapt quickly to new situations without requiring slow reinforcement learning. Whereas intuition is fast and holistic due to parallel processing, it is a slave to the pre-formed structure of knowledge as well as to the representation of the decision problem. The relations among goals, situations, options and outcomes that result from prior knowledge provide the structural constraints under which intuitive processes operate. They can work very efficiently but, nevertheless, cannot change these constraints. The potential of analytic thought dwells in the power to change the structure of the representation of a decision problem.

I believe that Brooks realizes that analytic thought is one thing that distinguishes us from other creatures even though it does not seem to inform much of our decision making. The post Embodied(Grounded) prediction(cognition) might also open a window for Brooks.

Single Strategy Framework and the Process of Changing Weights


This post starts from the conclusion of the previous post that the evidence supports a single strategy framework, looks at Julian Marewski's criticism, and then piles on with ideas on how weights can be changed in a single strategy framework.

Marewski provided a paper for the special issue of the Journal of Applied Research in Memory and Cognition (2015) on “Modeling and Aiding Intuition in Organizational Decision Making”: “Unveiling the Lady in Black: Modeling and Aiding Intuition,” authored by Ulrich Hoffrage and Julian N. Marewski. The paper gives the parallel constraint satisfaction model a not-so-subtle knock:

By exaggerating and simplifying features or traits, caricatures can aid perceiving the real thing. In reality, both magic costumes and chastity belts are degrees on a continuum. In fact, many theories are neither solely formal or verbal. Glöckner and Betsch’s connectionist model of intuitive decision making, for instance, explicitly rests on both math and verbal assumptions. Indeed, on its own, theorizing at formal or informal levels is neither “good” nor “bad”. Clearly, both levels of description have their own merits and, actually, also their own problems. Both can be interesting, informative, and insightful – like the work presented in the first three papers of this special issue, which we hope you enjoy as much as we do. And both can border re-description and tautology. This can happen when a theory does not attempt to model processes. Examples are mathematical equations with free parameters that carry no explanatory value, but that are given quasi-psychological, marketable labels (e.g., “risk aversion”).

Continue reading

Strategy Selection — Single or Multiple?

This post tries to do a little tying together on a familiar subject. I look at a couple of papers that provide more perspective than typical research papers provide. First is the preliminary dissertation of Anke Söllner. She provides some educated synthesis which my posts need, but rarely get. Two of her papers, which are also part of her dissertation, are discussed in the posts Automatic Decision Making and Tool Box or Swiss Army Knife? I also look at a planned special issue of the Journal of Behavioral Decision Making to address “Strategy Selection: A Theoretical and Methodological Challenge.”

Söllner's work is concerned with the question: which framework, multiple strategy or single strategy, describes multi-attribute decision making best? In multi-attribute decision making we have to choose among two or more options. Cues can be consulted, and each cue has some validity in reference to the decision criterion. If the criterion is an objective one (e.g., the quantity of oil), the task is referred to as probabilistic inference, whereas a subjective criterion (e.g., preference for a day trip) characterizes a preferential choice task. The multiple strategy framework is most notably the adaptive toolbox that includes fast and frugal heuristics as individual strategies. Single strategy frameworks assume that instead of selecting one from several distinct decision strategies, decision makers employ the same uniform decision making mechanism in every situation. The single strategy frameworks include the evidence accumulation model and the connectionist parallel constraint satisfaction model.
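As a toy illustration of the contrast (and nothing more), here is a Python sketch in which an adaptive-toolbox-style heuristic and a single compensatory mechanism disagree on the same cue pattern. Weighted-additive integration stands in here for the single-strategy side; PCS or evidence accumulation would integrate the cues in a similarly compensatory spirit, with different dynamics. The cue values and validities are made up.

```python
import numpy as np

def take_the_best(cues, validities):
    """Toolbox-style heuristic: inspect cues in order of validity and decide
    on the first cue that discriminates between the two options."""
    for c in np.argsort(validities)[::-1]:
        if cues[c, 0] != cues[c, 1]:
            return 0 if cues[c, 0] > cues[c, 1] else 1
    return None  # no cue discriminates: guess

def weighted_additive(cues, validities):
    """Single-mechanism stand-in: integrate every cue, weighted by validity,
    and pick the option with the higher total."""
    return int(np.argmax(validities @ cues))

# Two options, three binary cues (1 = cue speaks for the option, 0 = not).
cues = np.array([[1, 0],     # the most valid cue favors option 0
                 [0, 1],     # two slightly less valid cues favor option 1
                 [0, 1]])
validities = np.array([0.8, 0.7, 0.7])

print(take_the_best(cues, validities))      # 0 - only the best cue is consulted
print(weighted_additive(cues, validities))  # 1 - the weaker cues compensate
```

The empirical question in Söllner's work is, roughly, which of these two pictures better describes what people actually do across tasks, not which one wins on a hand-picked example.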

Continue reading

Coherence from the default mode network

This post starts with the paper “Brains striving for coherence: Long-term cumulative plot formation in the default mode network,” authored by K. Tylén, P. Christensen, A. Roepstorff, T. Lund, S. Østergaard, and M. Donald. The paper appeared in NeuroImage 121 (2015) 106–114.

People are capable of navigating and keeping track of all the parallel social activities of everyday life even when confronted with interruptions or changes in the environment. Tylén et al. suggest that even though these situations present themselves in series of interrupted segments often scattered over huge time periods, they tend to constitute perfectly well-formed and coherent experiences in conscious memory. However, the underlying mechanisms of such long-term integration are not well understood. While brain activity is generally traceable within the short time frame of working memory, these integrative processes last for minutes, hours or even days.

Continue reading

Big Models

After three years of pushing out other people's ideas on judgement and decision making, I can, at this moment, recall three huge ideas.

I continually look for comment on and expansion of these ideas, and I often do this in the laziest of ways: I google them. Recently I seemed to find the last two mentioned on the same page of a philosophy book. That was not actually true, but it did remind me of similarities that I could point out. The idea of a compensatory process where one changes his belief a little to match the current set of “facts” tracks well with the idea that we can get predictions correct by moving our hand to catch the ball so that it does not have to be thrown perfectly. Both clearly try to match up the environment and ourselves. The Parallel Constraint Satisfaction model minimizes dissonance while the Free Energy model minimizes surprise. Both dissonance and surprise can create instability. The Free Energy model is more universal than the Parallel Constraint Satisfaction model, while for decision making PCS is more precise. The Free Energy model also gives us the idea that heuristic models could fit within process models. All this points out what is obvious to us all. We need the right model for the right job.
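The parallel between the two models can be stated as two quantities to be driven down. Here is a minimal Python sketch: dissonance as the standard Hopfield-style energy of a constraint network, surprise as the information-theoretic surprise of an observation. The specific energy form and the toy numbers are illustrative assumptions, not either model's full formulation.

```python
import numpy as np

def dissonance(w, a):
    """Hopfield-style energy of a constraint network: low when linked nodes
    carry consistent (same-sign) activations.  PCS-style settling drives
    this quantity down."""
    return -0.5 * a @ w @ a

def surprise(p_observation):
    """Information-theoretic surprise of what actually happened.  Free-energy
    accounts cast perception and action as keeping a bound on this low."""
    return -np.log(p_observation)

w = np.array([[0.0, 1.0],
              [1.0, 0.0]])                      # two mutually supporting beliefs
print(dissonance(w, np.array([1.0, -1.0])))     #  1.0  inconsistent state
print(dissonance(w, np.array([1.0,  1.0])))     # -1.0  coherent state
print(surprise(0.9), surprise(0.1))             # expected vs. surprising observation
```

It is the same family of move in both cases: settle the system, or act on the world, until the offending quantity is as small as it can be.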

Continue reading

The Mixed Instrumental Controller

This is more or less a continuation of the previous post based on Andy Clark's “Embodied Prediction,” in T. Metzinger & J. M. Windt (Eds). Open MIND: 7(T). Frankfurt am Main: MIND Group (2015). It further weighs in on the issue of changing strategies or changing weights (see post Revisiting Swiss Army Knife or Adaptive Tool Box). Clark has brought to my attention the terms model-free and model-based, which seem roughly to equate to intuition/system 1 and analysis/system 2, respectively. With this translation, I am helped in trying to tie this into ideas like cognitive niches and parallel constraint satisfaction. Clark in a footnote:

Current thinking about switching between model-free and model-based strategies places them squarely in the context of hierarchical inference, through the use of “Bayesian parameter averaging”. This essentially associates model-free schemes with simpler (less complex) lower levels of the hierarchy that may, at times, need to be contextualized by (more complex) higher levels.
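As a concrete, deliberately generic illustration of what averaging over a simpler and a more complex scheme could look like, here is a toy Python sketch of Bayesian model averaging. The two-model setup, the likelihood numbers, and the labels are my assumptions for illustration, not anything taken from Clark's paper.

```python
import numpy as np

def update_model_posterior(prior, likelihoods):
    """Reweight the candidate schemes (a cheap "model-free" predictor vs. a
    costly "model-based" one) by how well each predicted the latest outcome."""
    posterior = prior * likelihoods
    return posterior / posterior.sum()

def averaged_prediction(posterior, predictions):
    """Posterior-weighted average of the schemes' predictions."""
    return posterior @ predictions

p = np.array([0.5, 0.5])                     # start agnostic: [model-free, model-based]
for like in ([0.2, 0.8], [0.3, 0.9]):        # outcomes the complex scheme handled better
    p = update_model_posterior(p, np.array(like))
print(p)                                     # weight has drifted toward model-based
print(averaged_prediction(p, np.array([0.0, 1.0])))
```

Even in the toy, the flavor of the footnote survives: the simple and complex schemes are not selected by a separate arbiter; their influence simply shifts with how well each is doing in the current context.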

As humans, we have been able to use language, our social skills, and our understanding of hierarchy to extend our cognition. Multiplication of large numbers is an example. We cannot remember enough numbers in our heads, so we created a way to do any multiplication on paper or its equivalent if we can learn our multiplication tables. Clark cites the example of the way that learning to perform mental arithmetic has been scaffolded, in some cultures, by the deliberate use of an abacus. Experience with patterns thus made available helps to install appreciation of many complex arithmetical operations and relations. We structure (and repeatedly re-structure) our physical and social environments in ways that make available new knowledge and skills. Prediction-hungry brains, exposed in the course of embodied action to novel patterns of sensory stimulation, may thus acquire forms of knowledge that were genuinely out of reach prior to such physical-manipulation-based re-tuning of the generative model. Action and perception thus work together to reduce prediction error against the more slowly evolving backdrop of a culturally distributed process that spawns a succession of designed environments whose impact on the development and unfolding of human thought and reason can hardly be overestimated.

Continue reading

Embodied(Grounded) prediction(cognition)


This post is based on a paper by Andy Clark: “Embodied Prediction,” in T. Metzinger & J. M. Windt (Eds). Open MIND: 7(T). Frankfurt am Main: MIND Group (2015). Andy Clark is a philosopher at the University of Edinburgh whose tastes trend toward the wild shirt. He is a very well educated philosopher in the brain sciences and a good teacher. The paper seems to put forward some major ideas for decision making even though that is not its focus. Hammond’s idea of the Cognitive Continuum is well accommodated. It also seems quite compatible with Parallel Constraint Satisfaction, but leaves room for Fast and Frugal Heuristics. It seems to provide a way to merge Parallel Constraint Satisfaction and Cognitive Niches. I do not really understand PCS well enough, but it seems potentially to add hierarchy to PCS and make it into a generative model that can introduce fresh constraint satisfaction variables and constraints as new components. If you have not read the post Prediction Machine, you should because the current post skips much background. It is also difficult to distinguish Embodied Prediction and Grounded Cognition. There are likely to be posts that follow on the same general topic.

Continue reading

Recognition heuristic and cognitive effort

Having the good fortune to be lost in Venice, I was reminded of the nuances of the recognition heuristic. My wife found the perfect antique store which was unsurprisingly closed for lunch. We went on, but a couple of hours later we tried to recreate our steps. For much of the journey, we did well having only to recognize that we had seen a particular store or archway or bridge before. Unfortunately, that broke down when we realized that we were retracing our steps in a five minute period. We still remembered that walk to the restaurant the first night, but it was unfortunately not differentiated in our minds. This was certainly an example of less knowledge being more. Eventually, using a combination of GPS and maps we found our way back to our hotel, but we never did find that antique store. And I was trying.

This post is based on a paper by Benjamin Hilbig, Martha Michalkiewicz, Marta Castela, Rüdiger Pohl and Edgar Erdfelder, “Whatever the cost? Information integration in memory-based inferences depends on cognitive effort,” which was scheduled to appear in Memory and Cognition in 2014. Fundamental uncertainty is a not uncommon situation for our decision making. There is an ongoing argument between the fast and frugal heuristics toolbox approach and the single-tool approaches of evidence accumulation and parallel constraint satisfaction. However, that argument depends on the particular task, the type of task, and on and on. I am still waiting for a giant table that puts all those things together.
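For anyone who has not met it, the recognition heuristic itself is almost trivial to write down; here is a minimal Python sketch. The Hilbig et al. question is what happens beyond this, namely whether and when further knowledge gets integrated in memory-based inferences; the fallback branch below is where that integration would have to live. The setup is my illustration, not the paper's experimental task.

```python
import random

def recognition_heuristic(recognized_a, recognized_b):
    """Classic recognition heuristic for a two-option inference: if exactly
    one option is recognized, infer that it scores higher on the criterion.
    When both or neither are recognized, the heuristic is silent and we
    fall back on guessing (or, in richer accounts, on further knowledge)."""
    if recognized_a and not recognized_b:
        return "A"
    if recognized_b and not recognized_a:
        return "B"
    return random.choice(["A", "B"])

print(recognition_heuristic(True, False))   # "A": recognition alone decides
```

In the Venice story above, the heuristic broke down not because recognition was absent but because everything was recognized, which is exactly the regime where recognition alone cannot discriminate.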

Continue reading

The Fog of the Blog-Parameter P or K?

This blog has existed for two years. It includes nearly 150 posts. I should note that I remain the only one to have set eyes on a few of them. My most popular post, What has Brunswik's Lens Taught?, has only 625 views. Clearly, the blog is all about me. My inability to recall what I have written in the past became obvious even to me recently. My post Parameter P - Slowness Factor? (6 views) is about a month old. My post Parallel Constraint Satisfaction - Plugging in Some of the Numbers (10 views) is a little more than a year old. While writing the Parameter P post, the older post did not cross, enter, or even come close to my mind. Interestingly, they seem to concern exactly the same thing, but at different stages. This post will look at that. As for future posts, maybe it is time to look at the existing posts as source material and try to integrate the most significant ideas.

Continue reading