Acta Psychologica

Volume 139, Issue 3, March 2012, Pages 532-542

Decisions beyond boundaries: When more information is processed faster than less

https://doi.org/10.1016/j.actpsy.2012.01.009

Abstract

Bounded rationality models usually converge in claiming that decision time and the number of computational steps needed to come to a decision are positively correlated. The empirical evidence for this claim is, however, equivocal. We conducted a study that tests this claim by adding and omitting information. We demonstrate that even an increase in the amount of information can yield a decrease in decision time if the added information increases coherence in the information set. Rather than being influenced by the amount of information, decision time systematically increased with decreasing coherence. The results are discussed with reference to a parallel constraint satisfaction approach to decision making, which assumes that information integration operates in an automatic, holistic manner.

Highlights

► We model decision making with Parallel Constraint Satisfaction Network models.
► We contrast it with a classic bounded rationality perspective.
► The effect of dropping low-valid information from decision tasks is investigated.
► Dropping information increases decision time if it decreases coherence.
► In such situations more information is processed faster than less.

Introduction

In western religion and philosophy, decision making is generally considered the supreme discipline of conscious thought. Free will is exclusively attributed to human beings, manifesting itself in the capability of making choices upon anticipating and weighing the consequences of the alternatives. Common language reflects this denotation by defining a decision as a “determination arrived at after consideration” (Merriam-Webster, online). This notion is maintained in the maximization principle of expected utility theory. Accordingly, a rational decision maker should identify the entire set of eligible options, calculate each option's expected utility (EU) and select the one with the highest EU (Savage, 1954, von Neumann and Morgenstern, 1944). Expected utility models do not claim that people indeed calculate weighted sums, but only that their choices can be predicted by such a model (Luce, 2000, Luce and Raiffa, 1957). Simon (1955; see also Veblen, 1898) drew attention to the decision process. He questioned the assumption that people rely on deliberate calculations of weighted sums in decisions, because the limitations of cognitive capacity and the multitude of decision options do not allow them to do so.

Essentially two alternative process model approaches have been suggested. The first approach is based on the idea of adaptive strategy selection. People might use effortful weighted sum calculations only in some situations (Beach and Mitchell, 1978, Gigerenzer and Todd, 1999, Payne et al., 1988, Payne et al., 1993). In other situations, for instance under time pressure, they might rely on short-cut strategies, which consist of stepwise cognitive operations that are (usually) carried out deliberately (cf. Payne et al., 1988; although they also consider a possible implementation as production rules). The considered strategies are usually well specified on a process level; their cognitive costs are measured as the number of necessary calculations for applying the respective strategy, called elementary information processes (EIPs; Newell and Simon, 1972, Payne et al., 1988). Consequently, all adaptive strategy selection approaches converge in assuming that the cognitive effort and the necessary time for a decision increase with the number of processing steps (i.e., EIPs) required by the decision strategy (see also Brandstätter et al., 2006, Bröder and Gaissmaier, 2007, for examples related to the adaptive toolbox approach).1 Hence, response times should increase with an increasing amount of information to be processed.

The second approach suggests that people utilize strategies that partially rely on automatic processes (for overviews see Evans, 2008, Gilovich et al., 2002, Glöckner and Witteman, 2010), thereby using the huge computational and storage power of the brain to overcome the obvious limitations of conscious cognitive capacity. Automatic information structuring processes, for instance, are activated in visual perception (McClelland & Rumelhart, 1981) and social perception (Bruner and Goodman, 1947, Read and Miller, 1998) to quickly form reasonable interpretations (i.e., Gestalten; Wertheimer, 1938), which can constitute a basis for judgments and decisions (Betsch and Glöckner, 2010, Glöckner and Betsch, 2008b). Findings indicate that people seem to rely at least partially on such automatic processes in probabilistic inference decisions (e.g., Glöckner and Betsch, 2008c, Glöckner et al., 2010, Hilbig et al., 2010, Horstmann et al., 2009, Simon et al., 2001, Simon et al., 2004) and risky choices (e.g., DeKay et al., 2009a, DeKay et al., 2009b, Glöckner and Betsch, 2008a, Glöckner and Herbold, 2011, Hilbig and Glöckner, 2011).

Several models exist that aim to describe the underlying cognitive processes. According to Glöckner and Witteman (2010), these processes can be categorized into mainly reflex-like associative mechanisms (e.g., Betsch et al., 2004, Finucane et al., 2000), more complex pattern matching mechanisms involving memory prompting (e.g., Dougherty et al., 1999, Fiedler, 1996, Juslin and Persson, 2002, Thomas et al., 2008), automaticity based evidence-accumulation mechanisms (e.g., Busemeyer and Johnson, 2004, Busemeyer and Townsend, 1993, Diederich, 2003), and constructivist mechanisms based on holistic evaluations of the evidence (e.g., Glöckner and Betsch, 2008b, Holyoak and Simon, 1999, Monroe and Read, 2008, Read et al., 1997, Thagard and Millgram, 1995). In the current work, we mainly focus on models for constructivist mechanisms. The other mechanisms are, however, briefly discussed in the final part of this paper. As we will explain in more detail below, these constructivist-automatic processes operate in a holistic fashion and can profit from more information. In contrast to the adaptive strategy selection approaches, it can be predicted that, under certain conditions, more information can be processed more quickly than less. In the study reported in this paper, we test this claim empirically.

In the remainder of the introduction, we will first explain the EIP-based perspective underlying the adaptive strategy selection approach in more detail. Then we will discuss the parallel constraint satisfaction models that can be used to computationally implement holistic processes and we will conclude with reviewing previous evidence concerning the relation between decision time and processing steps.

According to the adaptive strategy selection approach, cognitive effort is measured to quantify the costs of thinking. In this way, decision making processes are decomposed into elementary information processes (EIP, Newell & Simon, 1972). Table 1 shows some EIPs involved in solving decisions from description (Bettman, Johnson, & Payne, 1990).

For example, consider the following simple decision problem consisting of a choice between two options based on the four cues presented in Table 2. For illustration purposes, let the options be consumer products and let the cues be consumer testing institutes. The entries in the matrix are evaluations of the product on a relevant criterion dimension, say, its predicted durability, for more (+) or less (−) than 2 years. The cues differ with respect to their cue validity v, each representing the probability that a tester's prediction is correct. The decision task is to select the product with the higher durability.

Adaptive strategy selection models assume that the cognitive effort spent on a decision depends primarily on the chosen strategy. A person applying a lexicographic strategy (LEX; Fishburn, 1974; see also Gigerenzer & Goldstein, 1999), for instance, will look up cues in the order of their validity and choose the option that is better on the first differentiating cue. In the example task (Table 2), LEX would require at least 5 EIPs (i.e., 2 READ, 1 MOVE, 1 COMPARE, 1 CHOOSE) to reach a decision.
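As an illustration, here is a minimal Python sketch of such a lexicographic choice. Table 2 is not reproduced in this excerpt, so the cue pattern and validities below are hypothetical values, chosen so that the most valid cue already differentiates and the 5-EIP case just described applies.

```python
# Minimal sketch of a lexicographic (LEX) choice rule.
# Cue values are '+'/'-' ratings for (option A, option B); the validities
# are hypothetical and may be given in any order — LEX sorts them itself.
def lex_choice(cue_pattern, validities):
    order = sorted(range(len(validities)), key=lambda c: validities[c], reverse=True)
    for c in order:                            # inspect cues from most to least valid
        a, b = cue_pattern[c]                  # 2 READ (plus 1 MOVE between cells)
        if a != b:                             # 1 COMPARE
            return "A" if a == "+" else "B"    # 1 CHOOSE -> 5 EIPs when the first cue decides
    return None                                # no cue differentiates: guess or another rule

validities  = [0.80, 0.70, 0.60, 0.55]                        # hypothetical cue validities
cue_pattern = [("+", "-"), ("+", "+"), ("-", "-"), ("+", "-")]
print(lex_choice(cue_pattern, validities))                    # -> "A"
```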

Now consider a compensatory strategy that processes and integrates all information. The weighted additive (WADD) strategy, underlying utility theory, requires the individual to read all information given, weigh (PRODUCT) the outcome values ([+] = +1; [−] = −1) with validities and sum up the products for each option (ADD). Finally, the individual must COMPARE the two aggregate values of the two options in order to choose the one with the higher value. Thus, application of a WADD strategy to the example task requires at least 47 EIPs (16 READ [i.e., 2 × (4 cues and 4 cue validities)], 15 MOVE, 8 PRODUCT, 6 ADD, 1 COMPARE, 1 CHOOSE).
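For comparison, the following sketch applies the WADD computation to the same hypothetical cue pattern as above; the PRODUCT/ADD/COMPARE comments merely map code lines onto the EIP terminology.

```python
# Minimal sketch of the weighted additive (WADD) rule: code '+' as +1 and
# '-' as -1, weight each value by its cue validity, sum per option, and
# choose the option with the larger weighted sum.
def wadd_choice(cue_pattern, validities):
    scores = {"A": 0.0, "B": 0.0}
    for (a, b), v in zip(cue_pattern, validities):
        scores["A"] += v * (1 if a == "+" else -1)   # PRODUCT + ADD for option A
        scores["B"] += v * (1 if b == "+" else -1)   # PRODUCT + ADD for option B
    return max(scores, key=scores.get)               # COMPARE + CHOOSE

validities  = [0.80, 0.70, 0.60, 0.55]                        # hypothetical cue validities
cue_pattern = [("+", "-"), ("+", "+"), ("-", "-"), ("+", "-")]
print(wadd_choice(cue_pattern, validities))                   # -> "A"
```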

Hence, the LEX and WADD strategies differ with regard to the number of EIPs; the cognitive effort involved in their application should be smaller for LEX than for WADD. Of course, there are some qualifications. Research on the metric of cognitive effort has shown that: (i) different EIPs consume different amounts of cognitive effort (Lohse & Johnson, 1996); (ii) cost differences between strategies are less pronounced in binary decision tasks compared to those involving more options; and (iii) learning reduces the relative costs of a strategy (e.g., Abelson & Levi, 1985, for an overview).2 However, all other things being equal, one can surely assume that an increase in the number of EIPs used by a strategy (i.e., adding further processing steps) should never result in a decrease of cognitive effort.3 One measurable correlate of cognitive effort is decision time (cf. Lohse & Johnson, 1996). Accordingly, processing time should remain constant or increase as the number of EIPs increases. The more information we have to consider, compare and integrate, the longer it should take us to arrive at a decision.

At first glance, this claim might be considered a truism. In this paper, however, we question its general validity. Whereas the claim is most likely valid with respect to deliberative processes, it is less likely that it applies to all kinds of automatic processes. Deliberation involves slow, step-by-step consideration of information. It requires conscious control and substantially consumes cognitive resources. Automatic processes, in contrast, operate rapidly and can include a huge amount of information (e.g., Glöckner and Betsch, 2008c, Hammond et al., 1987, Hilbig et al., 2010, Hogarth, 2001). They can work without explicit control and require only a minimum of cognitive resources. Some of these automatic processes implement holistic mechanisms in that information is automatically structured to form Gestalten in a perception-like process (i.e., constructivist mechanisms; see above and Glöckner & Witteman, 2010), which we will henceforth refer to as holistic processing.

Holistic processing has been modeled using parallel constraint satisfaction (PCS) networks (for overviews see Holyoak and Spellman, 1993, Read et al., 1997). PCS networks consist of nodes that represent hypotheses or elements, as well as bidirectional links between them that represent their relations (Thagard, 1989). Through spreading activation in the network, the best possible interpretation under parallel consideration of all constraints (i.e., links) is constructed. In this process, the activations of nodes change, which means that the hypotheses represented by these nodes are perceived as more or less likely (high vs. low activation).

PCS can be applied to a decision task as described in Table 2 (Glöckner & Betsch, 2008b). Cues and options form the elements (nodes) in a working network. Connections between the elements represent their relations (e.g., cues speaking for or against an option). Relevant information encoded from the environment and related information in (long-term) memory is automatically activated and fed into the working network. Information gained by active search can also be added. The working network represents a subset of knowledge from long-term memory and the environment. It is possible but not necessary that parts of the working network enter conscious awareness.

PCS operates on a subconscious level and is assumed to capitalize on the high computational capacity of automatic processing. According to PCS, decision time should mainly depend on initial coherence in the network (Glöckner & Betsch, 2008b). Coherence is high if all pieces of information in the network fit together well. Coherence is low if there is conflict or inconsistency between elements in the network (cf. Festinger, 1957, Heider, 1958). Consider a network containing two options (cf. Table 2). The more strongly and frequently one option is positively linked to cues compared to the competitor, the clearer the evidence is in favor of this option and the less inconsistency has to be resolved. In such cases coherence is high from the beginning. In contrast, a high degree of conflict in the network (i.e., if cues are equally strong for both options) makes it more difficult to find a coherent solution and, therefore, leads to an increase in decision time. PCS mechanisms are implemented as iterative updating of nodes. Decision time is predicted by the number of iterations until node activations reach an asymptotic level (Freeman and Ambady, 2011, Glöckner, 2009, Glöckner and Betsch, 2008b). Furthermore, according to PCS the option with the higher activation after settling is chosen and the difference in activation between option nodes predicts confidence.
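To make the settling process concrete, the following Python sketch implements a PCS-style network for a two-option, four-cue task and counts the iterations until activations stabilize. This is a minimal sketch: the update rule, the weight transformation (validity − .5), the inhibition strength, the decay and the stopping criterion are illustrative assumptions in the spirit of the PCS literature, not the parameter values used in the paper.

```python
# Minimal sketch of a PCS-style network for a two-option, four-cue task.
# Node order: [general validity (source), cue_1..cue_n, option_A, option_B].
import numpy as np

def pcs_settle(cue_pattern, validities, decay=0.05, eps=1e-6, max_iter=10_000):
    """Return (chosen option, number of iterations until activations settle)."""
    n_cues = len(validities)
    n_nodes = 1 + n_cues + 2
    w = np.zeros((n_nodes, n_nodes))                              # symmetric (bidirectional) links
    for c in range(n_cues):
        w[0, 1 + c] = w[1 + c, 0] = validities[c] - 0.5           # source <-> cue (assumed scaling)
        for o in range(2):                                        # cue <-> option links
            w[1 + c, 1 + n_cues + o] = w[1 + n_cues + o, 1 + c] = 0.1 * cue_pattern[c][o]
    w[1 + n_cues, 2 + n_cues] = w[2 + n_cues, 1 + n_cues] = -0.2  # mutual inhibition of options

    a = np.zeros(n_nodes)
    a[0] = 1.0                                   # source node is clamped at 1
    for it in range(1, max_iter + 1):
        net = w @ a
        new_a = np.where(net > 0,
                         a * (1 - decay) + net * (1 - a),         # push toward ceiling (+1)
                         a * (1 - decay) + net * (a + 1))         # push toward floor (-1)
        new_a[0] = 1.0
        if np.max(np.abs(new_a - a)) < eps:                       # asymptotic level reached
            a = new_a
            break
        a = new_a
    choice = "A" if a[1 + n_cues] > a[2 + n_cues] else "B"
    return choice, it

# PCS predicts fewer iterations (shorter decision times) for a coherent
# pattern (all cues favor A) than for a conflicting one (cues are split).
validities = [0.80, 0.70, 0.60, 0.55]
coherent   = [(+1, -1), (+1, -1), (+1, -1), (+1, -1)]
conflicted = [(+1, -1), (-1, +1), (-1, +1), (+1, -1)]
print(pcs_settle(coherent, validities))
print(pcs_settle(conflicted, validities))
```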

In summary, the PCS-approach predicts that decision time will be a function of coherence in the network. In contrast to the EIP perspective, decision time should be rather independent of the amount of encoded information. Specifically, we predict that (i) decision time will increase if information is removed so that coherence decreases; and (ii) decision time will decrease if information is removed so that coherence increases.

Several results support the hypothesis derived from the EIP perspective. Payne et al. (1988) showed that individuals under time pressure tend to switch to less effortful strategies in situations in which information has to be actively searched using the mouse. Bröder and Gaissmaier (2007) showed that response time increases with the number of computational steps necessary to implement a lexicographic strategy in memory-based probabilistic inference tasks. Similarly, Bergert and Nosofsky (2007) found response times that were in line with lexicographic strategies. Finally, in the domain of risky choices, Brandstätter et al. (2006) report that decision times increase with the number of steps necessary to differentiate between gambles using a (semi-)lexicographic strategy (i.e., the priority heuristic).

However, there is also evidence showing that this assumed positive relation does not always hold. For probabilistic inference tasks in which information is openly displayed, Glöckner and Betsch (2008c) found a decrease in decision times when comparing tasks for which a lexicographic strategy predicted the opposite. Glöckner and Hodges (2011) qualified the findings by Bröder and Gaissmaier (2007) on memory-based decisions by showing that the decision times of a substantial portion of participants can be better explained by PCS than by serial heuristics. Ayal and Hochman (2009) attempted to replicate the decision time findings for risky choices by Brandstätter et al. (2006) and found a significant effect in the opposite direction. Similarly, further investigations of risky choices provided stronger support for the decision time predictions of PCS than for the predictions of the suggested semi-lexicographic strategy (Glöckner and Betsch, 2008a, Glöckner and Herbold, 2011, Glöckner and Pachur, 2012; see also Hilbig, 2008). Investigations of decision times in probabilistic inferences involving recognition information have likewise yielded data more in line with PCS than with strategies assuming stepwise processing, such as the recognition heuristic (Glöckner and Bröder, 2011, Hilbig and Pohl, 2009, Hochman et al., 2010).

Hence, overall, the evidence is equivocal and calls for further investigation. A closer look at papers challenging the EIP perspective reveals one potential weakness in their argument. Specifically, it could be argued that persons might have used another strategy for which predictions were not considered in the analysis (cf. Bröder & Schiffer, 2003a). Since the number of heuristics is large and still growing, it is often hard or even impossible to include all of them in a single model comparison test. We use an improved design to rule out this argument. The basic idea is to manipulate tasks so that all established EIP-based strategies predict a reduced or unchanged decision time, whereas PCS predicts an increase for half of the tasks and a decrease for the other half. Note, however, that we of course cannot rule out that an EIP-based strategy might be developed in the future that can account for our findings. As we will discuss in more detail in Section 4.2, our investigation by necessity has to be limited to the specified parts of adaptive-decision-making approaches.

In the current study we rely on the standard paradigm for investigating probabilistic inference tasks, in which persons make decisions based on probabilistic cues. The new idea is to reduce the complexity of tasks by selectively dropping less valid information. For all strategies that consider information from all cues (e.g., WADD), this should lead to a reduction in decision time because less information has to be processed. For all lexicographic or elimination strategies (e.g., take-the-best; elimination by aspects; minimalist), dropping cues with low validity should have no influence on decision times if all cues make differentiating predictions.4 The same should be the case for guessing strategies. Hence, for all strategies that we are aware of, dropping less valid cues should lead to a reduction of decision time (or should have no influence). As will be discussed in more detail below, cues can be dropped so that coherence is either increased or decreased, which makes it possible to test the respective predictions of PCS. A set of prototypical strategies, which were also used in the model comparison reported later, is described in Appendix A.
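To make the logic of this manipulation explicit, the sketch below gives rough EIP counts as a function of the number of cues n, assuming (as in the worked example above) that the most valid cue differentiates and that each READ after the first requires one MOVE; the closed-form MOVE count is an extrapolation from that example, not a count taken from the paper.

```python
# Rough EIP counts for an n-cue, two-option task (extrapolated from the
# worked example above): dropping a cue lowers the count for WADD but
# leaves LEX unchanged, whereas PCS ties decision time to coherence instead.
def wadd_eips(n_cues):
    # 4n READ + (4n - 1) MOVE + 2n PRODUCT + 2(n - 1) ADD + 1 COMPARE + 1 CHOOSE
    return 4 * n_cues + (4 * n_cues - 1) + 2 * n_cues + 2 * (n_cues - 1) + 1 + 1

def lex_eips(n_cues):
    # 2 READ + 1 MOVE + 1 COMPARE + 1 CHOOSE when the most valid cue differentiates
    return 5

print(wadd_eips(4), wadd_eips(3))   # 47 -> 35: fewer EIPs after dropping one cue
print(lex_eips(4), lex_eips(3))     # 5 -> 5: no change for a lexicographic strategy
```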

One important factor influencing decision time is constraints on information acquisition (Glöckner & Betsch, 2008c). If information acquisition is very time-consuming (e.g., each piece of information has to be looked up for 1 min), the prediction of increasing decision time with an increasing amount of information would be trivial. We are, however, interested in the time needed for information integration and therefore use an open matrix paradigm that minimizes constraints on information search.

Section snippets

Method

In repeated decision trials, participants were instructed to select the better of two products (options). They were given information from four testers (cues) with different predictive validity (cue validity), which provided dichotomous quality ratings (good vs. bad) for each product. Following the procedure used in previous studies (e.g., Glöckner & Betsch, 2008c; Exp. 3), information was presented in an “open” matrix (no covered information). The order of cues and options was randomized to

Choices

All participants were able to complete the tasks. No missing values were encountered. The observed proportions of choices for option A are summarized in Table 4. Participants' choices were highly consistent. In all twelve tasks the majority of choices were in favor of option A. In 97% of the repeated choices, participants chose the same option in all five repetitions of the respective variant of a cue pattern, indicating a high choice reliability.

Decision times

Participants followed the instruction to make

Discussion

One cornerstone assumption of the bounded rationality approach states that cognitive capacity is constrained. Building on this assumption, adaptive-decision-making models converge in assuming that humans use a variety of simple decision strategies that, in certain situations, allow them to reduce cognitive costs (Beach and Mitchell, 1978, Gigerenzer and Todd, 1999, Payne et al., 1988). According to these models, cognitive costs and decision time are predicted to increase, ceteris paribus, the

References (92)

  • S. Ayal et al.

    Ignorance or integration: The cognitive processes underlying choice behavior

    Journal of Behavioral Decision Making

    (2009)
  • L.R. Beach et al.

    A contingency model for the selection of decision strategies

    Academy of Management Review

    (1978)
  • L.R. Beach et al.

    Image theory, the unifying perspective

  • F.B. Bergert et al.

    A response-time approach to comparing generalized rational and take-the-best models of decision making

    Journal of Experimental Psychology: Learning, Memory, and Cognition

    (2007)
  • T. Betsch

    Preference theory: An affect-based approach to recurrent decision making

  • T. Betsch et al.

    Intuition in judgment and decision making: Extensive thinking without effort

    Psychological Inquiry

    (2010)
  • E. Brandstätter et al.

    The priority heuristic: Making choices without trade-offs

    Psychological Review

    (2006)
  • A. Bröder et al.

    Sequential processing of cues in memory-based multiattribute decisions

    Psychonomic Bulletin & Review

    (2007)
  • A. Bröder et al.

    Bayesian strategy assessment in multi-attribute decision making

    Journal of Behavioral Decision Making

    (2003)
  • A. Bröder et al.

    Take The Best versus simultaneous feature matching: Probabilistic inferences from memory and effects of representation format

    Journal of Experimental Psychology. General

    (2003)
  • N.R. Brown et al.

    Magnitude comparison revisited: An alternative approach to binary choice under uncertainty

    Psychonomic Bulletin & Review

    (2011)
  • J.S. Bruner et al.

    Value and need as organizing factors in perception

    Journal of Abnormal and Social Psychology

    (1947)
  • J.R. Busemeyer et al.

    Computational models of decision making

  • J.R. Busemeyer et al.

    Decision field theory: A dynamic-cognitive approach to decision making in an uncertain environment

    Psychological Review

    (1993)
  • D. Cartwright et al.

    A quantitative theory of decision

    Psychological Review

    (1943)
  • M.L. DeKay et al.

    Better safe than sorry: Precautionary reasoning and implied dominance in risky decisions

    Journal of Behavioral Decision Making

    (2009)
  • A. Diederich

    Decision making under conflict: Decision time as a measure of conflict strength

    Psychonomic Bulletin & Review

    (2003)
  • M.R.P. Dougherty et al.

    MINERVA-DM: A memory processes model for judgments of likelihood

    Psychological Review

    (1999)
  • J.S.B.T. Evans

    The heuristic–analytic theory of reasoning: Extension and evaluation

    Psychonomic Bulletin & Review

    (2006)
  • J.S.B.T. Evans

    Dual-processing accounts of reasoning, judgment, and social cognition

    Annual Review of Psychology

    (2008)
  • R.H. Fazio

    A practical guide to the use of response latency in social psychological research

  • L. Festinger

    A theory of cognitive dissonance

    (1957)
  • K. Fiedler

    Explaining and simulating judgment biases as an aggregation phenomenon in probabilistic, multiple-cue environments

    Psychological Review

    (1996)
  • M.L. Finucane et al.

    The affect heuristic in judgments of risks and benefits

    Journal of Behavioral Decision Making

    (2000)
  • P.C. Fishburn

    Lexicographic orders, utilities, and decision rules: A survey

    Management Science

    (1974)
  • J.B. Freeman et al.

    A dynamic interactive theory of person construal

    Psychological Review

    (2011)
  • G. Gigerenzer

    Gut feelings: The intelligence of the unconscious

    (2007)
  • G. Gigerenzer et al.

    Betting on one good reason: The take the best heuristic

  • Gigerenzer, G., Todd, P. M., and the ABC Research Group (1999). Simple heuristics that make us smart. Evolution and...
  • T. Gilovich et al.

    Heuristics and biases: The psychology of intuitive judgment

  • A. Glöckner

    Investigating intuitive and deliberate processes statistically: The Multiple-Measure Maximum Likelihood strategy classification method

    Judgment and Decision Making

    (2009)
  • A. Glöckner

    Multiple measure strategy classification: Outcomes, decision times and confidence ratings

  • A. Glöckner et al.

    Modeling option and strategy choices with connectionist networks: Towards an integrative model of automatic and deliberate decision making

    Judgment and Decision Making

    (2008)
  • A. Glöckner et al.

    Multiple-reason decision making based on automatic processing

    Journal of Experimental Psychology: Learning, Memory, and Cognition

    (2008)
  • A. Glöckner et al.

    Accounting for critical evidence while being precise and avoiding the strategy selection problem in a parallel constraint satisfaction approach — A reply to Marewski

    Journal of Behavioral Decision Making

    (2010)
  • A. Glöckner et al.

    The empirical content of theories in judgment and decision making: Shortcomings and remedies

    Judgment and Decision Making

    (2011)