Hypothetical Thinking

Dual Processes in Reasoning and Judgement

  1. 206 pages
  2. English
  3. ePUB (mobile friendly)

About This Book

Hypothetical thought involves the imagination of possibilities and the exploration of their consequences by a process of mental simulation. In this Classic Edition, Jonathan St B. T. Evans presents his pioneering hypothetical thinking theory: an integrated theoretical account of a wide range of psychological studies on hypothesis testing, reasoning, judgement and decision making.

Hypothetical thinking theory is built on three key principles and implemented in a version of Evans' well-known heuristic-analytic theory of reasoning. The central claim of this book is that this theory can provide an integrated account of apparently diverse phenomena including confirmation bias in hypothesis testing, acceptance of fallacies in deductive reasoning, belief biases in reasoning and judgement, biases of statistical judgement and numerous characteristic findings in the study of decision making.

Featuring a reflective and insightful new introduction to the book, this Classic Edition discusses contemporary theory on cognitive biases, human rationality and dual-process theories of higher cognition. It will be of great interest to researchers, postgraduates and advanced undergraduate students.

Information

Year: 2019
ISBN: 9781000768688
Edition: 1

CHAPTER ONE

Introduction and theoretical framework

It is evident that the human species is highly intelligent and well adapted. Some of our intelligence we clearly share with many other animals: we have well-developed visual and other perceptual systems, complex motor skills and the ability to learn in many ways to adapt to the environment around us. We also seem to be smart in ways that other creatures are not: we have a language system that is complex and sophisticated in its ability both to represent knowledge and to communicate with other humans; we study and attempt to understand a multitude of subjects including our own history and that of the universe; we have devised systems of mathematics and logic; we design and build a huge range of structures and artifacts; we have constructed and mostly live our lives within highly complex economic and social structures. All of these distinctively human things imply an extraordinary ability to reason, entertain hypotheses and make decisions based upon complex mental simulations of future possibilities. I will use the term “hypothetical thinking” as a catch-all phrase for thought of this kind.
It is equally apparent that evidence of human error and fallibility surrounds us. The world is plagued by wars, famines and diseases that in many cases appear preventable. Stock markets collapse under panic selling when each individual acts to bring about the outcome that none of them wants. Doctors sometimes make disastrous misjudgements that result in the disability or death of their patients. Experts often fail to agree with each other and may be shown in hindsight to have made judgements that were both mistaken and overconfident. At the present time, governments of the world are well informed about the likely progress of global warming and its consequences but seem to be making minimal progress in doing anything to prevent it. Criminal courts continue to convict the innocent and acquit the guilty, with alarming regularity. And so on, and so forth.
It seems vital that psychologists should be able to provide understanding of the mental processes of reasoning and judgements that underlie the actions and decisions that people take. A fundamental premise of the current book is that there are two distinct kinds of thought, which for the moment I will call intuitive and deliberative. Many of our everyday decisions are made rapidly and intuitively because they just feel right. Others are made much more slowly, involving conscious deliberative thinking. Sometimes we have no time for deliberative thought and just have to react quickly to some situation. In fact, the great bulk of our everyday cognitive processing is carried out rapidly and implicitly without conscious thought. Such processes enable us to accomplish a multitude of necessary tasks, as, for example, when we recognize a face, extract the meaning from a sentence, keep our car safely on the road when driving to work (and thinking consciously about something quite different) or attend to the voice of one person in a room containing the babble of many conversations.
Much of our judgement and decision making takes place at this level also. A lot of our behaviour is habitual, so we are not conscious of choosing our direction at a junction on a familiar drive to work. However, something very different happens when we drive to a new location in an unfamiliar town, following verbal directions or trying to read a map. Now we have to engage conscious and deliberative thinking and reasoning to work out the route, identify landmarks, turn at the correct places and so on. In general, novel problems require much more deliberative thought than do familiar ones. When we have to do this kind of thinking it takes time, it requires effort and it prevents us from thinking about other things. Conscious, deliberative thinking is a singular resource that can only be applied to one task at a time. This is one reason that we allocate this kind of thought to tasks and decisions that have great importance for us and make snap intuitive decisions about less important things. However, there is no guarantee that thinking about our decisions will necessarily improve them (see Chapter 5).
Folk psychology - the common-sense beliefs that we all hold about our own behaviour and that of our fellow human beings - involves the idea that we are consciously in control of our own behaviour - we think, therefore we do. The opinion polling industry, for example, is built on the common-sense belief that people have conscious reasons for their actions which they can accurately report. Psychological research, however, seriously undermines this idea (Wilson, 2002). Not only is much of our behaviour unconsciously controlled, but many of our introspections provide us with unreliable information about the extent and the ways in which our conscious thinking controls our actions. Working out the relative influence of intuitive and deliberative thinking and the interaction between the two systems is a complex problem that must be addressed with the methods of experimental psychology. This enterprise lies at the heart of the current book.
Many of the phenomena to be discussed in this book are described as cognitive biases. It may appear that the demonstration of bias implies evidence for irrationality, and it is impossible to study these topics without taking some view on whether and in what way people are rational. Cognitive psychology as a whole studies the workings of the mind at a number of levels. Basic cognitive processes (still incredibly complex and sophisticated) form the building blocks for our behaviour and thought. These include such functions as pattern recognition, language comprehension, memory for events and the acquisition of conceptual knowledge about the world around us. None of these topics has generated debate about human rationality. Our visual systems have limited acuity and our memory systems limited capacity, we assume, because that is simply the way our brains are designed: the way they were shaped by evolution to be. The study of higher cognitive processes, on the other hand - thinking, reasoning, decision making and social cognition - has been somewhat obsessed by the notions of bias, error and irrationality. Author after author provides us with evidence of “bad” thinking: illogical reasoning, inconsequential decision making, prejudice and stereotyping in our view of people in the social world. The study of cognitive biases is something of a major industry.
What exactly is a cognitive bias? One definition is that it is a systematic (not random) error of some kind. This in turn raises the question of what counts as an error. Psychologists have largely answered the second question by reference to normative systems. Thus reasoning is judged by formal logic; judgement under uncertainty by probability theory; choice behaviour by formal decision theory and so on. Some authors go further and claim that people who fail to conform to such normative standards are irrational. Most of the biases studied in cognitive psychology have been defined in this way, and yet this notion is today highly controversial. Some authors claim that people’s behaviour only appears biased or irrational because the wrong normative theory is being applied. For example, if standard logic requires that propositions are clearly true or false, then people’s reasoning in an uncertain world might better be assessed by norms based on probability theory (see Oaksford & Chater, 2001).
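To make the contrast between these normative standards concrete, the following sketch is my own gloss rather than a formula taken from the book: standard logic treats an inference such as modus ponens as valid outright, whereas probabilistic norms of the kind advocated by Oaksford and Chater treat belief in the conclusion as graded and constrained by the probabilities attached to the premises.

```latex
\documentclass{article}
\usepackage{amssymb} % for \therefore
\begin{document}
% Logical norm: modus ponens is valid, so accepting both premises
% commits a reasoner to the conclusion outright.
\[ p \rightarrow q, \quad p \quad \therefore \quad q \]
% Probabilistic norm (illustrative, in the spirit of Oaksford & Chater):
% belief in the conclusion is graded, bounded below by the product of
% the conditional probability of q given p and the probability of p.
\[ P(q) \;\geq\; P(q \mid p)\,P(p) \]
\end{document}
```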
In fact, we do not necessarily need to invoke normative rationality in order to think about cognitive biases. We have much lower visual acuity than does a bird of prey, but vision researchers do not accuse us of being biased against distant objects. Similarly, memory researchers do not accuse us of irrationality if we cannot remember a phone number more than seven or eight digits in length. Researchers in this area rarely use the term “bias”, but their findings certainly indicate the constraints and limitations of human information processing. So we could think about biases of thought and judgement also as indicators of the design limitations of the brain. This is an approach that emphasizes what is known as bounded rationality (Simon, 1982). According to this view, we are not inherently irrational but we are cognitively constrained in the way we can reason about the world. For example, it may not be possible to calculate the best choice of action in a given situation, so we settle for one that is good enough.
Another concept of cognitive bias is dispositional: for example, people have different styles of thinking that may be related to personality or to culture. A widely cited claim is that Western people have a more analytic style of thinking, while Eastern people are more holistic or intuitive (Nisbett, Peng, Choi, & Norenzayan, 2001). One style is not necessarily better than the other, but each may fare better or worse on different kinds of task. Combining the dispositional and bounded rationality approaches, we might conclude that people’s ability to think in particular ways is biased or constrained not only biologically, in the design of our brains, but also culturally. Either or both kinds of explanation might be invoked, for example, to account for biases in social cognition. For example, people seem compulsively to employ stereotypes when thinking about people from an “out-group” with whom they do not share social membership (Hinton, 2000). This could reflect some innate form of social intelligence shaped by evolution, learning of cultural norms passed from one generation to the next, or an interaction of the two.
As we shall see in this book, psychologists studying higher cognitive processes have discovered and documented a wide range of biases. In most cases, these biases have been defined as deviation from a normative standard, leading to a debate about whether or not they should be termed irrational. I have discussed the rationality issue in detail elsewhere (Evans & Over, 1996a), and it will not be the main focus of interest in this book. (I will, however, consider the issue in my final chapter.) The study of cognitive biases should be seen as important for two reasons, whether or not they are deemed to provide evidence of irrationality. First, they establish the phenomena that have to be explained. Second, they may have practical implications for reasoning and decision making in the everyday world. Hence, each bias gives rise both to a theoretical question: “Why do people think in this way?” and to a practical question: “How will this bias manifest itself in real-world behaviour and with what consequences?”
As an example, psychologists have accumulated much evidence that people’s evaluation of logical arguments is biased by whether or not they believe the conclusions given (Chapter 4). This is regarded as a bias because logical validity depends only on whether a conclusion follows necessarily from some assumptions and not on whether assumptions or conclusion are actually true. I suppose one could try to move directly from this result to its practical implications without any real theoretical analysis of the cause of the bias. Such an analysis might, however, conclude that human reasoning is automatically contextualized by prior knowledge and belief and that only a strong effort of deliberative conscious reasoning will overcome this. In my view, understanding of the likely practical implications of the bias is greatly assisted by this kind of theorizing.
In this book, I shall be viewing the phenomena discussed within both a broad and a more specific theoretical framework to be introduced later in this chapter. The broad framework, generally known as “dual-process” theory, has been applied to a wide range of cognitive studies, including learning (Reber, 1993), reasoning (Evans & Over, 1996a; Stanovich, 1999), conceptual thinking (Sloman, 1996), decision making (Kahneman & Frederick, 2002) and social cognition (Chaiken & Trope, 1999). Dual-processing approaches assert the existence of two kinds of mental processes corresponding broadly to the idea of intuitive and deliberative thinking and to more general distinctions between implicit and explicit cognitive processes, such as those involved in learning and memory. Within this general framework, I will, however, present a more specific dual-process theory of hypothetical thinking that updates and extends my earlier heuristic-analytic theory of reasoning (Evans, 1989). In support of this theory, I will discuss phenomena that are drawn mostly (but not exclusively) from two separate but related literatures: the psychology of reasoning on the one hand; and the study of judgement and decision making on the other. Before presenting my general and specific theoretical framework, I shall outline the nature of these two fields of study, including the methods and theoretical approaches that have tended to dominate them.

The Psychology of Reasoning

I ought, perhaps, to start with the distinction between implicit and explicit inference (see also Johnson-Laird, 1983). Any kind of inference involves going beyond the information given and may technically be regarded as deductive or inductive. Inductive inferences add new information, whereas deductive inferences draw out only what was implicit in assumptions or premises. Both deductive and inductive inferences may be either implicit or explicit in terms of cognitive processing. I shall illustrate this with some examples.
Pragmatic inferences are almost always involved in the comprehension of linguistic statements (see Sperber & Wilson, 1995, for discussion of many examples). Because they typically add information from prior knowledge relevant to the context, they are generally inductive as well as implicit. As a result, such inferences are plausible or probable but not logically sound and may turn out to be incorrect. In accordance with the communicative principle of relevance (Sperber & Wilson, 1995), every utterance conveys a guarantee of its own relevance, and this licenses many pragmatic inferences. Consider the following dialogue between an adult son and his mother:
  • “I think I am going to be late for work”
  • “My car keys are in the usual place”
  • “Thanks, Mum”.
There will be a context behind this exchange that is mutually manifest to both parties. For example, the son usually travels to work but his mother sometimes lets him borrow her car, which takes 15 minutes off the journey. Hence, the first statement is interpreted as a ...

Table of contents

  1. Cover
  2. Half Title
  3. Series Page
  4. Title Page
  5. Copyright Page
  6. Table of Contents
  7. Foreword to the classic edition
  8. Foreword and acknowledgements
  9. 1 Introduction and theoretical framework
  10. 2 Hypothesis testing
  11. 3 Suppositional reasoning: if and or
  12. 4 The role of knowledge and belief in reasoning
  13. 5 Dual processes in judgement and decision making
  14. 6 Thinking about chance and probability
  15. 7 Broader issues
  16. References
  17. Author index
  18. Subject index