Making Moral Judgments

Psychological Perspectives on Morality, Ethics, and Decision-Making

About This Book

This fascinating new book examines diversity in moral judgments, drawing on recent work in social, personality, and evolutionary psychology to review the factors that influence the moral judgments people make.

Why do reasonable people so often disagree when drawing distinctions between what is morally right and wrong? Even when individuals agree in their moral pronouncements, they may employ different standards, different comparative processes, or entirely disparate criteria in their judgments. Examining the sources of this variety, the author expertly explores morality using ethics position theory, alongside other theoretical perspectives in moral psychology, and shows how it can relate to contemporary social issues from abortion to premarital sex to human rights. Also featuring a chapter on applied contexts, using the theory of ethics positions to gain insights into the moral choices and actions of individuals, groups, and organizations in educational, research, political, medical, and business settings, the book offers answers that apply across individuals, communities, and cultures.

Investigating the relationship between people's personal moral philosophies and their ethical thoughts, emotions, and actions, this is fascinating reading for students and academics in psychology and philosophy, as well as for anyone interested in morality and ethics.


Information

Author: Donelson Forsyth
Publisher: Routledge
Year: 2019
ISBN: 9781000710908
Edition: 1
Format: ePub, 200 pages
Language: English

1 Judging Morality

The elementary forces in ethics are probably as plural as those of physics are. The various ideals have no common character apart from the fact that they are ideals.
—William James (1897/1979, p. 153)
Moral judgments are the most significant social inferences people make about others and themselves. Those who are judged to be immoral are not just thought to be mistaken or misguided, but unacceptable in a fundamental way: corrupt, untrustworthy, malevolent, and possibly even evil. Moral philosophers’ detailed conceptual analyses of the nature of these judgments, along with psychologists’ more recent empirical studies, suggest that moral judgments are reserved for particularly offensive actions: those that cause harm to others and are inconsistent with standards that, in the given social setting, demarcate the morally good and the morally bad. Yet, despite the critical importance of morality for maintaining stable interpersonal relationships in human societies, disagreement over what is moral and what is immoral is as likely as complete moral consensus. Many factors contribute to this diversity, but among them are differences in each person’s ethics position: Their personal moral philosophy regarding actions that cause others harm (idealism) and their stance with regard to the universality of moral standards (relativism).
***
What will the seven-and-a-half billion people on the planet Earth do today? Some will work, toil at their tasks. Some will relax, vacationing with family and friends. Some will sleep the day away, others will exercise diligently, and some will study, cook dinner, or join with others in shared pursuits. But some will do things that differ from these routine, day-to-day activities. Some will save others’ lives. Some will donate their time to worthy causes. Some will spend another day working to make their community a better place. And then there are the others. The others who commit actions that are socially untoward: From the bigot who insults someone in a despised outgroup, the philandering husband who cheats on his wife of 20 years, the accountant who looks the other way when the boss asks him to obscure the company’s losses, to the thief, the rapist, the molester, and the murderer. On any given day, people will do things that are judged as commendable: good, fair, just, and moral. But they will also do things that earn them moral condemnation: they and their actions will be considered bad, unfair, unjust, and wrong.
These moral judgments, like other types of valuations, range along a continuum from positive to negative. But unlike judgments of a person’s social skills, coordination, conscientiousness, and so on, moral judgments are perceptually and interpersonally persistent and their effects are far-reaching. They are not merely momentary inclinations or personal preferences but socially significant inferences that determine our understanding of ourselves, other people, and our most significant interactions and relationships. Since those who act in ways that others consider to be immoral are often met with negative sanctions, people must be able to predict how others are going to evaluate the things they do if they wish to avoid such sanctions. A pattern of conflict-free interaction implies that we are able to restrict our behaviors so that they do not conflict too greatly with society’s conception of morality, and that those around us are similarly so self-regulated. Moral judgments also make the future seem more predictable, as we expect that those who act ethically today can be trusted to act that way tomorrow. A case could be made that moral judgments are the most significant social inferences people formulate about others and about themselves.
This book examines the making of moral judgments, but with a focus on one puzzling aspect of these judgments: their diversity. Philosophers have been examining matters of morality for thousands of years, yet they continue to disagree when discussing what makes something morally right rather than wrong. Socrates believed that morality and wisdom are so closely associated that virtuous action flows effortlessly from knowledge, but Aristotle demurred by suggesting virtue is manifested in one’s actions. Hume made the case that morality is more a matter of emotion and sentiment than reason and rationality but Bentham considered morality to be a question of utility: Does the action promote or interfere with happiness? Kant, in contrast to all, believed that intentions separated out the good and bad, for good will matters absolutely, whereas good effects count for nothing when it comes to morality (MacIntyre, 2003).
These divergences in moral conceptions are not unique to philosophers. Humans have a tendency to drift toward conformity and agreement, yet any two people's moral appraisals of the very same act in clearly defined circumstances can spin off in different directions. Certainly, some actions receive nearly universal commendation or condemnation—altruistic, self-sacrificing acts, for example, or actions that are done to intentionally harm innocents—but this consensus is lost when the discussion turns to less clear-cut issues. The person who dismisses a small harm done to achieve greater good runs afoul of the person who condemns anyone who causes suffering. Some are certain that lies that serve a positive purpose—white lies—are ethically allowed, but others say any and all lies are immoral. For every person who publicly announces a moral claim about some contemporary social issue, such as abortion, gay marriage, and universal health care, there is another person who takes an opposing view. Even when individuals agree in their moral pronouncements, they may employ different standards, different comparative processes, or entirely disparate criteria in their judgments. Given that moral judgments significantly influence our perceptions of one another, our choices in morally charged situations, and the interpersonal processes that sustain adaptive, healthy social relationships, this diversity in moral thought is puzzling.
This book reviews the factors that influence the moral judgments people make, with a particular emphasis on the impact of individual differences in ethical ideologies people adopt on their inferences about morality. This chapter introduces that analysis by first examining the defining features of moral judgments and their implications: What distinguishes such judgments from the many other inferences about people’s traits, tendencies, strengths, and weaknesses? And what psychological and interpersonal purposes do these judgments serve?

Moral Inferences

We are all psychologists of a sort, for whenever we encounter other people we set to work deciphering them. We do not passively observe those around us, but instead actively scrutinize others’ actions, drawing inferences about their dispositional tendencies, their preferences and attitudes, and their intentions and designs. When we meet other people we intuitively gather the data we need to make these inferences: We appraise their appearance, their gestures, their words, and their actions. As Heider (1958, p. 2) stated in his classic work, The Psychology of Interpersonal Relations: “the ordinary person has a great and profound understanding of himself and of other people which, though unformulated and only vaguely conceived, enables him to interact with others in more or less adaptive ways.”
Many of these inferences about other people pertain to their basic traits, skills, competencies, moods, interests, and values. But some go deeper; they speak not to surface-level, transitory attributes of the individual, but to something more basic, more fundamental. Listening to a friend explain how she impressed her boss at work by misleading him about the quality of her work product, we may conclude she is clever, resourceful, and successful, but that she is also a person who cannot be trusted to always tell the truth. When we hear about a firefighter who intervened to save a child stranded in a burning building, we may admire his courage and dedication, but also perceptually promote him into a select group of those we admire for their distinctive moral pureness: the morally advanced exemplars. When we see parents spank their misbehaving child at the mall, we not only draw inferences about their parenting skills and their control of their tempers, but we may also question their ethics: Is it ever justified to physically harm a defenseless minor whom you are charged to nurture and protect?
These construals are more than detached descriptions of individuals’ qualities and the actions they performed. They are moral judgments: evaluative appraisals of the goodness, rightness, and propriety of individuals and their actions. These judgments are not just evaluative, but profoundly evaluative. An immoral person is not just objectionable or unsavory, but wicked or evil, and a person judged to be moral is not just nice or fun to be with, but saintly or virtuous. Moral judgments also tend to be more definitive than opinions, preferences, or other more circumspect inferences—people are generally quite confident when they express their conclusions about morality—even though these judgments are often systematically biased ones. Morality is, more often than not, in the eye of the beholder.

Moral Judgments Are Profoundly Evaluative

To be considered inept, good-natured, inconsiderate, wise, or lazy is one thing, but these perceptual inferences pale in their social and psychological impact in comparison to judgments of ethicality. Moral judgments are not tepid, wishy-washy appraisals, but strongly valenced pronouncements of worth and approval or condemnation and disapproval. Words associated with morality are uniquely evaluative, as Anderson (1968) discovered in his analysis of 555 words that people use to describe other people. When he asked 100 people to rate the words on a scale from “least favorable or desirable” to “most favorable or desirable,” words pertaining to morality tended to cluster at the extremes. Such qualities as mature, warm, earnest, kind, friendly, happy, and clean were rated positively, but significantly lower than words that signaled morality: sincere, honest, loyal, truthful, honorable, and trustworthy. Conversely, negative, socially objectionable qualities, including self-conceit, hard-hearted, prejudiced, irresponsible, unpleasant, impolite, and crude, were rated very negatively, but not as negatively as the words on the list that signaled immorality: insincere, unkind, untrustworthy, deceitful, dishonorable, malicious, untruthful, dishonest, phony, and liar. Anderson also asked respondents to rate each word for “meaningfulness.” Positive, negative, and relatively neutral words (e.g., cautious, innocent, inoffensive, nonchalant, self-contented) were rated as similar in meaningfulness. Words indicating morality and immorality, in contrast, were rated as significantly more meaningful compared to the more neutral words. Variance in the ratings of the words was also significantly less for moral and immoral attributes, relative to negative and neutral qualities.1
Morally good and bad actions may garner more extreme appraisals because they are relatively unusual and so they violate people’s expectations. As expectancy theory suggests, characteristics or actions that perceivers consider to be highly unusual generate, in most cases, a more extreme evaluation (Skowronski & Carlston, 1989). Although moral behaviors such as honesty, self-sacrifice, and compassion are socially desired qualities, they are more rarely observed than more common qualities such as friendliness, self-indulgence, and impatience. Those who are unfailingly truthful or act to help others violate base rates, and so their salient and unexpected acts trigger a more extreme (and positive) evaluation. In contrast, actions that are roundly condemned if identified are, fortunately, also rarer than more quotidian types of activities. These negative but unexpected qualities thus trigger an extreme evaluation, albeit one that is negative rather than positive (Mende-Siedlecki, Baron, & Todorov, 2013).

Moral Judgments Are Inferences

The word judgment is usually applied to people’s appraisals of morality, suggesting that these construals are different in some way from other types of interpersonal inferences and appraisals. In everyday talk people do not say they estimate, perceive, take in, or appreciate another person’s moral goodness or badness: They judge that person’s morality. Calling these psychological assessments judgments suggests that they have more in common with a magistrate’s objective ruling or decree than with a person’s idiosyncratic opinion or preference. Moral judgments, more so than other inferences, are thought to be transpersonal: No matter who the individuals involved are, the judgment should apply across persons. Moreover, as judgments rather than opinions or estimates, they are often considered to be matters of fact rather than matters of personal preference. As Smith (2011), in his analysis of the relationship between dehumanization and collective aggression, explains, “When a person sincerely judges that an act is morally wrong, this entails that they want to avoid it, and that they believe everyone else should avoid it, too” (p. 219).
As with a judge’s decision, individuals often express their moral judgments with a relatively high degree of definitiveness. Those who dislike the color beige likely recognize that this preference is a matter of taste. But those who consider an action such as abortion, cheating, or stealing to be morally wrong are less likely to feel these pronouncements are a matter of opinion (Skitka, 2010). As the philosopher Frank Chapman Sharp (1898, p. 201) writes: “From the uniformity and immediacy of the moral judgment follows directly its certainty, the sense of necessity, untroubled by a single doubt.”
The word judgment also suggests that people’s inferences about morality are based on their rational review of all available evidence. Moral philosophers such as Socrates, Kant, and Bentham argued over most aspects of morality, but they generally agreed that people make moral judgments through rational reflection. Socrates, for example, reduced moral “virtues to knowledge and did away with the non-rational part of the soul, feelings, and character” (Irwin, 1995, p. 9). Kant concluded: “The pre-eminent good which we call moral can therefore consist in nothing else than the conception of law in itself, which certainly is only possible in a rational being” (1788/2014). And Bentham’s felicific calculus requires considerable cognitive bookkeeping, for one must carefully estimate the nature of the pain and pleasure an action will likely produce (e.g., intensity, duration, purity) and then “take the balance; which, if on the side of pleasure, will give the general good tendency of the act, if on the side of pain, the general evil tendency” (Bentham, 1789/1948, p. 31).
Many psychologists, too, assume moral judgments are guided by the same basic psychological processes that determine decision making in general. Dewey (1922, p. 207), in his analysis of character and conduct, maintained that “the moral is to develop conscientiousness, ability to judge the significance of what we are doing and to use that judgment in directing what we do … by fostering those impulses and habits which experience has shown to make us sensitive, generous, imaginative, impartial in perceiving the tendency of our inchoate dawning activities.” Kohlberg (1958), too, underscored the cognitive foundations of morality when he proposed that “moral action is oriented to or preceded by a value judgment… . this distinction does not mean that moral action is motivated by pure reason as Kant thought, but the need to see moral action as determined by reason seems to spring from the experience of moral judgments as motivating” (pp. 8–9). Turiel and his colleagues, in their studies of developmental changes in moral judgment, concluded that older children “form distinct, organized systems of thought” which subsequently guide the processing of information about moral and conventional actions: “features of events are processed and interpreted by individuals from the perspective of their own domain-differentiated judgment” (Turiel, Hildebrandt, Wainryb, & Saltzstein, 1991, pp. 5–7). Turiel himself concludes that “the substantive aspects of morality” are “connected with judgment, thought, and reflections” (Turiel, 2018, p. 9).
But moral judgments are not entirely rational conclusions reached through dispassionate review of all the available information. People, when making decisions, sometimes rely on simplifying cognitive heuristics that can cause them to reach erroneous conclusions, and evidence indicates that moral judgments are not immune to the biasing effects of these heuristics (e.g., Sunstein, 2005). Moral judgments are also influenced, to a degree, by the same types of biases that influence other inferences, such as primacy effects, framing, and hindsight. Imagine, for example, people learn about a person’s intentions either before or after they are told that the person acted in ways that caused harm to others. The sequencing of the information should not influence judgments, but it does: The impact of information about intentions is greater when presented after, rather than before, the description of the harm that was done (Leloup, Meert, & Samson, 2018). People’s moral judgments are also influenced by how an action is described or how a moral choice is framed. For example, individuals respond differently when asked to consider a difficult moral choice, such as treating 100 patients with an experimental drug that will save some patients, but kill others due to the treatment’s severe side effects. Individuals will tend to approve the use of the drug if told “it will save the lives of 80 patients,” but reject the use of the drug if it will “kill 20 patients” (Sinnott-Armstrong, 2008). The hindsight bias also distorts people’s moral inferences, for once we know that an action resulted in negative consequences, we judge the action as less moral—even though the negative outcomes were not intended or foreseeable (Fleischhut, Meder, & Gigerenzer, 2017).
Moral judgments, like other inferences about people and the things they do, are also often sustained as much by emotion as they are by the dispassionate review of all available information. Haidt’s (2001) social intuitionist model of moral judgment, for example, suggests that a quick, emotional intuition or “gut feeling” often guides people’s moral judgments, and that these emotional reactions may prompt them to make moral decisions that are not entirely consistent with reason. Only after the judgment is made does cognition come into play, serving as a post hoc justification tool.
Our judgments of o...

Table of contents

  1. Cover
  2. Half Title
  3. Title
  4. Copyright
  5. Dedication
  6. CONTENTS
  7. Acknowledgments
  8. 1 Judging Morality
  9. 2 Ethics Position Theory
  10. 3 Measured Morality
  11. 4 Individuals Differ
  12. 5 Moral Thought
  13. 6 Moral Behaviors and Emotions
  14. 7 The Geography of Ethics
  15. 8 Ethics in Context
  16. Appendix: The Ethics Position Questionnaire (EPQ-5)
  17. References
  18. Name Index
  19. Subject Index