Understanding Mental Health and Mental Illness

An Exploration of the Past, Present, and Future
About This Book

The question of whether someone is psychologically healthy or mentally ill, and the fundamental nature of mental health underlying that question, has been debated in cultural, academic, and clinical settings for millennia. This book provides an overview of how people have conceptualized and understood mental illness through the ages.

The book begins by looking at mental illness in humanity's evolutionary past, then moves through the major historical epochs: the mythological, the Classical, the Middle Ages, the Renaissance, the Enlightenment, the modern, and the postmodern. At each point, it focuses on major elements that emerged regarding how people judged sanity and insanity and places major emphasis on the growing fields of psychiatry and psychology as they emerged and developed. As the book moves into the twenty-first century, Dr. Jenkins presents his integrated model of knowledge, a systemic, holistic model of the psyche that creates a conceptual foundation for understanding both psychological wellness and disorder and for approaching assessment and diagnosis.

This text provides a valuable exploration of mental health and illness across the ages and gives those already well versed in the subject matter a fresh perspective on the past and a new model of knowledge and assessment for the future.

Understanding Mental Health and Mental Illness by Paul H. Jenkins is available in PDF and ePUB formats, in Medicine & Psychiatry & Mental Health.

Information

Author: Paul H. Jenkins
Publisher: Routledge
Year: 2021
ISBN: 9780429803277
Pages: 338

CHAPTER 1
PREHISTORY

In our review of the history of mental illness, it makes sense to start at the very beginning – that is, at the beginning of the human race. The world did not start with humanity in mind; it came into existence out of a swirling cloud of interstellar debris about 4.5 billion years ago. For the first half a billion years or so, it was molten and volcanic and held little oxygen. However, it was cooling down and, around 4 billion years ago, formed a crust and began to allow liquid water to pool. The first evidence of life emerged 3.5 billion years ago in the form of self-replicating, complex amino acids. Over the next 2 billion years, the world went through the eras of the arthropods, the fish, the amphibians, the reptiles (highlighted by our much-beloved dinosaurs), and finally the mammals. The first recognizable humans emerged about 2 million years ago.
For those who wonder how animals could have sprung so quickly from the primordial swamp, it should be noted that life on Earth remained microscopic for the first 3 billion years. It took another 50 million years to go from the beginning of multicellular life to the Cambrian Explosion, about 540 million years ago.
As previously noted, humans developed about 2 million years ago with the appearance of Homo erectus. They then spent the vast majority of the subsequent time hunting and foraging in small groups. About 200,000 years ago, our modern form, Homo sapiens, emerged in Africa and began to migrate through Eurasia around 60,000 years ago, replacing all the other Homo species. Around 50,000 years ago, early humans began to leave traces of their psychological lives that reflected a conscious, self-reflective mind. Around 30,000 years ago, in the Upper Paleolithic period, humans began leaving evidence, in the form of cave art, of shamanistic religious thinking and practices, suggesting more advanced consciousness, art, mythology, and complexly structured society (Lewis-Williams, 2002).
Ideally, our formal story about mental health should start there, with self-descriptions of the mental states of early humans. The problem with telling this story is that until humans invented writing, about 5,000 years ago, what we have is what archeologists refer to as prehistory. With no written records to refer to, no recordings to study, we rely on clues – a chipped rock here, an ideographic cave painting there – to tell us all that can be known. The good news is that the field of archeological research has developed tremendously over the last 100 years and now provides a surprising amount of evidence, so much more is known about early human history than previously.

EVOLUTION AND EPISTEMIC AUTHORITY

The bad news is that there is a very limited archeological record of anything directly related to mental health or illness. The one exception to this is the practice of trephining, but before the text gets to that story, it is important to take a step back to epistemology again. This chapter began with the assertion of a roughly 2-million-year history of human existence. That means it starts with a dramatically controversial theory to organize the exploration of its central topic. That theory is evolution. This immediately raises a number of ontological and epistemological questions. Does evolution exist, and how would one know, one way or another? Should this be approached empirically? If so, what natural evidence is required? Basically, to support the validity of the theory, there needs to be evidence that populations change over generations in response to environmental forces. There should be data indicating intermediate species, as well as the emergence of entirely new species. The good news for those who believe in evolution is that there is overwhelming evidence to support the reality of all three of those requirements (change, intermediate species, and new species), thus providing support for evolution as a general and foundational natural process that is very real and the source of human development as a species (Dawkins, 2004; Gould, 1993). This is why it has become the cornerstone of the biological sciences.
The primary competing theory, creationism, has virtually no support in the scientific community (National Academy of Sciences, 1999). There is no evidence, from the research in paleontology, geology, biology, or any other related scientific field, to support the idea that life originated approximately 6,000 years ago and has maintained itself in basically the same form from its beginnings. As a result, creationism is not considered real science. Not only is there no empirical support for creationism, but it is also not testable by the methods of science and thus is not falsifiable. Instead, it relies on authoritative belief, which cannot be superseded by new data or logical analysis.
From an empirical perspective, evolution is the clear winner in this competition, and this book will move forward on the assumption that evolution, not creationism, is true. However, it is important to point out that the debate between the two models represents a classic competition between two different epistemologies, because the tension between these two approaches to truth is central to the subject matter of this book. The belief in evolution is supported by empiricism, and the belief in creationism is supported by reliance on authority, sometimes called epistemic authority (EA) (Zagzebski, 2012). The first demands observable, naturally occurring data that lead to a conclusion. The second requires an established authoritative source to provide the truth statement. The authoritative source for creationism is, of course, the Bible and other religious texts, as well as the statements of various religious authorities.
The differing epistemological foundations explain why both beliefs can exist at the same time – the arguments for their truths are completely different and thus unpersuasive to adherents of the other. The question then becomes, is there a basis for picking one epistemology over the other? Can we make an argument for either EA or empiricism being a better way to approach truth? Yes, we can make a strong case that empiricism is a better approach to truth than EA. The main reason is that, while empiricism relies on objective data to guide us logically to a conclusion, EA relies on a judgment of the veracity of the authority itself. This means that all truth statements stemming from authority are circular or tautological arguments and are vulnerable to any refutation of the legitimacy of the authority. Compare these two arguments. Why should I believe in evolution? Because over one hundred and fifty years of fossil (and other) evidence supports the theory, without a single contradictory piece of evidence emerging from any area of research related to the subject. Why should I believe in creationism? Because the Bible says it’s true, and the Bible was written by God. How do you know the Bible was written by God? Because the Bible and religious authority figures say so. Could it have been written by someone else who wanted us to think the text had the authority of God? Yes, but I don’t believe that. Which argument is more persuasive? Unless one is predisposed by religious faith and cultural tradition to believe in the divine authorship theory, the evolutionary argument, based on empiricism, is much stronger.
That being said, there is an argument to be made for EA. This may come as a surprise to many who assumed that in the modern age, with the Enlightenment behind us, serious academics would universally reject reliance on authority as a strong or even minimally legitimate epistemological basis for truth. Linda Zagzebski (2012) took on this challenge. She agrees that EA is not taken seriously by modern epistemologists, but she thinks they are misguided. She points out that the rise of empiricism and rejection of EA were largely based on the Enlightenment’s focus on the autonomous self. The more we empower individuals to make their own decisions about what is true, the more we come to distrust authority. The problem, as she describes it, is that this situation leaves us with no way to understand a wide variety of beliefs, including many religious ones that people continue to rely on, based on our trust of authority. She asserts that it is both rational and unavoidable that people have trust in the truth provided by authority figures. She points to self-reflective consciousness, the ability to think about ourselves, as a basis for the legitimacy of EA. The self-reflective person is naturally committed to a belief in authority (I don’t know everything, and there are others who know more than me), and it is impossible to be completely epistemologically self-reliant (I can’t test every possible truth assertion). Thus, we can and need to be able to trust ourselves to assess and trust certain external, authoritative sources of truth.
This is an important point, because the farther we go back in time, the more truth statements about mental illness tended to be based on EA. This is particularly true in the early phases of human history. Before the Enlightenment, beliefs about mental illness were typically not derived from systematic observation or empirical research. They grew out of heuristic reasoning and preexisting cultural beliefs. These beliefs were embraced, communicated, and enforced by authority figures in their respective cultures. These people would have been shamans, elders, chieftains, etc. It was their responsibility to maintain and implement the truths of the group. This dynamic is, of course, still in force today. People rely on cultural authority figures to maintain the integrity of their groups by conserving the truth of the world and carefully managing any change to that truth. As long as the known facts and beliefs are not challenged by new facts and beliefs, they can be maintained with relatively little change for hundreds or even thousands of years.
The emergence of empiricism was the result of just such a clash of new information with preexisting truths. During the late Middle Ages, increasing trade with the Far East and the Middle East brought new ideas to the West. Technological and methodological developments created new information as well. New concepts about reality challenged the existing ones, and the new ones had actual data to back them up. Anyone with access to a telescope could see the moons of Jupiter, which proved that not all the heavenly bodies orbited the Earth. At the same time, advances in mathematics and experimental design brought more aspects of reality into the realm of scientific study. By the time Charles Darwin came along in the nineteenth century, the world was used to modern science announcing amazing new ways of understanding reality, and the forces of received wisdom, or EA, were increasingly being doubted. Like previous revolutionary ideas such as a heliocentric solar system, the new idea of evolution still produced a great deal of resistance. It still does. However, like other scientific truths, it has overwhelming data to support its validity.
Therefore, this book will proceed on the belief that evolution is ontologically real and epistemologically valid. Further, we will proceed on the belief that evolution is a natural process that led to the emergence and development of the human species. The generally accepted storyline of that development is that humans evolved from small, apelike creatures that probably lived in small bands and were hunter-gatherers. As areas of the African forest began to thin out, the only way those early creatures could survive was by developing a big brain. Being smart was their evolutionary advantage. Cognitive development was excellent for a number of vital functions, including planning and coordination of hunting, food gathering, and various communal activities such as cooking, food distribution, shelter building, defense against predators, etc. As the communal life of these small, slow, but very smart primates became more complex, basic rules of behavioral norms and even collective attitudes and beliefs would naturally develop. Unlike “lower” animals, in which such things are dictated by instinct, in “higher” primates today many of these rules are learned by youngsters, reinforced by parents and other elders, practiced flexibly within the group, and enforced by punishing individuals who break them. It would make sense that very early on, individuals who exhibited “strange” or “unusual” attitudes, beliefs, or behavior would cause concern and/or upset in the community, leading to attempts to “help” or “punish” the individual. In rare cases, the individual’s differences from the group might be received as a sign of superior functioning or insight, which could lead to special treatment and status, perhaps as a leader or a shaman. More often, though, nonconformity would have been seen as a problem, especially if it was associated with reduced functioning (poor cooperation skills) or was in distinct contrast to established group beliefs.

TWO TYPES OF EVOLUTION

The concept of evolution is interesting not only because it produces so much controversy but because it is used in so many ways. Classical evolution refers to the natural process by which biological populations genetically and physiologically change in response to their environment. Evolutionary success is measured by survival, failure by death. This is an important point, because there is no such thing as populations getting better and better, there is only survival or death. Evolution is more Machiavellian than humanistic. Organisms that can respond effectively to the challenges and opportunities of their environment thrive. For instance, the dramatic increase in size and change in structure of the brain in early hominids, a process called encephalization, provided the neurological and cognitive engine for behavioral changes that provided significant improvements in survivability. These changes include things like bipedalism, speech, use of symbols, etc. So far, this is a pretty standard understanding of evolution. Colin Renfrew (2007) refers to this as the speciation phase of human development. It reflects the development of basic cognitive capacities tied more or less directly to physiological evolution.
This phase is very slow, taking place over millions of years, but it provided the biological foundation for subsequent cultural changes, which he refers to as the tectonic phase of development. More commonly referred to as social evolution, this phase began to accelerate around 100,000 years ago. It is a much faster process than biological evolution and led to dramatic changes in how humans lived, compared to our (distant) primate cousins in the animal kingdom. While speciation has taken up most of the conversation about human evolution, especially in the public sphere, it is within the tectonic phase that most of the really interesting action has taken place in our evolution. It is the changes in collective social/cultural beliefs, traditions, and subsequent practices that have provided the most dramatic development of characteristically human ways of life and survival advantages. In other words, it is the evolution of culture versus the evolution of the brain itself. That does not mean that the mind, the creator of culture, is a completely separate entity from the brain. Instead, a better analogy, still only partly true, is that the human psyche is like a computer, with the brain being the hardware and culture being the software. The brain does the work, and what we call culture prescribes what work gets done and how it is to be done.
The idea of this two-part evolution has become increasingly accepted in the social sciences. The reason it has become important to the social sciences is that it turns out that the dynamics that drive biological evolution have a parallel reality and impact in the social and cultural spheres (Frank, 1998; Grinin, Markov, & Korotayev, 2013). This is likely because the two foci of evolution have an overlapping variable, the mind. What we call the mind is the activity of the brain and thus directly reflects its functioning. The mind does not exist without the brain, but it is not exactly the same thing as the brain, in the same way the informational output of a computer is impossible without the physical computer but is not the same thing as the computer. Culture represents the collective output of many minds coexisting in a social setting with a sense of mutual identity and thus developing collective beliefs and practices. It cannot exist without the individual minds, but it is not the same thing as individual minds. The three spheres of functioning are inextricably linked and must therefore share some common, naturally occurring dynamics. The most foundational of these is evolution. They are all flexible (changeable) over time, responding to the demands and opportunities of their surrounding environments, but can only do so within the constraints and capabilities of the others.
This brings us back to the concept of systems theory. In this case, we can think of brain > mind > culture as a system. Each is connected to, reliant upon, and, to some extent, overlapping with the others within a holistic, functional system. Trying to understand one without the others is as limited as studying computer design but not software coding: you get part of the picture but not how the whole computer system works. In regard to our current topic, mental health and mental illness, it is important that we consider not just how the brain is embedded within the mind or how the mind is embedded within culture but how all three are embedded within a natural environment, which itself is always changing. The entire system is thus involved in the process of evolution. The question then becomes, what are the historical roots of the human experience of mental health and mental illness within the context of this unfolding evolutionary process?

COGNITIVE ARCHEOLOGY

This is a very important but very difficult question to answer. There is very little concrete data to answer it. Instead, there are clues and guesses. They lead to theories, which come to us from archeology, specifically cognitive archeology (Mithen, 1996; Renfrew, 2007; Renfrew & Bahn, 2016). This subfield seeks to understand the origins and development of human thinking and symbolic behavior. It focuses on what we can discover about ancient people’s cognition, values, belief systems, and resulting political and social structures, as well as other behavioral dynamics. In other words, it uses material evidence to try to identify symbolic structures and subsequent adaptive behavior. Because very little of it can be linked specifically to mental health or illness, I will provide a brief overview of cognitive archeology, focusing on where linkages to ideas about mental health and mental illness are most appropriate.
The tectonic phase of development, focused on the evolution of the mind and culture, began in earnest around 100,000 years ago, produced a profound emergence of consciousness during the Paleolithic Period, and then went into hyperdrive closer to 12,000 years ago. That was when early humans entered what is generally known as the Neolithic Revolution. While it looked different in various places, it typically involved the transition from a hunter-gatherer lifestyle to a more sedentary, village-based life, with agriculture, animal domestication, more sophisticated use of fire, more permanent buildings, production of pottery and art, and the development of more complex religious beliefs and practices. Renfrew and Bahn (2016) point out that sedentism (the practice of living in one place for a long time) was actually a necessary precursor for many of these developments, including intergroup trade, much larger cultural groups, more complex hierarchies and other social groupings, and individual specialization. All of these changes worked together to provide better common defense and food production. This gave the groups who moved in the direction of sedentism a distinct advantage not only over the various challenges of the natural environment but over other human groups. More and more over time, human groups who practiced sedentism would enjoy tremendous success and represent an increasing percentage of the human population.
In regard to cognitive changes, the developments around that time likely included more extensive and sophisticated language, use of symbols, and self-consciousness as well as more complex and individually differentiated self-identities based on one’s place in the ec...

Table of contents

  1. Cover
  2. Half Title
  3. Title Page
  4. Copyright Page
  5. Dedication Page
  6. Contents
  7. Introduction
  8. 1 Prehistory
  9. 2 The Mythological Era
  10. 3 The Classical Era
  11. 4 The Middle Ages
  12. 5 The Renaissance
  13. 6 The Enlightenment
  14. 7 The Modern Age
  15. 8 The Postmodern Era
  16. 9 The Twenty-First Century
  17. 10 Now and Into the Future
  18. References
  19. Index