Being Brains

Making the Cerebral Subject

Fernando Vidal and Francisco Ortega

About This Book

Being Brains offers a critical exploration of one of the most influential and pervasive contemporary beliefs: "We are our brains." Starting in the "Decade of the Brain" of the 1990s, "neurocentrism" became widespread in most Western and many non-Western societies. Formidable advances, especially in neuroimaging, have bolstered this "neurocentrism" in the eyes of the public and political authorities, helping to justify increased funding for the brain sciences.

The human sciences have also taken the "neural turn," and subspecialties in fields such as anthropology, aesthetics, education, history, law, sociology, and theology have grown and professionalized at record speed. At the same time, dubious but successful commercial enterprises such as "neuromarketing" and "neurobics" have emerged to take advantage of the heightened sensitivity to all things neuro. Skeptics have only recently begun to react to the hype, invoking warnings of neuromythology, neurotrash, neuromania, and neuromadness.

While this neurocentric view of human subjectivity is neither hegemonic nor monolithic, it embodies a powerful ideology that is at the heart of some of today's most important philosophical, ethical, scientific, and political debates. Being Brains critically explores the internal logic of such ideology, its genealogy, and its main contemporary incarnations.

ONE
Genealogy of the Cerebral Subject
What “Is” the Cerebral Subject?
It may well be that nobody believes they literally are their brain. But when influential people proclaim it, we must take them at their word. Together with the brain in a vat, brain transplantation is one of the favorite thought experiments of philosophers of personal identity (Ferret 1993).1 It is usual to observe that if the brain of A were transplanted into the body of B, then A would gain a new body, rather than B a new brain. Commenting on that commonplace, Michael Gazzaniga (2005, 31), a leading neuroscientist, serenely asserted: “This simple fact makes it clear that you are your brain.” Yet what we have here is neither a fact nor anything simple; it is a profession of faith. The neurophilosopher Paul Churchland “carries in his wallet a colour picture of his wife. Nothing surprising in it,” remarks the sociologist Bruno Latour, “except it is the colour scan of his wife’s brain! Not only that,” he continues, “but Paul insists adamantly that in a few years we will all be recognizing the inner shapes of the brain structure with a more loving gaze than noses, skins and eyes!” (Latour 2004, 224). Gazzaniga, Churchland, and many others who make similar claims express a widespread belief.2 So widespread indeed, that saying, as the New York Times cultural commentator David Brooks did in June 2013, that “the brain is not the mind” immediately generates a flutter of suspicion about a religious and antiquated—even reactionary—dualistic antineuroscience backlash as well as self-confident reassertions of the assumption that “the mind is what the brain does” (Brooks 2013, Marcus 2013, Waldman 2013). The examples could be multiplied.
What is at stake here? Neither science nor ascertainable facts but an idea of the human being, the anthropological figure of the cerebral subject—an “ideology” in the plain sense of a set of notions, beliefs, values, interests, and ideals. Like any ideology, this one offers varieties and internal debates and inspires practices that are not necessarily compatible. Yet there is unity in diversity, so that the cerebral subject allows for a fairly unequivocal characterization, and even for a sort of formula: “Person P is identical with person P* if and only if P and P* have one and the same functional brain” (Ferret 1993, 79).3 To have the same brain is to be the same person, and the brain is the only part of the body we need in order to be ourselves. As the philosopher Roland Puccetti (1969, 70) memorably put it: “Where goes a brain, there goes a person.” Puccetti was not saying that a person is his or her brain but that insofar as the brain is the physical basis of personhood, one cannot be separated from the other. The brain is the somatic limit of the self, so that, as regards the body they need to be persons, humans are specified by the property of “brainhood” (Vidal 2009a), that is, the property or quality of being, rather than simply having, a brain.
Now we must go beyond definitions and ask, first, whether there are any real, concrete cerebral subjects and, second, what magnitude (from hegemonic to inconsequential) the brainhood ideology may actually be said to have. As a first approximation, there is one answer to both questions, and it is: It depends. Yes, real people can see themselves as cerebral subjects and behave accordingly—but not necessarily all the time. The weight of the ideology depends on contexts and criteria.
The reason for thinking in terms of a “subject” is that views about what humans essentially are go hand in hand with concrete decisions about how to study them and how to treat them, and these decisions implicate processes of “subjectivation” (Foucault 1983; “subjectification” is sometimes also used). These are processes involved in the production of ways of being, in forms of reflexivity and “technologies of the self” (Foucault 1988); they make individuals what they are and contribute to shape their behavior and experience. In our case, then, they are processes whereby people think of themselves and others as primarily determined by their brains—and act, feel, and believe accordingly.4 Individuation and subjectivation are rooted in sociohistorical contexts and, as we shall see, do not exclude the coexistence of different anthropological figures: cerebral selves, psychological selves, chemical selves, and others.
At the individual level, cerebral subject is not a label that can be permanently affixed to anyone but is rather a way of denoting notions and practices that may be operative in people’s lives some of the time. In practice, no one conception of the human is monolithic or hegemonic in a given culture, and persons are not one kind of subject alone. For example, the developmental biologist Scott F. Gilbert (1995) contrasted four biological views of the body/self—the neural, immunological, genetic, and phenotypic—and put them in correspondence with different models of the body politic and different views of science. He thus highlighted how political debates mirror disputes over which body, and consequently which self, are the true body and self. “Immune selfhood” has a very rich history of its own (Tauber 2012), but writing in the mid-1990s, Gilbert noted that the genetic self had been recently winning over the other selves. These may be theoretical constructs, but they have real consequences. Thus, as Gilbert points out, in controversies over abortion, the self may be defined genetically (by the fusion of nuclei at conception), neurally (by the onset of the electroencephalographic pattern or some other neurodevelopmental criterion), or immunologically (by the separation of mother and child at birth). In each case, when affected by concrete medical decisions, individuals accomplish the “self” whose definitional criteria were used to reach the decisions.
Thus, it makes sense to refer to a “genetic self” when people’s life and self-concept are largely defined by genetic conditions or by genetic testing, screening, and treatment (e.g., Peters, Djurdjinovic, and Baker 1999). Individuals are unlikely to reduce themselves and others to their genetic makeup. However, scientific authorities may suggest such a reduction in statements epitomizing beliefs that permeate a research field, inspire its quest, legitimize its promises, nourish expectations, and orient policy. This was the case when James D. Watson, the codiscoverer of the structure of DNA, uttered for Time an assertion that has been quoted hundreds of times: “We used to think our fate is in our stars. Today we know, in large measure, our fate is in our genes” (Jaroff 1989). The oracular claim was supposed to be universally valid, independently of particular individuals’ sense of self. By the time the Human Genome Project was completed in 2004, the gene had long been a cultural icon; the HGP itself participated in the hype that the sociologists of science Dorothy Nelkin and M. Susan Lindee (1995) called the “DNA Mystique”—one that involved a basic posture of genetic essentialism and offered an overly optimistic picture of the future clinical applications of genetic research (Hubbard and Wald 1993).
In spite of the increasing convergence of neuroscience and genomics, by the late 1990s the brain had largely supplanted the genome as the source of foundational explanations for human features and behaviors as well as the source of scientific hype. Such a shift may appear justified. Since the brain and the nervous system seem more directly relevant than genetics to many of the philosophical and ethical questions raised by the Western philosophical tradition, including issues of personal identity, they are more likely to be felt as constitutive of one’s self. Some occasions may prompt or sustain such a special relation. Thus, while people with genetic afflictions have been observed to “hiss and boo at pictures of genes or enzymes that cause these afflictions,” sufferers of mental illnesses react to brain images of patients diagnosed with depression or schizophrenia with “care and concern,” as if the image represented both the affliction and “the suffering of the afflicted” (Dumit 2003, 44–45).
As we shall see, such differences in attitude, as well as the precedence of brain over genes as far as human individuality is concerned, have deep roots in the history of notions of personal identity. Yet, again, this does not mean that brainhood is hegemonic. For example, on the basis of ethnographic research in a neuro-oncology clinic, the sociologist of science Sky Gross (2011) shows that while most brain tumor patients admit that the brain is the seat of “who they are,” they tend to consider it as just another diseased organ. We must insist on this point, to which we return below, because there has been concern about the empirical accuracy and the interpretive traction of “totalising accounts of the neurological as determining subjectivity, as if the brain is the epicentre of personhood” (Pickersgill, Cunningham-Burley, and Martin 2011, 362).
Notions such as cerebral subject, brainhood, or neurochemical self are not meant to suggest that a neurobiological perspective dictates views of subjectivity always and absolutely but that, in some times and contexts, it effectively does, occasionally at a very large scale. The sociologist Nikolas Rose’s example for neurochemical selves is the well-documented fact that millions of people around the world have come to think about sadness “as a condition called ‘depression’ caused by a chemical imbalance in the brain and amenable to treatment by drugs that would ‘rebalance’ these chemicals” (Rose 2003, 46; see here Chapter 3). However, as with “genetic self,” it should be obvious that, in real life, everyday ontologies (in the loose sense of mainly implicit “theories about being”) coexist, both inside a society and within a single individual. We shift registers in our ways of acting, experiencing, and interacting as well as thinking and speaking about ourselves and others, and this is why psychotherapies and antidepressants can live happily together, if perhaps not “ever after.”
The coexistence of such ontologies and their related practices corresponds to what happens in the diachronic and historical dimension. When a phenomenon or area of knowledge is neurologized, it does not ipso facto cease to be what it previously or otherwise was. For example, in the neurobics industry examined below, “brain jogging” simply translates into training the mind, and the exercises proposed are basically the same as those long peddled to improve mental capacities. Nevertheless, when these exercises are relabeled neurobics, they realize the ideology of the cerebral subject. It may be a superficial instantiation of that ideology, where the neuro is no more than a marketing gimmick. That, however, does not abolish the fact that what is sold and bought belongs to a neuro business based on people believing (or at least being told) that they are essentially their brains.
In a medical context, individuals may share a condition but not its interpretation. For example, in her study of bipolar disorder patients, the anthropologist Emily Martin (2009) describes the clash between a dominant reductionist model and the individuals who challenged the idea that neurobiology sufficed to explain their experience. Grassroots diversity thus coexists with a more homogenous official discourse. As is well known, much of psychiatry, including scientists at the head of major national mental health agencies, asserts that there are no mental diseases, only brain diseases. Different consequences could follow—one being an emphasis on pharmacological medication and a restriction of access to psychotherapies, with a huge impact on people’s lives. A development such as the neurodiversity movement (Chapter 3 here) can only happen in a world where “mental disorders” have been redefined as “brain disorders that primarily affect emotion, higher cognition and executive function” (Hyman 2007, 725). In such a context, psychiatric patients are approached mainly as cerebral subjects, and this may contribute to modulating their self-understanding and how they live their lives.
However, the neuroscientific consensus does not automatically translate into public consent, and research confirms commonsense intuitions about the variety and coexistence of views and practices of the self. Emily Martin (2010, 367) noted that the uptake of brain-based explanations outside the neurosciences and in the wider public is “uneven” and that there is no full takeover by “a newly dominant paradigm.” Such heterogeneity exists side by side with the development of brain-centered interventions in medicine, in the workplace, and in schools—interventions that may take place independently of how particular individuals understand themselves.
The sociologist Martin Pickersgill and his colleagues (2011) investigated how people draw on neuroscience and neuro ideas to articulate self-understanding. Working with patients suffering from epilepsy, head injury, and dementia as well as with neuroscientists and other professional groups (teachers, counselors, clergy, and foster care workers), they showed that individuals turn their attention to (popular) neuroscience mainly after some kind of neurological event, for example, a brain hemorrhage. This contingent interest, however, does not imply attributing to neuroscience an absolute capacity to define or explain subjectivity. Overall, attitudes are governed by pragmatism and personal relevance; rather than altering notions and practices of the self, neuroscientific concepts “seemed to simply substantiate ideas already held by individuals.” The brain thus emerges “as an object of mundane significance,” which sometimes helps one understand oneself but is “often far from salient to subjective experience” (Pickersgill, Cunningham-Burley, and Martin 2011, 358, 361–362). Using online questionnaires with Dutch adults diagnosed with ADHD, the sociologists Christian Broer and Marjolijn Heerings (2013) also noticed that although those individuals were interested in neurobiological explanations, they did not reduce their condition to a brain phenomenon. In the framework of the Dutch tradition of public debate and dissent over mental health issues, neurobiology did not colonize subjectivity and was invoked in different ways: as explanation or excuse but also as opening the possibility of governing the self “in the name of the brain” (Rose and Abi-Rached 2013, 8). The same study documented parallel discourses of self-regulation that did not rely on “brain talk” (Broer and Heerings 2013, 61). In Canada, adults diagnosed with major depression or bipolar disorder were asked about the potential role of neuroimages in stigma mitigation, moral explanations of mental illnesses, and the legitimation of psychiatric symptoms. The resulting interviews show the complex and ambivalent ways in which individuals integrate brain-based notions of mental disorders into their self-understanding; some assumed neurobiological explanations of their disorder yet struggled against pharmaceutical treatments (Buchman et al. 2013).
Studies with other populations produce similar results. Adolescents’ explanations of their own behaviors and mental health issues emphasize personal, familial, and social contexts, rarely incorporating the brain or biology (Choudhury, McKinney, and Merten 2012). This may be partly attributable to a lack of information. When informed, however, teens do not refuse to include biological factors in their understanding of adolescent behavior. Rather, confronted with an overwhelmingly negative view of the “teenage brain” as defined by the incapacity to exert control over high-risk pleasure-seeking behaviors or by a deficit in the synchronization of cognition and affect (e.g., Steinberg 2008), they call for neuroscience to contribute to a positive view of their stage of life and, in any case, do not generally see behavior in purely biological terms. In turn, on the basis of conversations with three groups (undiagnosed, diagnosed with ADHD and medicated, or diagnosed but not medicated), Ilina Singh (2013) described how children, including those in the latter two groups, did not subordinate their I to brain-based explanations but tended to depict the role of the brain in their lives in ways that emphasized personal agency. She thus confirmed that encounters with neuroscientific discourses or technologies do not necessarily cerebralize subjectivity. Similarly, fieldwork in a laboratory conducting fMRI research with children diagnosed with ADHD, learning disabilities, autism, and Tourette syndrome documented how subjects “appropriate lab-based descriptions of neurological difference to their own purposes, claiming a positive identity for themselves,” and how “the effects of laboratory research and the metaphors used to describe them may serve expansive purposes in the practices of those who see their subjectivity embedded in research findings” (Rapp 2011, 3, 22).
In a review published in 2013, Cliodhna O’Connor and Helene Joffe examined the empirical evidence for three frequent claims: that neuroscience fosters a conception of the self based in biology, that it promotes conceptions of individual fate as predetermined, and that it attenuates the stigma attached to particular social categories. They concluded that “claims that neuroscience will dramatically alter people’s relations with their selves, others and the world are overstated. In many cases, neuroscientific ideas have been assimilated in ways that perpetuate rather than challenge existing modes of understanding” (O’Connor and Joffe 2013, 262). Such bricolage will not surprise historians, who are used to the intertwining of continuities and discontinuities. These findings are nonetheless valuable for deflating fantasies about the subjective impact of the neuro and thus for disrupting “over-theorised accounts of the impact of ideas about the brain on personhood” (Pickersgill, Cunningham-Burley, and Martin 2011, 362).
A lot of this sociological literature has referred to our ideas about brainhood and the cerebral subject. We are thankful for such references but must also point out some misconceptions. One of us (FV) has been described as “one of the most outspoken critics of a cultural hegemony of the ‘neuro’ ” (Besser 2013, 48). However, arguing that the neuroscientific level of explanation is not always the most appropriate or questioning claims that the neurosciences will radically alter our view of the human is not the same thing as maintaining that the neuro is hegemonic. Another misinterpretation concerns the level at which the neuro exerts its power. Notions of the self and identity are not limited to self-conceptions, which is what the sociological research we just mentioned is about. When, to give just o...

Table of contents

  1. Cover
  2. Half Title
  3. Series Announcement Page
  4. Title Page
  5. Copyright
  6. Dedication
  7. Contents
  8. To Begin With
  9. 1. Genealogy of the Cerebral Subject
  10. 2. Disciplines of the Neuro
  11. 3. Cerebralizing Distress
  12. 4. Brains on Screen and Paper
  13. “Up for Grabs”
  14. Acknowledgments
  15. Notes
  16. Bibliography
  17. Index
  18. Series Page