1 / The Sociology of the Mind
Why do we eat sardines yet never goldfish, ducks yet never parrots? Why does adding cheese make a hamburger a "cheeseburger" whereas adding ketchup does not make it a "ketchupburger"?1 And why are Frenchmen less likely than Americans to find snails revolting? By the same token, how do we come to regard gold as more precious than water? How do we figure out which of the things that are said at a meeting ought to be included in the minutes and which ones are to be considered "off the record" and officially ignored? And how do we come to "remember" things that happened long before we were born?
In its present state, cognitive science cannot provide answers to any of these questions. In order to even address them, we may very well need an altogether new vision of "the mind."
When we think about thinking, we usually envision an individual thinker: a chess player analyzing his opponent's last move, a scientist designing an experiment, an old man reminiscing about his childhood, a young girl trying to solve a mathematical problem. This vision, so powerfully captured by Auguste Rodin in his statue The Thinker, is a typical product of modern Western civilization, which practically invented individualism. Since the late seventeenth century, it has been bolstered by the "empiricist" theories of knowledge developed by the British philosophers John Locke and George Berkeley, who posited a blank mind, a tabula rasa, upon which the world impresses itself through our senses.
Yet while cognitive individualism2 still dominates the popular vision of thinking, modern scholarship strongly rejects such a highly personalized view of the mind. Aside from some small pockets of individualistic resistance in philosophy, economics, and psychoanalysis, few students of the mind today base their general vision of thinking on the image of a solitary thinker whose thoughts are a product of his or her own unique personal experience and idiosyncratic outlook on the world. In fact, if scientists were to study idiosyncratic thought patterns that apply only to particular individuals, we probably would not even consider their findings "scientific."
The rise of modern cognitive science3 coincides with the decline of the Romantic vision of the individual thinker and a growing interest in the non-personal foundations of our thinking. Inspired by René Descartes's and Immanuel Kant's "rationalist" visions of innate mental faculties that precede our sensory experience of the world and even condition the way we actually organize it in our heads, most cognitive scientists today reject Locke's and Berkeley's visions of an a priori empty mind.4 The move away from empiricism toward rationalism has placed reason instead of experience at the heart of the process we call "thinking." More important, however, it has also meant substituting the human for the individual as the primary locus of cognition.
It is hard not to notice the dramatic shift of attention from the idiosyncratic to the universal in the modern study of the mind. It is our cognitive commonality as human beings, rather than our uniqueness as individual thinkers, that is at the center of the study of cognition today, and modern theories of the mind typically play down our cognitive idiosyncrasies, highlighting instead what we share in common. As evident from the fact that the theoretical agendas of Noam Chomsky and Jean Piaget still dominate much of modern linguistics and developmental psychology, this trend is most visibly epitomized in the current interest in the common constitution of our verbal apparatus as well as the seemingly universal process of our cognitive development.
Cognitive universalism is clearly the dominant vision of the mind in modern cognitive science, much of which revolves around the search for the universal foundations of human cognition. Even psychologists, philosophers, linguists, and students of artificial intelligence who do not study the brain itself nonetheless claim to explore the way humans think. As evident from their general indifference to their research subjects' biographical background, most cognitive scientists today assume a universal, human mind.
It is certainly such universalistic sensitivity that allows cognitive scientists to unravel the universal foundations of human cognition. It is precisely their concern with our cognitive commonality that has helped neuroscientists, psychologists, linguists, and students of artificial intelligence to discover universal patterns in the way we form concepts, process information, activate mental "schemas," make decisions, solve problems, generate meaningful sentences from "deep" syntactic structures, access our memory, and move through the various stages of our cognitive development. Yet it is precisely this commitment to cognitive universalism that is also responsible for what is probably cognitive science's most serious limitation. While it certainly helps cognitive scientists produce a remarkably detailed picture of how we are cognitively "hard wired," it also prevents them from addressing the unmistakably non-universal mental "software" we use when we think.
Thus, their almost exclusive concern with our cognitive commonality as human beings prevents cognitive scientists from even addressing major cognitive differences that do not result from any fundamental biological differences such as those between normal adults and children, the brain damaged, the senile, or the mentally retarded. This presents the modern science of the mind with a very serious problem since, unlike the way we typically contrast human and animal (or adult and infant) cognition, we certainly cannot attribute the difference between the ancient Roman and present-day Italian visions of the universe (or between the ways liberals and conservatives view art), for example, to any major difference in their genetic makeup or the physiology of their brains.
It is hardly surprising, therefore, that some rather critical aspects of our thinking are still largely ignored by cognitive science. After all, with the exception of cultural anthropologists and cross-cultural psychologists, most modern students of the mind tend to ignore differences in the way we think: differences not only among individuals but also among different cultures, social groups, and historical periods. As a result, few cognitive scientists today would even consider addressing, for example, the difference between the ways in which gender is conceptualized in California and in Yemen, in which Catholics and Buddhists (or peasants and academics) envision God, or in which most Europeans viewed disease in the early thirteenth century and today. Nor, for that matter, can they help us understand why we reckon time in terms of hours and weeks and associate doves with peace. Such intellectual blind spots certainly leave us with less than a truly comprehensive science of the mind.
When my daughter was six, we had our first talk about what she should do if anyone ever tried to abduct her. The very next morning she proudly recounted to me a dream she had that night about precisely such an attempt, which in fact failed because she managed to apply the skills I had taught her only the day before. Wasn't she lucky, she added, that she happened to learn those skills just hours before she needed to use them for the first time! I have told this story to many people and discovered that they almost all find it amusing. Yet there was nothing inherently funny about my daughter's remark. In fact, very few people, if any, would have considered it funny only a hundred years ago, prior to the publication of Sigmund Freud's The Interpretation of Dreams,5 which totally transformed the way we think about our dreams.
At the same time, however, while this should certainly remind us that the things we find amusing are not inherently (and therefore universally) funny, we should also recognize that what we are seeing here are more than just a bunch of unrelated individuals with some peculiar sense of humor that somehow happens to be shared by most of their contemporaries yet, for some odd reason, by no one older than their grandparents. In a similar vein, when we notice that many Americans find the idea of eating snails revolting, we should recognize that what we are seeing is more than just a random collection of individuals with some peculiar phobia that somehow happens to be shared by so many of their compatriots yet, for some odd reason, by only a few French.
The problem with cognitive science is that, except for work produced by cultural psychologists, cognitive anthropologists, and lately some developmental and social psychologists, it has thus far largely ignored the social dimension of cognition. A truly comprehensive science of the mind must also include a sociology of thinking6 that, by focusing specifically on the sociomental,7 would complement the efforts of psychology, linguistics, the neurosciences, and artificial intelligence to provide a complete picture of how we think.
Despite a long history of almost totally ignoring sociology, cognitive scientists need to be more open to what cognitive sociology8 can offer them. Like the other cognitive sciences, it certainly tries to stay away from our cognitive idiosyncrasies, yet whereas psychology or linguistics dwell almost exclusively on our cognitive commonality as human beings, cognitive sociology also highlights major differences in the way we think. In other words, it tries to explain why our thinking is similar to as well as different from the way other people think.
There are three rather distinct levels of analysis one can use for approaching cognition given the fact that we think both (a) as individuals, (b) as social beings, and (c) as human beings. Whereas cognitive individualism naturally addresses only the first of those three levels, cognitive universalism basically confines itself to the third. Each, therefore, is somewhat limited in its scope. In addressing the middle level, which covers the entire range between thinking as an individual and as a human being (thereby including, for example, thinking as a lawyer, as a German, as a baby boomer, as a Catholic, and as a radical feminist), cognitive sociology thus helps avoid the reductionistic tendencies often associated with either of those two extremes.
Only an integrative approach that addresses all three levels, of course, can provide a complete picture of how we think.9 While cognitive individualism may certainly shed light on the particular mnemonic techniques I use to remember the password to my electronic mail account and cognitive universalism may best explain how I generally store information in my brain, only a sociology of memory can possibly account for how I remember the Crimean War. By the same token, whereas in order to understand how we differentiate "figures" from their surrounding "ground" we clearly need a psychology of perception, only a sociology of perception can account for a culture's general tendency to perceive children as resembling their mothers more than their fathers.
In highlighting the social aspects of cognition, cognitive sociology reminds us that we think not only as individuals and as human beings, but also as social beings, products of particular social environments that affect as well as constrain the way we cognitively interact with the world.10 In probing the social underpinnings of the mental, it thus brings to the foreground a largely neglected dimension of our thinking, the full implications of which cognitive science has yet to explicitly address. As such, it should certainly be an indispensable component of a truly comprehensive science of the mind.
Drawing upon a long sociological tradition most illustriously represented by Emile Durkheim, Karl Mannheim, George Herbert Mead, and Alfred Schutz, cognitive sociology recognizes the fact that we do not think just as individuals. Like the other cognitive sciences, it strongly rejects the extreme individualistic vision of the absolutely original solitary thinker, reminding me, for example, that it is not as an individual but as a product of a particular social environment that I dismiss the fundamentalist account of the current AIDS epidemic as sheer nonsense, and that if my ten-year-old son already knows that the earth is round and that the world is made up of atoms it is only because he happens to live in the twentieth century. It also helps remind me that the way I think about death, God, or sex, for example, is remarkably similar to the way so many other twentieth-century Westerners happen to think about those matters.
Recognizing our cognitive commonality entails rejecting the Romantic vision of some "mental Robinson Crusoe" and remembering that even Crusoe, though far from any other Europeans, was actually still thinking and acting in an unmistakably eighteenth-century British manner. It also entails abandoning Locke's and Berkeley's cognitive empiricism and realizing that perceiving works of art as "Postimpressionist" or "primitive" has very little to do with our senses and everything to do with the impersonal, social categories into which we typically force our personal experience. Furthermore, it means noticing that we also think about a lot of things that we have not experienced personally. Engraved in my mind are not only sensory impressions of the letters I now see on my computer screen and the sound of my printer, but also the ideas of Darwin and Rousseau, whom I will never meet, as well as memories of the voyages of Columbus and Verrazano, which took place more than four hundred years before I was born. In short, I experience the world not only personally, through my own senses, but also impersonally, through my mental membership in various social communities.
Most of this, of course, attests to the ubiquitous role of language in our lives. Whereas perception alone would inevitably confine me to a strictly sensory experience of the world, language allows me to process reality conceptually and thereby also bypass my senses. In marked contrast to the absolutely personal nature of sensory perception, language is highly impersonal.11 When I use words such as "loyalty," "arrogance," "authentic," or "alienated," for example, I am using unmistakably social ideas which ...