Psychology

Visual Perception

Visual perception refers to the process by which the brain interprets and makes sense of visual information received from the eyes. It involves the organization, identification, and interpretation of visual stimuli to understand the surrounding environment. This process encompasses various aspects such as depth perception, color perception, and motion perception.

Written by Perlego with AI-assistance

Key excerpts on "Visual Perception"

  • Companion Encyclopedia of Psychology
    • Andrew M. Colman (Author)
    • 2018 (Publication Date)
    • Routledge (Publisher)
    3.1 Fundamental Processes in Vision
    Peter C. Dodwell, Queen's University, Ontario, Canada
    1. Nativism and empiricism
    2. The physiology of the visual system
    3. The visual world: space and object perception
    4. Perceptual illusions
    5. Gestalt psychology
    6. Gibson's perceptual theory
    7. Perceptual plasticity and learning
    8. The nature of perceptual learning
    9. Binocular vision and depth perception
    10. Conclusion
    11. Further reading
    12. References
    Reading a book, hearing a familiar song, recognizing a friend's face: all are characteristic acts of perception which occur so effortlessly that we take them for granted. Yet the study of perception is a major field in modern psychology, and one that is full of new and interesting challenges. In order to understand the processes of seeing we have to understand the nature of the physical events that give rise to perception, the physiological processes that record them, and the psychological abilities of the perceiver that make sense of them.
    Perception is the primary process by means of which we obtain knowledge of the world: it has been estimated that more than 80 per cent of it is accounted for by vision. Certainly the visual system is by far the most thoroughly studied of the senses (conventionally five are recognized: sight, hearing, taste, touch, and smell) and the best understood. Perception is a skill, or set of skills, not simply the passive recording of external stimulation (Gibson, 1966). A perceiving organism is more like a map-reader than a camera. What we so easily accept in perceiving and understanding the world involves complex processes at many levels. Psychological research on seeing extends all the way from the study of the electrical activity of single cells in the eye or brain, to colour vision, the perception of objects and events, learning to read, and understanding the complexity of an air traffic controller's video console.
  • Essential Cognitive Psychology
    2 Visual Perception
    When I was a student I remember attending a lecture about the human visual system. Midway through the lecture a visiting professor stood and walked to the podium. Addressing the speaker he said, “Look, I can see, I’ve walked up here, what more is there to know?” This was a profoundly ignorant statement which totally failed to appreciate the scientific challenge posed by the study of perception—in this case the visual system. The things around us do not automatically indicate to us what they are; our perception of the world is built up by internal processes which operate on an initial input that is far removed from what our sense organs initially register. Vision, for example, begins with a two-dimensional image on the retina but ends up as a three-dimensional scene in which there is depth, colour, movement, and so on. Similarly, hearing begins with the mechanical stimulation of cochlear hair cells by sound waves but what we hear is sufficient to allow us to appreciate the complex sounds of continuous speech.
    How then do we perceive the world? The prevailing view is to consider the various forms of perception as instances of an information processing system. It is proposed that perception begins with various analyses of the initial sensation which become progressively more complex until a percept is formed. A percept is the internal representation derived from the initial pattern of stimulation and it is this that serves as the basis for subsequent identification processes, i.e. determining what an object looks like, sounds like, smells like, and so on.
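    The staged account above lends itself to a schematic illustration. The following minimal Python sketch is purely illustrative and not drawn from the text: the stage names, the toy features and the tiny "memory" are all assumptions, chosen only to show how successive analyses could turn a raw input into a percept and then an identification.

        # Illustrative only: a toy "information processing" pipeline in which
        # successive stages transform a raw input into a percept and then a label.

        def sense(raw_image):
            # Stage 1: register the proximal stimulus (here, a 2-D array of intensities).
            return raw_image

        def analyse_features(sensation):
            # Stage 2: extract simple features (toy example: mean brightness and size).
            return {"brightness": sum(map(sum, sensation)) / (len(sensation) * len(sensation[0])),
                    "height": len(sensation),
                    "width": len(sensation[0])}

        def form_percept(features):
            # Stage 3: combine features into an internal representation (the "percept").
            return {"shape": "tall" if features["height"] > features["width"] else "wide",
                    "bright": features["brightness"] > 0.5}

        def identify(percept, memory):
            # Stage 4: match the percept against stored knowledge to name the object.
            return memory.get((percept["shape"], percept["bright"]), "unknown")

        memory = {("tall", True): "lamp post", ("wide", False): "park bench"}
        image = [[0.9, 0.8], [0.9, 0.7], [0.8, 0.9]]          # a 3 x 2 "retinal" input
        print(identify(form_percept(analyse_features(sense(image))), memory))  # -> 'lamp post'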
    Forms of perceptual process
    We have, as you all know, five senses—vision, hearing, touch, smell and taste. Cognitive psychologists have been very uneven in the time they have devoted to the study of our senses. Most work has been carried out on our visual system because of the dominant role it plays in communication. Next comes hearing, followed by touch, smell and taste. Research into hearing has been quite substantial owing to the need to understand speech perception but the other three senses have received relatively little attention. However, research into touch (often known as haptic perception) has received considerable impetus from its relevance to communication aids for the blind. Smell and taste, although subject to some investigation within cognitive psychology, have been of more interest to physiologists. In this book we will be primarily concerned with visual perception (this chapter) and with hearing in relation to speech perception (Chapter 10
  • Television Aesthetics: Perceptual, Cognitive and Compositional Bases
    • Nikos Metallinos (Author)
    • 2013 (Publication Date)
    • Routledge (Publisher)
    1 Visual Perception Principles: Defining the Visual Field of the Television Screen
    Any attempt to define the visual field of the television screen is futile unless the main theories, the organs, and the processes involved in visual perception are thoroughly defined and clearly understood. The physiology of the human perceptual organs, mostly the eyes, the ears, and the brain, and their specific functions are discussed in detail in Part II of this book. This chapter briefly examines only the basic anatomy of the eyes to explain how we perceive televised images. Specifically, this chapter introduces the following topics as they relate to the perception of television images: (a) basic approaches to perception, (b) visual stimulation (which includes also the visual sensory processes), (c) the perceptual process of television images, (d) the perception of elements within the visual field (which expands the discussion to include light and color), and (e) the perception of holographic and three-dimensional visual displays.
    BASIC APPROACHES TO VISUAL PERCEPTION
    Perception, in general, is a process in which objects and events in the environment are received by the sensory organs as stimuli. These organs then organize, codify, and relay the stimuli to the brain, where they are turned into structural perceptions, or cognitive units. The normal operation of this process depends on various key factors such as the nature of the stimuli, heredity, memory, and learning. Perception, therefore, is a product of both physiological and psychological processes.
    In visual perception we usually look at the external world (the visual world) to assign meaning to a variety of sensory impulses. We attempt to organize these impulses to identify and understand them. Depending on how familiar we are with the environmental stimuli, we ask, first, what the form of the particular stimulus is. We then try to define its depth and location. Finally, in our effort to determine its nature in relation to the environment, we wonder what the stimulus is doing: whether it is stationary or in motion.
  • Neuroergonomics: A Cognitive Neuroscience Approach to Human Factors and Ergonomics
    • A. Johnson, R. Proctor (Authors)
    • 2013 (Publication Date)

    2 Cognitive Neuroergonomics of Perception
    Jacob Jolij, Addie Johnson and Robert W. Proctor
    Perception is the process of transforming sensory input into internal representations to guide cognition and action. Understanding this process is vital for designers of information systems and interfaces. After all, to understand what people do with information, it is necessary to know what information is available to them perceptually. Traditionally, perception, cognition and action have been treated as fairly independent processes in which the perceptual systems—located in the sensory cortices of the occipital (vision), temporal (vision and audition) and parietal (vision and somatosensation) lobes—carry out their respective tasks and pass information to higher, cognitive areas located in the frontal lobe. That way of conceptualizing perception, however, has been challenged by developments in the cognitive neuroscience of perception, which suggest that lower-level processes in the perceptual systems receive feedback from higher-level processes. Indeed, some have gone so far as to propose that what we think, know and feel may change the way we perceive the world by directly altering our perceptual processing (e.g. Stefanucci et al., 2011). Moreover, perceptual processing is highly dynamic: perceptual learning occurs on a continuous basis, and the different senses show crosstalk in which vision informs audition, touch informs vision and so forth (e.g. Grahn et al., 2011).
    Neuroimaging technologies such as functional magnetic resonance imaging (fMRI) and new algorithms to analyse electroencephalography (EEG) data on a single-trial level allow us to view the workings of the perceptual systems with ever-increasing accuracy. Using fMRI-decoding, a technique for deriving mental states from looking at brain activity, for example, it is possible to deduce what a person is viewing, thus allowing the researcher to read the ‘mind’s eye’ (Miyawaki et al., 2008; Tong & Pratte, 2012). Novel analysis methods are now being employed to achieve similar things using EEG, with some degree of success (Bobrov et al., 2011). The possibility of determining in such a direct manner what people are processing can be expected to open up new possibilities for their interactions with computers and other machines.
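    As a rough illustration of the decoding idea, the sketch below trains a linear classifier to guess which of two viewing conditions produced a simulated pattern of voxel responses. The data are synthetic and the use of scikit-learn is an assumption for illustration; it is not how any particular study cited above was analysed.

        # Toy fMRI-style decoding: predict the viewed category from simulated voxel patterns.
        # Synthetic data only; real studies use preprocessed BOLD responses per trial.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        n_trials, n_voxels = 200, 50
        labels = rng.integers(0, 2, size=n_trials)                    # 0 = faces, 1 = houses (assumed labels)
        signal = np.outer(labels - 0.5, rng.normal(size=n_voxels))    # category-specific voxel pattern
        voxels = signal + rng.normal(scale=1.0, size=(n_trials, n_voxels))  # add measurement noise

        decoder = LogisticRegression(max_iter=1000)
        scores = cross_val_score(decoder, voxels, labels, cv=5)       # cross-validated accuracy
        print(f"decoding accuracy: {scores.mean():.2f} (chance = 0.50)")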
  • Cortical Functions
    • John Stirling (Author)
    • 2020 (Publication Date)
    • Routledge (Publisher)
    7 Visual mechanisms and perception
    Introduction · Sensation and perception · Sensory processing: from eye to brain · Colour vision · Perceptual processes · Object recognition: the WHAT stream and agnosia · Spatial functions and the WHERE stream · Specific disorders of spatial processing · Evaluation of spatial perception and the WHERE stream · Summary
    Introduction
    Of all the senses, vision, in the view of many people, is the most remarkable. Think for a minute of the processing that is required as you read this page. The lines and angles of print reflect light into your eyes. The light excites cells in the retina, which send nerve impulses deep into the brain. From here, the ‘neural messages’ undergo several further stages of processing. There are separate cortical regions to deal with colour and movement, and additional regions to coordinate reading, object recognition and probably facial recognition too. Yet if you close your eyes for a moment, turn round a few times, then open them again, your view of the world is, to all intents and purposes, instantaneous and effortless!
    Sensation and perception
    To simplify matters, I will distinguish between the sensory mechanisms of vision, and the perceptual processes which permit recognition of the visual input. ‘Visual sensation’ is about input ‘getting registered’ in the brain. Perception is concerned with the interpretation of the stimulus. To understand the former we need to know a little about the structure of the eye, and the route that visual input takes from the retina to the occipital cortex. To understand the latter (or perhaps begin to understand, since so much more is yet to be learned by psychologists), we will consider some research findings from case studies of people who have lost certain perceptual functions, usually after damage or disease to key cortical regions
  • Attention, Perception and Memory: An Integrated Introduction
    Chapter 4 Visual Perception and memory: Making sense of the visual environment

    • A walk in the park
    • The problem for perception
    • Biological bases of Visual Perception
      • Knowing what, where and how
      • The binding problem
    • Cognitive aspects of Visual Perception
    • Depth perception—binocular cues
      • Stereopsis
      • Random dot stereograms
      • Binocular rivalry: The role of attention in perception
    • Pictorial or monocular cues to depth, size and distance
      • Familiar size
      • Occlusion and texture
      • Linear perspective
    • Other cues
      • Shape and shading
      • Depth from motion
      • How do we know what moves?
    • The constructivist approach to perception
      • The importance of context
    • The ecological approach to perception
    • The visuo-spatial sketch pad
    • Summary
      • Self-assessment questions
    • Further reading

    A walk in the park

    As you walk through the park you see the grass stretching out into the distance in front of you with trees and plants distributed in it. Your impression is not of a flat picture, but of a three-dimensional space within which you can move. There are children playing in the distance and a jogger running toward you. As you continue walking, you notice birds picking through the litter under the trees: they are well camouflaged; it is not until they move that you can pick out their shapes. As you walk along you spot something on the ground. Initially it is not clear what it is, then you realise it is a milk carton that has fallen at an odd angle.

    The problem for perception

    The processes involved in visual perception enable us to act and react to the visual environment safely and accurately. We need to know what things are and where things are, and where we are in relation to them. The problem for visual perception is to make sense of the sensory data that is detected as patterns of light falling on the retina. The retina is a two-dimensional, flat surface, yet we perceive the world in three dimensions: how is this achieved? Some perceptual processing is a direct outcome of the biological and physiological nature of the visual system, whereas other perceptual processes involve the use of knowledge gained from experience with the visual world. Together with attentional and memory processes, perceptual processing gives rise to our experience of the visual objects and events around us. Although we are only concerned with vision in this chapter, it is important to remember that many objects in the environment have perceptual properties from other modalities, in that they possess auditory, tactile and other sensory properties as well. In later chapters we shall examine hearing and touch and cross-modal effects in perception. So, another problem for perception is to combine information about the properties of objects and the environment.
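    One well-studied part of the answer is binocular disparity: the two eyes view a point from slightly different positions, and the resulting offset between its two retinal images shrinks as the point gets further away. The sketch below is a minimal illustration under an idealised pinhole-eye model; the focal-length and eye-separation values are assumptions for illustration, not figures taken from the text.

        # Depth from binocular disparity under a simple pinhole model (illustrative values).
        # Z = f * B / d : nearer points produce larger disparities between the two eyes' images.

        def depth_from_disparity(disparity_m, focal_m=0.017, baseline_m=0.065):
            """Estimated distance (m) from retinal disparity, focal length and eye separation."""
            return focal_m * baseline_m / disparity_m

        for disparity_mm in (1.0, 0.1, 0.01):                  # disparity on the retina, in mm
            z = depth_from_disparity(disparity_mm / 1000.0)
            print(f"disparity {disparity_mm:5.2f} mm -> roughly {z:6.2f} m away")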
  • Visual Perception: An Introduction, 3rd Edition
    • Nicholas Wade, Mike Swanston (Authors)
    • 2013 (Publication Date)
    • Psychology Press (Publisher)
    This chapter provides an overview of central issues in the study of visual perception, many of which will be discussed in more detail in later chapters. It is important to understand the functions that any visual system must perform if there is to be coordinated, effective action, and the problems of devising explanations for how this comes about. If perception is to be explained, appropriate measurements of its characteristics must be obtained, and related to the information potentially available from the physical environment. Each of these issues contributes to the general framework of ideas that guides the investigation of vision.

    Functions of Visual Perception

    We all enjoy contemplating the experiences provided by our senses, and much of our language is associated with describing them. In human cultures considerable effort is devoted to enhancing perceptual experiences by decorating our bodies and our surroundings and by producing artefacts (like pictures) to stimulate the senses and to channel our contemplations. With so much emphasis on extending our perceptual experiences it is tempting to think of their function as enabling us to enjoy and describe them. In evolutionary terms the function of perception is much more mundane – it is to enable us to interact with the objects in the world surrounding us. More specifically, we use our perceptions to guide our behaviour. We use vision to determine the location of objects with respect to us, so that we can approach them, grasp them, cast them aside, or avoid them as appropriate for our survival. Some objects, like food, will be particularly significant for our sustenance, and we learn how to recognise them from all sorts of positions. Perceiving the location of objects and recognising them is achieved when we are still or moving, or if the objects themselves move. Accordingly, we need to be able to distinguish between static and moving objects whether we ourselves are static or moving.
    Action and recognition
    As we have emphasised above, vision does not occur in isolation, although we often describe it as though it does. It is integrated with the other senses and also with the movements we make. The actions we perform are based on vision and vision, in turn, is influenced by the actions we perform. This can be illustrated in terms of what you are doing while reading this text, whether from a traditional book, an e-reader tablet or a computer monitor. The surface you are reading from is likely to be in a fixed position and appear stationary. This is the case even though the reading surface and the text on it are moving over your eyes as a consequence of eye movements. We now know quite a lot about the ways the eyes move when reading text and also when viewing scenes: they tend to remain relatively still for a few tenths of a second (called fixation) and then flick rapidly (or saccade) to a new location, with this sequence repeated about three times every second. Despite these jerky movements of the text over your retinas the experience is of a stable surface with stable text. For this to happen there must be an intricate internal integration between the pattern of stimulation on the retinas and the signals for contracting the eye muscles. The situation becomes even more complex when we take gross movements of the body into account; the head can move with respect to the trunk and the whole body can locomote through space.
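    As a rough illustration of how fixations and saccades are separated in practice, the sketch below applies a simple velocity threshold to a synthetic gaze trace; the sampling rate, the trace itself and the 30 degrees-per-second threshold are all assumptions for illustration, not details from the text.

        # Velocity-threshold labelling of gaze samples as fixation or saccade.
        # Synthetic 1-D gaze trace sampled at 500 Hz; the threshold value is an assumption.
        import numpy as np

        sample_rate = 500.0                                     # samples per second
        gaze_deg = np.concatenate([np.full(150, 2.0),           # ~0.3 s fixation at 2 deg
                                   np.linspace(2.0, 10.0, 20),  # rapid 8 deg shift (saccade)
                                   np.full(150, 10.0)])         # ~0.3 s fixation at 10 deg

        velocity = np.abs(np.gradient(gaze_deg)) * sample_rate  # deg per second
        is_saccade = velocity > 30.0                            # simple velocity threshold

        n_saccade_samples = int(is_saccade.sum())
        print(f"saccade samples: {n_saccade_samples} "
              f"(~{1000 * n_saccade_samples / sample_rate:.0f} ms of the trace)")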
  • Landscape: Pattern, Perception and Process
    • Simon Bell (Author)
    • 2012 (Publication Date)
    • Routledge (Publisher)
    Language is one of the main uses of sound, but we tend to convert words into images before we can make full use of them. Sight, by comparison, not only permits a much greater variety of information to be received, but it leads directly to the means by which we think and express ourselves. In this respect, the sensory input about the world is much more than mechanical reception of data, later processed by the brain as a separate activity. For example, at the same time as we perceive the world, we also project our subjective feelings and preconceptions onto it. This is why concepts such as ‘landscape’ or ‘wilderness’ are as much states of mind as they are physical entities; this fact has major implications for aesthetics, and for the meaning of the term ‘environment’ (see Chapter Three).
    The eye may, in some respects, be constructed and function like a camera but, being directly connected to the brain, it also interacts with the mental processes that use it. Most of us rely so much on a constant flow of sensory information that if deprived of it, perhaps by being shut in an empty, dark and soundless room for a period, the mind would start to supply images to fill the void. The extreme results of this would be hallucinations. Hence, it is clear that perception is not just the passive reception of information.
    Blind people are excluded from the perception of the world available to the sighted. However, they compensate by developing the other senses, especially touch, to much greater degrees than most people. Similar compensations occur for those who cannot hear or speak. The extent to which the ‘mind’s eye’ exists for blind people, especially those blind from birth, is difficult for sighted people to appreciate.
    The rest of this chapter will concentrate on vision, because it is so important as a perceptual and thinking medium for most people. In order to understand visual perception, we must know something of the physiology and psychology of the mechanisms involved. Recent research is uncovering much about the detailed processes of image formation and the way in which the brain cells build up an image of the world. An extremely helpful synthesis has been written by the British team of Vicki Bruce, Patrick Green and Mark Georgeson, from which much of the following discussion is sourced.
  • BIOS Instant Notes in Cognitive Psychology
    • Jackie Andrade, Jon May (Authors)
    • 2004 (Publication Date)
    • Taylor & Francis (Publisher)
    optic flow patterns. In contrast, proponents of the information processing approach (e.g. Gregory) argue that perception involves top-down processes of hypothesis formation, in which memory and expectations help us interpret sensory information from the outside world, as well as bottom-up processes of receiving and organizing sensory information. The goal of perception is to identify and categorize objects in the environment, creating meaningful and useable internal representations of the external world (see Section E).
    We only become aware of a small amount of the information that bombards our senses. If you are concentrating on reading this sentence, you probably will not have noticed the sensation of your clothes against your skin or the buzz of background traffic. Much research and debate has focused on the cognitive processes by which the vast amount of incoming information is reduced to the small amount of important information that we need to function. Even the very early stages of sensation and perception involve filtering incoming light and sound information to extract the most informative portions (e.g. edges in a visual scene). Selective attention processes aid this information filtering (see Section C). Attended information is further reduced by forgetting, leaving meaningful and useful information in working and long-term memory (see Section D).
    Sperling’s (1960) partial report technique provided evidence for the brief persistence in memory of sensory information that has survived perceptual filtering processes. Sperling used a tachistoscope (a device allowing brief and accurately timed presentations of stimuli) to present arrays of letters for about 50 msec. Although participants could usually remember only about four of twelve letters in an array, they typically reported seeing more letters. Sperling hypothesized that information about all the letters persisted after the stimulus was removed, but did not last long enough for all the letters to be reported. He tested his hypothesis by playing a tone after presenting the letter array, to tell participants which row of letters to report. When the tone was played immediately after the letter array, participants’ performance indicated that around nine letters were available for mental inspection and report, although performance declined dramatically if there was a delay between the letters and the tone. Sperling’s findings suggested that detailed visual information persists for about half a second in iconic memory. The equivalent auditory store is called echoic memory (see Topic D2
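    The partial-report estimate is, at heart, simple arithmetic: if an immediate cue lets participants report about three of the four letters in the cued row, then roughly nine of the twelve letters must still have been available. The sketch below works through that calculation and adds an assumed exponential decay over cue delay; the decay model and its half-second time constant are illustrative, not Sperling's own analysis.

        # Sperling-style partial-report estimate of iconic memory capacity, plus an
        # assumed exponential decay of availability with cue delay (illustrative model).
        import math

        rows, letters_per_row = 3, 4
        reported_in_cued_row = 3.0                      # typical immediate-cue performance (assumed)
        available_now = reported_in_cued_row * rows     # 3 of 4 reported in cued row -> ~9 of 12 available
        print(f"estimated letters available at 0 ms delay: {available_now:.0f} of {rows * letters_per_row}")

        full_report_limit = 4.0                         # what can be reported without a cue
        tau_s = 0.5                                     # assumed decay constant (~iconic memory lifetime)
        for delay_ms in (0, 150, 300, 500, 1000):
            decayed = full_report_limit + (available_now - full_report_limit) * math.exp(-delay_ms / 1000 / tau_s)
            print(f"cue delay {delay_ms:4d} ms -> about {decayed:.1f} letters reportable")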
  • Mind Computation
    • Zhongzhi Shi (Author)
    • 2017 (Publication Date)
    • WSPC (Publisher)
    Chapter 5 Visual Perception
    Visual information processing is the process of finding what the objects are and where they are in a scene from its images. From the input image to the description of the scene there is a huge gap, and a series of information-processing steps is needed to bridge it. Understanding the nature of this process is the key to uncovering the mystery of vision; however, we are far from such an understanding at present. Vision has two main functions. The first is the perception of objects, i.e., what is it? The second is spatial perception, i.e., where is it? There is concrete evidence that different brain systems are involved in these two functions.
    5.1 Visual Cortex Area
    The visual cortex has two main kinds of nerve cells: stellate cells and pyramidal cells. The axons of stellate cells contact the projecting fibres. A pyramidal cell is triangular: its tip, directed towards the surface layer, emits a long dendrite upwards, while its base issues several dendrites that make lateral contacts. The visual cortex, like other cortical areas, comprises six layers of cells, denoted I–VI from the surface to the innermost layer. The trunks of the cortical cells' processes (dendrites and axons) are all oriented perpendicular to the cortical surface, and the branches of the dendrites and axons spread laterally within the different layers. Different cortical areas contact one another through axons running in the deep white matter, while within a cortical area the layers contact one another through the transverse branches of dendrites and axons. In recent years the recognized extent of the visual cortex has expanded to many new cortical areas, amounting to 25, including the parietal lobe, the occipital lobe and part of the frontal lobe [496]. Besides, it has seven visual association areas, which have visual as well as other sensory or motor functions. All visual areas together account for 55% of the area of the neocortex
  • Acquisition and Performance of Sports Skills
    • Terry McMorris (Author)
    • 2014 (Publication Date)
    • Wiley (Publisher)
    The organization, integration and interpretation of sensory information is thought to take place primarily in the prefrontal cortex, but it draws upon the sensory information held in the specific sensory areas of the cortex and information from LTM contained in several areas of the brain. It should, therefore, be of no surprise to find that fMRI and PET studies have shown considerable activation of the prefrontal cortex during perceptual tasks. The parietal cortex has also been shown to play a role in perception and is particularly active in tasks where the individual switches attention from one part of the display to another, for example, a defender in hockey switching between attending to her/his immediate opponent and the runs of other attackers.

    Definition of perception

    Based on the above, we can define perception, according to information processing theory, as being the organization, interpretation and integration of sensory information. Kerr (1982) provides a similar definition but includes the word ‘conscious’. Although information processing theorists would argue that, most of the time, perception is a conscious process, recent research on learning and anticipation has shown that it can take place at a subconscious level.

    Signal detection theory

    As information processing theorists claim that perception is inferred, a number of theories have been developed to explain different aspects of the cognitive processes taking place. One of the first theories was Swets’ (1964; Swets and Green, 1964) signal detection theory. Swets realized that people live in an environment that is full of sensory information. He reckoned that the individual receives over 100 000 signals per second. These may be signals from the environment and/or from within the person themselves. Sport provides many examples of this and the problems it can cause. Think of a tennis player about to serve in a game on Centre Court at Wimbledon. What kinds of signals do you think the player will be receiving visually and auditorally? What kinds of internal signals might the player be receiving: e.g. will I win, will I play well? The problem facing Swets was how to explain how anyone can recognize relevant information against this background of signals, which he termed ‘noise
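    Signal detection theory quantifies this with two numbers: a sensitivity index (d'), which measures how well the signal can be told apart from the background noise, and a criterion (c), which measures the observer's bias towards responding "yes". The sketch below computes both from a pair of invented hit and false-alarm rates; the rates are assumptions for illustration only.

        # Signal detection theory: sensitivity (d') and criterion (c) from hit and
        # false-alarm rates. The rates below are invented for illustration.
        from statistics import NormalDist

        z = NormalDist().inv_cdf                 # inverse of the standard normal CDF

        hit_rate = 0.80                          # P(respond "signal" | signal + noise)
        false_alarm_rate = 0.20                  # P(respond "signal" | noise alone)

        d_prime = z(hit_rate) - z(false_alarm_rate)              # separation of the two distributions
        criterion = -0.5 * (z(hit_rate) + z(false_alarm_rate))   # bias: 0 = neutral, >0 = conservative

        print(f"d' = {d_prime:.2f}, criterion c = {criterion:.2f}")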
Index pages curate the most relevant extracts from our library of academic textbooks. They’ve been created using an in-house natural language model (NLM), each adding context and meaning to key research topics.