Psychology

Operant Conditioning

Operant conditioning is a learning process in which behavior is strengthened or weakened by the consequences that follow it. It relies on reinforcement and punishment to shape behavior. The concept, developed by B. F. Skinner, has been widely applied in fields including education, parenting, and therapy.

Written by Perlego with AI assistance

10 Key excerpts on "Operant Conditioning"

  • Psychological Criminology
    eBook - ePub

    Psychological Criminology

    An Integrative Approach

    • Richard Wortley(Author)
    • 2023(Publication Date)
    • Routledge
      (Publisher)
    According to the Operant Conditioning model, behaviour is shaped by the consequences it produces. The consequences of behaviour can be rewarding (i.e., reinforcing) or unpleasant (i.e., punishing). Behaviour that is rewarded will be repeated while behaviour that is punished will be avoided. Skinner did not reject classical conditioning, but he regarded it as playing a relatively restricted role in learning. While classical conditioning explains behaviour as a reaction to the environment, Operant Conditioning explains behaviour as an action – or operation – upon the environment. While classical conditioning is concerned with what comes before behaviour, Operant Conditioning is largely concerned with what comes after behaviour. While classical conditioning is concerned with involuntary reflexes and physiological responses, Operant Conditioning is concerned with volitional behaviours. And while classical conditioning portrays organisms as passive receivers of learning, Operant Conditioning views organisms as active participants in learning.

    Principles of Operant Conditioning

    Skinner (1953) demonstrated the principles of Operant Conditioning in a series of experiments involving pigeons and rats. The general procedure for investigating the reinforcement of behaviour is as follows. A pigeon is placed in a box – sometimes referred to as a Skinner box – which contains a disc that, when pecked, delivers food from a chute below. Left to its own devices and through accidentally pecking the disc from time to time, the pigeon eventually learns that each time the disc is pecked it will be rewarded with food. The behavioural response (R) produces a rewarding stimulus (S^R). Next a red/green light is introduced into the Skinner box. Now the food is delivered only when the disc is pecked and the light is red. The red light signals the availability of the reward
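The discrimination procedure described above, where a peck produces food only when the light is red, can be sketched as a toy simulation. Everything here (the function name, trial count, and fixed pecking probability) is illustrative rather than taken from the excerpt:

```python
import random

def skinner_box_session(trials=1000, p_peck=0.5, seed=0):
    """Toy simulation of a discriminated operant: pecks are reinforced
    only when the light is red, never when it is green."""
    rng = random.Random(seed)
    pecks = {"red": 0, "green": 0}
    rewards = {"red": 0, "green": 0}
    for _ in range(trials):
        light = rng.choice(["red", "green"])  # light colour varies across trials
        if rng.random() < p_peck:             # pigeon emits a peck
            pecks[light] += 1
            if light == "red":                # contingency: food only under red
                rewards[light] += 1
    return pecks, rewards
```

Because the contingency is wired this way, every reinforced peck occurs under the red light and none under green, which is what makes the red light a signal for the availability of the reward.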
  • Operant Conditioning
    eBook - ePub

    Operant Conditioning

    An Experimental Analysis of Behaviour

    • Derek E. Blackman(Author)
    • 2017(Publication Date)
    • Routledge
      (Publisher)
    This research is essentially the systematic development of the principles which have already been introduced to the reader in the earlier parts of this book. Operant Conditioning is an experimental analysis of the ways in which the behaviour which is emitted rather than elicited by external stimuli may nevertheless be a function of environmental events. Much of this approach to psychology has been achieved by the exhaustive study of the ways in which the emitted behaviour of animals in experimental situations comes under the control of different arrangements of reinforcers. The typical Operant Conditioning experiment specifies the form of behaviour which is to be associated with a reinforcer, rather than allowing the reinforcer to have coincidental relationships with any pattern of behaviour. To this extent, the experiments are similar to those discussed in the previous chapter. However, instead of developing interesting tricks, a pattern of behaviour is sought which it is possible to record easily and which the subject may emit as frequently as he 'wishes'. This has resulted in the detailed study of rather arbitrary patterns of behaviour which happen to be convenient both to experimenter and subject. In the case of rats and monkeys, this is to be found in the form of a press on a lever; with pigeons, a peck on a disc. So an experiment may investigate the effects of reinforcers which are dependent on a rat's lever-presses, these effects being expressed in terms of the frequency with which the lever is pressed. Of course, one is not suggesting that wild rats in their 'real' world go about pressing levers. However, it is argued that these patterns of behaviour may be taken as a model for other patterns of more 'normal' activity. The lever-press and the peck at a disc are operants, because this behaviour cannot be readily identified as being elicited by a preceding stimulus in the way that acid precedes salivation
  • Behavior Analysis and Learning
    eBook - ePub

    Behavior Analysis and Learning

    A Biobehavioral Approach, Sixth Edition

    • W. David Pierce, Carl D. Cheney(Authors)
    • 2017(Publication Date)
    • Routledge
      (Publisher)
    Reinforcement and Extinction of Operant Behavior (Chapter 4)
    1. Learn about operant behavior and the basic contingencies of reinforcement.
    2. Discover whether reinforcement undermines intrinsic motivation.
    3. Learn how to carry out experiments on Operant Conditioning.
    4. Delve into reinforcement of variability, problem solving, and creativity.
    5. Investigate operant extinction and resistance to extinction.
    A hungry lion returns to the waterhole where it has successfully ambushed prey. A person playing slot machines wins a jackpot and is more likely to play again than a person who does not win. Students who ask questions and are told “That’s an interesting point worth discussing” are prone to ask more questions. When a professor ignores questions or gives fuzzy answers, students eventually stop asking questions. In these examples (and many others), the consequences that follow behavior determine whether it will be repeated.
    Recall that operant behavior is said to be emitted (Chapter 2). When operant behavior is selected by reinforcing consequences, it increases in frequency. Behavior not followed by reinforcing consequences decreases in frequency. This process, called Operant Conditioning, is a major way that the behavior of organisms is changed on the basis of ontogeny or life experience (i.e., learning). It is important, however, to recognize that Operant Conditioning, as a process, has evolved over species history and is based on genetic endowment. Biologically, operant (and respondent) conditioning as a general behavior-change process is based on phylogeny or species history. In other words, organisms whose behavior changed on the basis of consequences encountered during their lifetimes were more likely to survive and reproduce than animals that did not evolve such a capacity. Adaptation by operant learning is a mechanism of survival that furthers reproductive success.

    Operant Behavior

    Operant behavior is sometimes described as intentional, free, voluntary, or willful. Examples of operant behavior include conversations with others, driving a car, taking notes, reading a book, and painting a picture. From a scientific perspective, operant behavior is determined and lawful and may be analyzed in terms of its relationship to environmental events. Formally, responses that produce a change in the environment and increase in frequency due to that change are called operants. The term operant comes from the verb to operate and refers to behavior that operates on the environment to produce effects or consequences, which in turn strengthen the behavior. The consequences of operant behavior are many and varied and occur across all sensory dimensions. When you turn on a light, dial a telephone number, drive a car, or open a door, these operants result in visual clarity, conversation, reaching a destination, or entering a room. A positive reinforcer
  • Behavior Analysis
    eBook - ePub

    Behavior Analysis

    Foundations and Applications to Psychology

    • Julian C. Leslie, Mark F. O'Reilly(Authors)
    • 2016(Publication Date)
    • Psychology Press
      (Publisher)
    However, it remains true that it is often possible to use the reduction of a basic drive, such as the presentation of water for a thirsty animal or food for a hungry one, as a reinforcing operation, and that for practical purposes reinforcers are often "transsituational", that is, effective in many situations. This is particularly true with human behavior, which is strongly affected by conditioned reinforcement, which will be described in Section 2.9.

    2.6 The Simple Operant Conditioning Paradigm

    The matters that we have been discussing in this chapter are variously referred to in the literature of psychology as simple selective learning, trial-and-error learning, effect learning, instrumental learning, instrumental conditioning, operant learning, and Operant Conditioning. We prefer to use the term simple Operant Conditioning for the situation where a reinforcing stimulus is made contingent upon a response that has a nonzero frequency of occurrence prior to the introduction of reinforcement. If a reinforcing stimulus is contingent upon a response, that stimulus will be presented if and only if the required response has been made. Formally, the simple Operant Conditioning paradigm is defined as follows. Each emission of a selected behavioral act or response is followed by the presentation of a particular stimulus. If this arrangement results in an increase in response frequency (relative to operant level and relative to other behavior occurring in the situation), in the incorporation of the response into a behavioral loop, and in the narrowing of the topography of the response, then we say that the selected behavior is an operant response, that the stimulus functions as a reinforcer for that operant, and that what occurred was Operant Conditioning. The reinforcement contingency can be represented diagrammatically as R → S^R, where the arrow stands for "leads to" or "produces"
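The formal paradigm above, a stimulus presented if and only if the response occurs, with conditioning inferred from a rise in response frequency over operant level, can be rendered as a minimal sketch. The learning rule (each reinforcer nudges response probability upward) and all of the numbers are assumptions made for illustration:

```python
import random

def simple_operant_conditioning(baseline_trials=200, training_trials=200,
                                p0=0.1, increment=0.02, seed=1):
    """Sketch of the simple operant conditioning paradigm: the reinforcer is
    delivered if and only if the target response occurs, and conditioning is
    inferred when response frequency rises above operant level."""
    rng = random.Random(seed)
    p = p0
    # Operant level: baseline response frequency with no reinforcement.
    operant_level = sum(rng.random() < p for _ in range(baseline_trials)) / baseline_trials
    # Reinforcement phase: every response produces the stimulus, which
    # (by assumption) raises the probability of responding again.
    responses = 0
    for _ in range(training_trials):
        if rng.random() < p:
            responses += 1
            p = min(1.0, p + increment)   # effect of the contingent reinforcer
    trained_rate = responses / training_trials
    return operant_level, trained_rate
```

Comparing the trained rate against the operant level mirrors the excerpt's criterion: only if responding rises above its pre-reinforcement baseline do we call the stimulus a reinforcer and the episode Operant Conditioning.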
  • Learning
    eBook - ePub

    Learning

    A Behavioral, Cognitive, and Evolutionary Synthesis

  • With respect to reinforcers, what is an establishing operation and why is this concept important? How is this related to Premack’s research illustrating that the opportunity to engage in a behavior can serve as a positive reinforcer for other behaviors?
  • Describe the basic procedure of instrumental (operant) conditioning. Be able to describe the following variations in terms of the reinforcers (positive or negative) and the contingencies (positive or negative):
    1. positive reinforcement
    2. negative reinforcement
    3. omission training
    4. punishment by application
    What are the effects of these procedures on behavior?
  • How can we demonstrate that a behavior is due to Operant Conditioning and not Pavlovian conditioning? Give an example.
1 Hilgard and Marquis (1940) introduced the term instrumental conditioning to refer to those procedures where the learned behavior was “instrumental” in bringing about the occurrence of the consequence. Skinner (1937, 1938) introduced the term operant to designate behaviors that act or “operate” upon the environment to produce certain effects. At one time, the terms instrumental learning and instrumental conditioning were reserved for studies of learning in mazes and other apparatuses in which the subjects had to move from one location to another to obtain a reward or to escape aversive stimulation. The term Operant Conditioning was used for those procedures in which an individual performed some action in a specified location to obtain a reward or avoid aversive stimulation. Today, many people use these terms interchangeably. In this book, the term Operant Conditioning will be used to designate behavior-outcome procedures in general, but studies of learning in mazes will still be referred to as instrumental learning.
2 In the literature, outcomes are also termed reinforcers, consequences or rewards.
  • Learning and Behavior
    • James E. Mazur, Amy L. Odum(Authors)
    • 2023(Publication Date)
    • Routledge
      (Publisher)
    CHAPTER 5 Basic Principles of Operant Conditioning (DOI: 10.4324/9781003215950-5)

    Learning Objectives. After reading this chapter, you should be able to
    1. describe Thorndike’s Law of Effect and experiments on animals in the puzzle box
    2. discuss how the principle of reinforcement can account for superstitious behaviors
    3. describe the procedure of shaping and explain how it can be used in applied behavior analysis
    4. explain B. F. Skinner’s free-operant procedure, three-term contingency, and the basic principles of Operant Conditioning
    5. define instinctive drift and explain why some psychologists believed that it posed problems for the principle of reinforcement
    6. define autoshaping and discuss different theories about why it occurs.

    Unlike classically conditioned responses, many everyday behaviors are not elicited by a specific stimulus. Behaviors such as walking, talking, eating, drinking, working, and playing do not occur automatically in response to any particular stimulus. In the presence of a stimulus such as food, an animal might eat or it might not, depending on the time of day, the time since its last meal, the presence of other animals, and so on. Because it appears that the animal can choose whether to engage in behaviors of this type, people sometimes call them “voluntary” behaviors and contrast them with the “involuntary” behaviors that are part of unconditioned and conditioned reflexes. Some learning theorists state that whereas classical conditioning is limited to involuntary behaviors, Operant Conditioning influences our voluntary behaviors. The term voluntary may not be the best term to use because it is difficult to define in a precise, scientific way, but whatever we call nonreflexive behaviors, this chapter should make one thing clear: Just because there is no obvious stimulus preceding a behavior, this does not mean that the behavior is unpredictable
  • Behavior Analysis for School Psychologists
    • Michael I. Axelrod(Author)
    • 2017(Publication Date)
    • Routledge
      (Publisher)
    Vargas (2013), among others, described this relationship (i.e., antecedent-behavior-consequence [A-B-C]) as the three-term contingency. It can be used to understand behavior and is essential when designing intervention and instructional programs. Skinner (1953) introduced the world to Operant Conditioning via his work with animal models. Conducting experiments with pigeons and rats, Skinner determined that organisms emitted behavior that is then shaped by the environment through the environment’s delivery of consequences. In his classic experiments using the Skinner Box and rats, Skinner paired food, a consequential event, with the accidental hitting of a lever. Over time, the rats began hitting the lever to receive food. The food or the consequence for hitting the lever reinforced or strengthened the rats’ lever-hitting behavior. Essentially the rats ‘learned’ that hitting the lever produced food. In a second series of experiments, Skinner sent an aversive electrical shock through the cage. He rigged the device so that the electrical shock, another consequential event, would be turned off only when the rat hit the lever. Over time, the rats began hitting the lever to turn off the electrical shock. Turning off the electrical shock or the consequence for hitting the lever reinforced or strengthened the rats’ lever-hitting behavior. Again, the rats ‘learned’ that hitting the lever produced something good but instead of producing food, hitting the lever produced an escape from the aversive electrical shock.

    Positive and Negative Reinforcement

    These experiments helped illustrate the two types of reinforcement. The first type, positive reinforcement, involves the organism receiving something (food) following engagement in a specific behavior (hitting the lever). Said differently, the food positively reinforces the lever-hitting behavior because it increases the rats’ future engagement in the lever-hitting behavior
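The contrast between the two contingencies in Skinner's lever experiments can be summarized in a short sketch; the function and its dictionary keys are hypothetical, not the author's notation. Both contingencies strengthen lever-pressing; they differ in whether a stimulus is added (food) or removed (shock):

```python
def consequence(lever_pressed, contingency):
    """Return the environmental outcome of a trial under each contingency."""
    if contingency == "positive":
        # Positive reinforcement: food is delivered when the lever is pressed.
        return {"food_delivered": lever_pressed, "shock_on": False}
    if contingency == "negative":
        # Negative reinforcement: the shock stays on until the lever is pressed.
        return {"food_delivered": False, "shock_on": not lever_pressed}
    raise ValueError(f"unknown contingency: {contingency}")
```

Note that in both cases the press is followed by an improvement in the animal's situation (food appears, or shock ends), which is why both count as reinforcement rather than punishment.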
  • Learning and Memory
    eBook - ePub

    Learning and Memory

    Basic Principles, Processes, and Procedures, Fifth Edition

    • W. Scott Terry(Author)
    • 2017(Publication Date)
    • Routledge
      (Publisher)
    B. F. Skinner began to develop techniques, terminology, and principles of learning by reinforcement. Skinner’s entire system of conditioning is called operant learning. Skinner first developed a small experimental chamber in which to condition animals such as rats or pigeons (Skinner, 1938). This “Operant Conditioning chamber” allows precise experimental control over the presentation of discriminative stimuli and reinforcers, and the recording of responses. (The term Skinner box is generally used today; see Figure 4.1.) A contingency is arranged between an operant response, for instance, pressing a handle or bar, and a reinforcer, usually a small round food pellet delivered through a chute. Skinner coined the label operant response to indicate that the subject’s response operates on the environment to produce a certain outcome. The bar-press-to-food contingency should lead to an increase in bar pressing, known technically as positive reinforcement (and informally as reward training). Once conditioned, operant responses can be extinguished. In extinction, the reinforcer is withheld, which should lead to a decrease in the frequency of responding. (Skinner [1956] later recounted how he came to invent the operant bar-press task. He was running rats in a straight alley, in which rats ran from the start end to the goal end for food reward. Skinner soon tired of retrieving them and wondered, why bother having the rat go somewhere to obtain the reinforcer? Why not let the rat stay in one place and do something else? And so the bar-press response was invented.) Figure 4.1 B. F. Skinner and a Rat in an Operant Conditioning Chamber (a.k.a. the “Skinner box”). Source: From A History of Modern Psychology, 6th ed. (p. 299), by D. P. Schultz and S. E. Schultz, 1996, Fort Worth, TX: Harcourt Brace. Reprinted courtesy of B. F
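The sequence described here, a bar-press-to-food contingency followed by extinction once the reinforcer is withheld, can be sketched as a toy trial-by-trial simulation; the learning-rate numbers and probability caps are invented for illustration:

```python
import random

def acquisition_and_extinction(seed=2):
    """Toy bar-press simulation: response probability rises while pressing
    produces food (positive reinforcement), then falls once the reinforcer
    is withheld (extinction)."""
    rng = random.Random(seed)
    p = 0.05                             # low initial operant level
    history = []
    for trial in range(400):
        reinforcement_on = trial < 200   # food available only in phase 1
        if rng.random() < p:             # the rat presses the bar
            if reinforcement_on:
                p = min(0.95, p + 0.05)  # reinforced press strengthens responding
            else:
                p = max(0.01, p - 0.05)  # unreinforced press weakens responding
        history.append(p)
    return history
```

The trajectory shows the two phases Skinner's chamber made easy to study: response probability climbs while the contingency is in force and declines after the reinforcer is withheld.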
  • Human Behavior in the Social Environment
    eBook - ePub

    Human Behavior in the Social Environment

    Theories for Social Work Practice

    • Bruce A. Thyer, Catherine N. Dulmus, Karen M. Sowers(Authors)
    • 2012(Publication Date)
    • Wiley
      (Publisher)
    Chapter 3 Operant Learning Theory Stephen E. Wong
    What formative experiences in the social and physical world are overlooked while trying to explain human behavior primarily based on internal psychological and neurological processes?

    Historical and Conceptual Origins

    The earliest studies of operant learning can be traced back to the research of psychologist E. L. Thorndike with cats in puzzle boxes (Kimble, 1961). In Thorndike's experiments, hungry cats had to escape from boxes fastened shut in different ways to obtain food. Thorndike observed that after being placed in the boxes, the cats engaged in various behaviors such as pacing, visually exploring, and scratching at the walls. The animals performed these responses until they accidentally pressed the latch, pulled the string, or did something else that opened the box. On successive trials, the cats spent more time examining and scratching at the latch or the string, while the other responses gradually dropped out. Finally, the animal would perform the correct behavior as soon as it was placed in the box. Thorndike explained the learning of this new behavior with his “law of effect”: In situations where responses are followed by events that give satisfaction, those responses become associated with and are more likely to recur in that situation.
    B. F. Skinner, another American psychologist, greatly refined the experimental apparatus that permitted the study and conceptualization of operant learning. The “Skinner Box,” a chamber with a lever that could be programmed to deliver food following lever presses, provided several improvements over Thorndike's puzzle boxes. One advantage was that the relationship between lever presses and food delivery was arbitrary and could be readily manipulated by the experimenter. This allowed for the study of a wide range of variables, such as the ratio of responses to food deliveries, the time interval between responses that would produce food, and variations in stimuli that signaled the opportunity to earn food
  • Principles of Behavior
    • Richard W. Malott, Kelly T. Kohler(Authors)
    • 2021(Publication Date)
    • Routledge
      (Publisher)
    Part II

    Operant Conditioning

    Passage contains an image

    CHAPTER 2

    Operant Conditioning for Dummies (Part I)

    Behavior Analyst Certification Board 5th Edition Task List Items
    B-2. Define and provide examples of stimulus and stimulus class.* Throughout
    B-3. Define and provide examples of respondent and Operant Conditioning. Throughout
    B-4. Define and provide examples of positive and negative reinforcement contingencies. Pages 22–28
    B-6. Define and provide examples of positive and negative punishment contingencies. Pages 28–31
    B-9. Define and provide examples of operant extinction. Pages 32–33
    G-1. Use positive and negative reinforcement procedures to strengthen behavior. Pages 22–28
    G-15. Use extinction. Pages 32–33
    G-16. Use positive and negative punishment (e.g., time-out, response cost, overcorrection). Pages 28–31

    Back in the Day

    OK, so way back in the 1930s, like there’s this kid, just graduated from one of them ritzy eastern colleges, Hamilton College. And now he was setting off to become a famous writer, like an author, like a big deal. But he was pretty smart, and after a year of trying to become a famous author, he was smart enough to realize that wouldn’t happen.
    So he went to one of them ritzy eastern universities, Harvard University. And what did he do there? He put a rat in a box. You know, like Pavlov and his boys put a dog in a harness; well, this guy put a rat in a box. His name was Burrhus, the kid’s, not the rat’s. I don’t know what kind of parents would name their kid Burrhus, but this kid was smart enough to have his buddies call him by his middle name, Fred. Oh yes, and the rat’s name was Rudolph, Rudolph the Rat.
    So Fred put Rudolph in the box and also stuck a lever in the box. And every time Rudolph pressed the lever, Fred would give him a little pellet of food. And I’ll bet you can guess what happened next—the rat started pressing the lever about as fast as he could, until he was full; then he’d stroll over to the corner and take a nap. No big deal, right? But guess what happened to Fred.
  • Index pages curate the most relevant extracts from our library of academic textbooks. They’ve been created using an in-house natural language model (NLM), each adding context and meaning to key research topics.