B.F. Skinner - Operant or Instrumental Conditioning - Contingency Management.
"Radical Behaviorism" -- The attempt to create a psychology composed entirely of relationships between objectively observable stimuli and objectively observable responses. Note that some of the principles described below involve "covert" components --responses within the organism that cannot be observed from outside. Where that's the case, this summary departs from radical behaviorism. For Skinner, the organism is a "black box" and he claimed not to be interested in what occurs inside it.
Skinner as a psychologist vs. as a philosopher. My own appraisal is that he may well have been one of the three most influential psychologists of the 20th Century (along with Freud and Rogers). His work was careful and largely impeccable. His results have withstood the tests of time and his methods have been widely applied. He and Rogers were characterized as in direct opposition, and debated each other several times, but they agreed completely in their desire to move toward less punitive interpersonal behavior and social arrangements.
In my view, however, he was about as bad a philosopher as he was brilliant a psychologist, as articulated most especially in his book Beyond Freedom and Dignity. He envisioned a society based on positive reinforcement. He never really dealt, however, with the questions of "Who controls the controllers?" or took up issues such as control in the service of unquestioning conformity, etc. As an experimental psychologist, however, he was remarkable. His methods have proven useful in animal training, with populations seldom reached by other means (developmentally disabled, autistic, etc.), and by many "normal" people wanting to behave more effectively in their own lives.
An Operational Definition: Defines a behavior in terms of the actions that must be performed in order to produce or observe it. An adequate operational definition allows different people in different times or places to produce a given behavior and know that it's the same behavior produced or observed by the person who gave them the operational definition of it.
Independent, intervening, and dependent variables. We met these with Tolman. Skinner was not interested in intervening variables, but just in independent and dependent ones.
Operant (instrumental) Conditioning: The modification of response strength by manipulation of the consequences of the response. Responses that are followed by a reinforcer gain in strength; responses not followed by a reinforcer become weaker. The organism operates in the environment to produce change that will lead to reward. A form of learning in which the consequences of the behavior lead to changes in the probability of its occurrence.
A Single-Subject Design. Instead of running many rats, pigeons, or people through a procedure and looking for statistically significant small differences, Skinner looked for manipulations so powerful that they would produce a large and observable change in behavior. To confirm their effect, he often used a reversal design: take a baseline, introduce the manipulation, look at the change in the frequency of the behavior, then remove the manipulation and see if the behavior returns to close to its baseline level, then reintroduce the manipulation to see if the change occurs again.
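The reversal (A-B-A-B) logic can be sketched as a toy simulation; the behavior model and all numbers below are invented for illustration, not data:

```python
import random

random.seed(0)

def daily_frequency(reinforced, base=5, boost=10):
    """Toy model: a reinforced behavior occurs more often per day."""
    rate = base + (boost if reinforced else 0)
    return rate + random.randint(-2, 2)  # day-to-day noise

# A-B-A-B reversal design: baseline, treatment, withdrawal, re-treatment
phases = [("baseline", False), ("treatment", True),
          ("withdrawal", False), ("re-treatment", True)]
results = {}
for name, reinforced in phases:
    days = [daily_frequency(reinforced) for _ in range(7)]  # one week per phase
    results[name] = sum(days) / len(days)
    print(f"{name:>12}: mean {results[name]:.1f} responses/day")

# The frequency rises under treatment, falls back near baseline when the
# manipulation is withdrawn, and rises again when it is reintroduced --
# evidence that the manipulation, not chance, is responsible.
```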
A frequency graph: Put days along the bottom, number of instances up the side, and draw a line for each day to show how many times the observed behavior occurred that day.
A time-and-event grid. Here you divide each day into times (such as hours) and record how many times the act you want to strengthen or weaken occurs in each square of the grid. This may help you see patterns that tell you what cue sets off the behavior, or what reinforcing stimuli strengthen or weaken it.
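As a sketch, the grid can be kept as a simple tally over (day, hour) cells; the observations below are hypothetical:

```python
from collections import Counter

# Hypothetical log: one (day, hour) entry each time the target behavior occurs.
observations = [
    ("Mon", 9), ("Mon", 9), ("Mon", 15),
    ("Tue", 9), ("Tue", 9), ("Tue", 9),
    ("Wed", 9), ("Wed", 15),
]

grid = Counter(observations)  # count per cell of the time-and-event grid

days = ["Mon", "Tue", "Wed"]
hours = [9, 15]
print("      " + "  ".join(f"{h:>4}" for h in hours))
for day in days:
    row = "  ".join(f"{grid[(day, h)]:>4}" for h in hours)
    print(f"{day:<6}{row}")

# A column of high counts (here, 9 a.m.) suggests a time-linked cue
# worth investigating as the antecedent of the behavior.
```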
A-B-C- Refers to Antecedent, Behavior, and Consequence. It is the chain of action in the behavioral model. Observing antecedents and consequences helps you figure out how to set up an effective program for change.
Positive Reinforcement: Any event the occurrence of which increases the probability of a response. (Examples: food, drink, sex, affirmation, attention, appreciation, encouragement.)
Negative Reinforcement: The removal of an aversive stimulus, which increases the probability that the response bringing about the removal will occur again.
Escape Conditioning: A given response results in getting away from an undesired stimulus.
Avoidance Conditioning: A given response keeps an undesired event from occurring.
Punishment- Either removing a positive reinforcer or presenting an aversive stimulus. Skinner viewed punishment as an unreliable way of preventing responses from recurring.
Stimulus Control- An approach that's especially appropriate when we want an already existing behavior to occur more often or less often, or to occur under some circumstances but not others. (e.g., formal clothes serve as a discriminative stimulus, for myself and others, for more formal behavior)
Discriminative Stimulus- A stimulus to which an organism learns to respond as part of stimulus control training. It signals the likelihood that a desired or undesired consequence will follow a response.
Primary Reinforcer- A reward that satisfies a biological need (e.g., hunger, thirst).
Secondary Reinforcer- A stimulus that becomes reinforcing through its association with a primary reinforcer (e.g., money, which allows us to obtain food, a primary reinforcer). Through a process of association and conditioning, we come to respond to secondary reinforcers much as we once responded to the primary reinforcers with which they were paired.
Reinforcers for Specific People- Don't assume that because something is a reinforcer for you, it is a reinforcer for someone else, too. To identify an effective reinforcer, ask the person what they like, or observe what the person values doing. (e.g., when a person has just eaten, he or she is probably "satiated," so food won't be much of a reinforcer)
Menus: A list of rewards that a token can be exchanged for. A "token" is a generalized secondary reinforcer. It can be a poker chip, dots or stars on a chart, or what-have-you.
Immediate vs. Delayed Reinforcement- Immediate reinforcement occurs immediately after the desired or undesired behavior. It has the strongest and quickest effect in controlling behavior. The longer the delay, the less likely the learning.
Intrinsic Reinforcement- Pursuing a goal or action without interest in external or material reward. With intrinsic motivation, people do something because they want to; it satisfies the inner self.
Extrinsic Reinforcement- Behaving in a way in which the only goal is the material or external reward; no personal or inner satisfaction is pursued. Behavior maintained by intrinsic reinforcement tends to be more resistant to extinction.
The Premack Principle- Given two responses that differ in their probability of occurrence, the less probable can be reinforced by using the more probable as a reward. In other words, the chance to do something that you often choose to do can be used to reinforce something that you seldom choose to do. Hats off to David Premack for identifying this contingency.
Covert Reinforcement-Things one says or does to oneself to make oneself feel good or bad. These usually cannot be observed by another person unless they are articulated out loud or include very obvious body language. Covert reinforcers may be positive or negative.
Covert stimuli--events inside the organism that trigger a given behavior.
Chaining- The situation in which one response brings the organism into contact with stimuli that both reinforce that response and signal the next one, so that a sequence of responses becomes linked into a chain.
Schedules of Reinforcement:
(a) Continuous Reinforcement Schedule. The reinforcing of a behavior every time it occurs. (e.g., getting a soda out of a machine every time you put money in it)
(b) Intermittent Reinforcement Schedule. Reinforcing a behavior only sometimes instead of every time it occurs. Intermittently reinforced behavior is slower to establish but more resistant to extinction. The main intermittent schedules are:
(a) Fixed Interval Schedule. A schedule whereby a reinforcer is given at established time intervals. (e.g., a weekly paycheck)
(b) Variable Interval Schedule. A schedule whereby reinforcement is given at varying times, usually causing a behavior to be maintained more consistently.
(c) Fixed Ratio Schedule. A schedule whereby reinforcement is given only after a certain number of responses is made. (e.g., payment by piecework)
(d) Variable Ratio Schedule. A schedule whereby reinforcement occurs after a varying number of responses rather than after a fixed number. (e.g., a slot machine)
Combined Schedules. Two or more of the above schedules can also be combined. Among the most useful are schedules that make reinforcement depend on the rate of responding:
DRO- Differential reinforcement of zero response rate (omission training). Sometimes DRO is read to mean differential reinforcement of "other" behavior--that is, any behavior other than the one we want to get rid of. So if Johnny is engaging in physical violence toward other children, any other response might be reinforced.
DRL- Differential reinforcement of a low rate of responding (e.g., reinforcement only when responses are spaced a fixed interval apart).
DRH- Differential reinforcement of a high rate of responding--the basic principle to use if a person is doing something only occasionally and we want it to occur more often.
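The difference between fixed- and variable-ratio schedules can be illustrated with a small simulation (a sketch with invented parameters; the reinforcer-delivery rules are mine, not Skinner's apparatus):

```python
import random

random.seed(1)

def fixed_ratio(n, responses):
    """FR-n: reinforce exactly every n-th response."""
    return [i % n == 0 for i in range(1, responses + 1)]

def variable_ratio(mean_n, responses):
    """VR-n: reinforce after a varying number of responses averaging mean_n."""
    delivered = []
    until_next = random.randint(1, 2 * mean_n - 1)  # next payoff is unpredictable
    for _ in range(responses):
        until_next -= 1
        delivered.append(until_next == 0)
        if until_next == 0:
            until_next = random.randint(1, 2 * mean_n - 1)
    return delivered

fr = fixed_ratio(5, 100)      # like piecework: pay per 5 units
vr = variable_ratio(5, 100)   # like a slot machine: payoff timing varies
print("FR-5 reinforcers delivered:", sum(fr))  # exactly every 5th response
print("VR-5 reinforcers delivered:", sum(vr))  # unpredictable, ~every 5th on average
```

The unpredictability of the variable-ratio payoff is what makes slot-machine behavior so persistent and so resistant to extinction.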
"Start Where the Behavior Is"- There are some things a person can't learn until certain prior developmental steps have been taken. Don't start where you think the behavior should be. (e.g., you can't toilet train a child until the sphincter muscle can be controlled)
A large reward is more likely to get someone to act as desired, except in the cognitive dissonance situation, where a reward just barely big enough to get the person to act as desired may lead to greater resistance to extinction than a large reward, due to self-justification.
Narrowing- A form of stimulus control in which we reduce the number of different cues that trigger our behavior: bringing a behavior under the control of a narrower range of stimuli.
"Shaping" or Successive Approximation- When we are trying to learn something that's hard to do expertly right away, it helps to receive rewards for close approximations to the goal. This teaching process is called "successive approximation" or "shaping." Shaping involves the gradual change of a response while the stimulus stays about the same. At any point, the learner only needs to do what he or she is capable of doing now. Short-term goals are important.
Fading- Fading involves the gradual change of a stimulus while the response stays about the same. It starts with a strong stimulus that is gradually faded out. (e.g., teaching a dog to sit: start with a loud command while pushing him down; gradually fade out the loud command and the pushing until only a hand signal is required.)
(Shaping involves the gradual change of a response while the stimulus stays about the same; fading involves the gradual change of a stimulus while the response stays about the same.)
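Shaping's logic--reinforce close approximations, then gradually raise the criterion--can be sketched as a toy simulation. The skill model, units, and numbers are all invented for illustration:

```python
import random

random.seed(2)

ability = 1.0        # learner's current skill (arbitrary invented units)
target = 10.0        # the final performance we are aiming for
criterion = ability  # "start where the behavior is," not where it should be

for trial in range(200):
    attempt = random.uniform(ability - 1, ability + 1)  # performance varies
    if attempt >= criterion:        # a close-enough approximation...
        ability += 0.1              # ...is reinforced, strengthening the skill
        criterion = min(criterion + 0.1, target)  # then the bar is raised a little

print(f"ability after shaping: {ability:.1f} (criterion now {criterion:.1f})")
```

Because the criterion tracks current ability, the learner is only ever asked to do what they can do now; demanding the target performance from the start would mean no reinforcement at all.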
Extinction and Forgetting- Extinction is the weakening and eventual disappearance of a learned response: the response no longer brings the rewards it used to bring, so it tends to occur less often. Extinction is not the same as forgetting, and it is not permanent erasure--the response is still remembered and can reappear.
Strategic Suppression- How is punishment sometimes used in conjunction with positive reinforcement? Punishment may be needed to help stop an unwanted behavior. (e.g., if a child runs into the street, punishment will immediately teach him not to do that; but along with the punishment, appreciation for staying on the curb should be given (positive reinforcement).) Thus punishment can serve to suppress one way of acting while a new way is learned.
Strengthening an incompatible response. If a new response (cognitive, emotional, or physical) can be taught that cannot occur at the same time as the old response, then it makes it harder for the old response to occur.
Time-Outs- Similar, but not identical, to the old practice of sending someone to sit in the corner. It is basically a form of withdrawing positive reinforcement (e.g., the child is banished to a place where there is nothing interesting to do). With my own children, I found that it was often more effective to give myself a time-out when they were getting into a fight ("I'm stepping outside to meditate for a few minutes"), thus depriving them of their audience.
Establish functional behavior: Teach new behavior that works in dealing with the world--that is, that will elicit reinforcement in daily life, rather than behavior that will not get rewarded outside the learning situation and will thereby extinguish.
Be consistent. In behavioral training there is no place for inconsistency, except for an intentional intermittent schedule--and even that should be applied consistently.
Think small. What is the smallest change that will be satisfactory? The more precise and specific the change you try to bring about, the more likely you are to succeed; the bigger and grander the scheme, the less likely.
Contracts for change. If you want to affect another person's behavior, it's usually a good idea to describe to them what you want and what you propose to do. See if they have any suggestions for modifying your proposed program. Also see what they want from you in return, so that it's not a one-way street and they have some investment in it too.