B. F. Skinner


B. F. Skinner was a psychologist born in Susquehanna, Pennsylvania. He studied at Harvard and taught there (1931-36, 1947-74). A leading behaviorist, he was a proponent of operant conditioning and the inventor of the Skinner box, an apparatus for facilitating experimental observations.

Skinner's entire system is based on operant conditioning. The organism is in the process of "operating" on the environment, which in ordinary terms means it is moving around the world, doing what it does. During this "operating," the organism encounters a special kind of stimulus, called a reinforcing stimulus, or simply a reinforcer. This special stimulus has the effect of increasing the operant, which is the behavior occurring just before the reinforcer. This is operant conditioning: "the behavior is followed by a consequence, and the nature of the consequence modifies the organism's tendency to repeat the behavior in the future."

Say you have a dog that is just playing around with his toys, and when you throw a toy at him and he catches it, you give him a treat. Soon the dog starts catching toys whenever you throw one in the air or at his mouth. The operant is the behavior just prior to the reinforcer, which is the treat. If you then decide to stop giving the dog treats, he will stop the little trick that you, the owner, were enjoying. This is called extinction of the operant behavior. If you later start showing the dog treats again, he will most likely want to start doing the tricks again, and more quickly than he learned them at first. This is because the return of the reinforcer takes place in the context of a reinforcement history that goes all the way back to the very first time the dog was reinforced for performing the tricks. Continuous reinforcement is the original scenario: every time the dog performs the behavior (such as a trick), he gets a treat.
The fixed-ratio schedule was the first one Skinner discovered: if the dog does the trick three times, say, he gets a treat. Or five times. Or twenty times. Or "x" times. There is a fixed ratio between behaviors and reinforcers: 3 to 1, 5 to 1, 20 to 1, and so on. This is a little like "piece rate" in the clothing manufacturing industry: you get paid so much for so many shirts. Skinner also looked at variable schedules. A variable ratio means you change the "x" each [...]

[...] the aversive stimulus of hunger? Skinner (contrary to some stereotypes that have arisen about behaviorists) did not "approve" of the use of aversive stimuli -- not because of ethics, but because they do not work well! Notice that I said earlier that Johnny will maybe stop throwing his toys, and that I perhaps will take out the garbage? That is because whatever was reinforcing the bad behaviors has not been removed, as it would have been in the case of extinction. This hidden reinforcer has just been "covered up" with a conflicting aversive stimulus. So, sure, sometimes the child (or I) will behave -- but it still feels good to throw those toys. All Johnny needs to do is wait until you are out of the room, or find a way to blame it on his brother, or in some way escape the consequences, and he is back to his old ways.

B. F. Skinner made numerous contributions to the science of behavior. He strongly influenced the area of learning that he named operant conditioning. His Skinner box is now a standard apparatus for the experimental study of animal behavior. Much of his work involved the study of how reinforcement schedules influence learning and behavior.
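The difference between continuous and fixed-ratio reinforcement can be sketched as a toy simulation. This is only an illustration of the counting rule described above; the function and variable names here are invented for the example, not Skinner's terminology.

```python
# Toy sketch of two reinforcement schedules (illustrative names).

def continuous_schedule():
    """Continuous reinforcement: every behavior earns a reinforcer (1 to 1)."""
    def schedule(behavior_count):
        return True
    return schedule

def fixed_ratio_schedule(x):
    """Fixed-ratio reinforcement: every x-th behavior earns a reinforcer (x to 1)."""
    def schedule(behavior_count):
        return behavior_count % x == 0
    return schedule

def reinforcers_earned(schedule, n_behaviors):
    """Count how many reinforcers n_behaviors earn under a given schedule."""
    return sum(1 for i in range(1, n_behaviors + 1) if schedule(i))

continuous = continuous_schedule()
fr5 = fixed_ratio_schedule(5)  # the "5 to 1" ratio mentioned in the text

print(reinforcers_earned(continuous, 20))  # 20 tricks -> 20 treats
print(reinforcers_earned(fr5, 20))         # 20 tricks -> 4 treats
```

Under continuous reinforcement the dog gets a treat for every trick, while under a 5-to-1 fixed ratio the same twenty tricks earn only four treats, which is exactly the "piece rate" idea from the clothing-industry comparison.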
