
Operant conditioning: Schedules of reinforcement

Video transcript

Many of our behaviors are on a partial reinforcement schedule. Partial reinforcement refers to a situation in which a behavior is reinforced only some of the time. These partial schedules of reinforcement are important because they are generally more resistant to extinction than continuous reinforcement. As we discussed, behaviors are shaped through a process of continuous reinforcement of successive approximations of the target behavior. However, continuous reinforcement eventually becomes less reinforcing, so there's a need for these partial schedules of reinforcement, which vary in their ability to maintain learned behaviors. These were actually discovered by B.F. Skinner through observation of reward schedules with animals, but they apply to humans too.

There are four schedules of partial reinforcement, and each one has a different effect on controlling and maintaining behaviors. As you're watching this video, you'll probably think of situations in your life where your behavior was reinforced on each of these schedules, and by the end of the video you'll be able to label those situations with the terminology used in operant conditioning.

Here you can see the four schedules of partial reinforcement. If these terms are new to you, don't worry; they'll start to make a lot more sense when you break them apart. For our purposes, I want you to associate the word "ratio" with an amount of responses. The word "ratio" looks similar to "ration," and a ration of food, for example, is a certain amount of food. So ratio means amount, and for our purposes, an amount of responses. When you see the word "interval," I want you to associate it with time. Think of the phrase "an interval of time," as in "a long interval of time passed" or "we were only given a short interval of time to answer the question." So interval means time. So there's ratio, which means an amount of responses, and interval, which means time. Now, each of these categories can be fixed or variable: fixed as in consistent, or variable as in there is variation. If you combine these words, you come up with the four schedules of partial reinforcement: fixed ratio, fixed interval, variable ratio, and variable interval.

So let's talk about a fixed ratio schedule. Pretend a car salesman gets a bonus for every 5 cars he sells. That bonus is a reinforcer placed on a fixed ratio schedule. On a fixed ratio schedule, reinforcement only occurs after a fixed number of responses. This car salesman has to sell 5 cars in order to get a bonus. If he sells 5 cars in a week, he'll get a bonus; if he sells 5 cars in a month, he'll get a bonus; and if he sells 5 cars in a day, he'll get a bonus. You get the idea. What I'm illustrating here is that the reinforcement, in this case the bonus, is contingent on the number of cars he sells, regardless of how long it takes him to do it. Since the only barrier between the car salesman and this bonus is the number of cars he sells, you might imagine he'll work at a furious pace to earn as many bonuses as possible. Jobs that demand a fast pace like this often pay their employees on a fixed ratio schedule; think of factory workers and fruit pickers, for instance. That's the benefit of a fixed ratio schedule: it tends to produce a high rate of behavior, because the frequency of getting the reward, the reinforcer, pretty much depends on the person. So that's a fixed ratio schedule.
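The fixed ratio rule is simple enough to state as code. Here is a minimal Python sketch (illustrative only, not from the video; the function name and structure are mine, and the ratio of 5 is just the salesman example):

```python
def fixed_ratio_reinforced(responses_so_far, ratio=5):
    """Fixed ratio: reinforcement arrives after every ratio-th response,
    no matter how much time passes between responses."""
    return responses_so_far > 0 and responses_so_far % ratio == 0

# The salesman earns a bonus on cars 5, 10, 15, ... regardless of whether
# that takes a day, a week, or a month.
for cars_sold in range(1, 16):
    if fixed_ratio_reinforced(cars_sold):
        print(f"Bonus after car {cars_sold}")
```

Note that the rule never looks at a clock: only the response count matters, which is what drives the fast, steady responding described above.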
Now let's talk about a fixed interval schedule. Pretend this car salesman receives a paycheck every two weeks, as long as he sells one car. The paycheck is on a fixed interval schedule because the reinforcement occurs after a consistent amount of time has passed, in this case two weeks. His paycheck doesn't change whether he sells one car or a hundred cars during that time interval; his paycheck depends on the amount of time that passes. As you might imagine, he probably doesn't have much of an incentive to sell more than one car if he'll make the same amount of money anyway. And that's the classic rate of responding for fixed interval schedules: it's much slower than, say, the fixed ratio schedule we discussed earlier.

Now here we have the variable ratio schedule. A variable ratio schedule means that the reinforcer is delivered after an average number of correct responses has occurred. A variable ratio schedule is similar to a fixed ratio schedule, except the number of responses needed to receive the reinforcement changes after each reinforcer is presented. Put simply, a variable ratio schedule is literally a series of fixed ratio schedules that keep changing; what matters in the end is the average number of correct responses. Using the example from the fixed ratio schedule, there's a car salesman receiving a bonus for every five cars he sells. If the number needed to receive a bonus were always fixed at five, that would be a fixed ratio schedule. On a variable ratio schedule, the number would vary: maybe he must sell five cars to get the first bonus, then three cars to get the second bonus, then seven cars to get the third, then six cars to get the fourth, and then four cars to get the fifth bonus. If you add up all the cars sold (5 + 3 + 7 + 6 + 4 = 25) and divide by the five bonuses he received, you'd find that the average number of cars sold to receive a bonus is 25 / 5 = 5, which matches the fixed ratio schedule above of five cars per bonus. The difference is that the variable ratio schedule has a lot of uncertainty: the car salesman cannot predict when he'll receive a bonus, but with every car sold he comes closer to getting it.

The classic example used for a variable ratio schedule is a slot machine. If you have ever played a slot machine, you understand the power of a variable ratio schedule. A slot machine is programmed to pay out after an average number of pulls, but since you never really know when the payout will occur, you keep playing and hoping that you'll win something. That's one reason it's so hard to walk away from a slot machine: you always wonder what would happen if the next pull were the jackpot, and you don't want to miss that, obviously. A slot machine is a variable ratio schedule because the reinforcement is dependent on your behavior; that is, you have to bet money and pull a lever in order to have a chance to receive anything. It doesn't matter how long you wait between pulls of the lever; it simply matters whether you performed the behavior of pulling the lever. So that is a variable ratio schedule.
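The slot machine rule can be sketched the same way. One common way to model it (my assumption; the video only says the machine pays out after an average number of pulls) is to give every pull the same small payout probability p, so payouts arrive after an unpredictable number of pulls that averages 1/p:

```python
import random

def pull_lever(p_payout=0.2):
    """One pull; pays out with probability p_payout,
    i.e. about once per 1/p_payout = 5 pulls on average."""
    return random.random() < p_payout

pulls_since_last_payout = 0
for pull_number in range(1, 21):
    pulls_since_last_payout += 1
    if pull_lever():
        print(f"Payout on pull {pull_number} "
              f"(after {pulls_since_last_payout} pulls)")
        pulls_since_last_payout = 0
```

As in the transcript, time plays no role here: only the count of lever pulls can trigger the payout.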
A variable interval schedule means that responses are reinforced after a variable amount of time has passed. To use the car salesman again, imagine a supervisor randomly showing up without notice to give him a bonus: as long as the supervisor sees him actively talking to a customer, he'll give him that bonus. Since the car salesman never knows when the supervisor will drop by, he has to consistently engage with customers in order to increase the chance that his supervisor will notice. The difference here is that he could have sold one car or even 100 cars that month, but all that matters for receiving the bonus is whether or not he's actively engaged in a sale when the supervisor happens to come by. A variable interval schedule like this results in a more regular rate of responding than a fixed interval schedule does.

So those are the four types of partial reinforcement schedules. As you might imagine, each one tends to produce a different pattern of responding.
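To close, here is one more hedged Python sketch contrasting the two time-based schedules. The fixed interval rule just checks elapsed time; for the variable interval rule I draw the gap until the next surprise visit at random around the same average, which is my assumption, since the video only says the timing is unpredictable:

```python
import random

def paycheck_due(days_elapsed, interval=14):
    """Fixed interval: reinforcement becomes available once 14 days pass,
    whether one car or a hundred were sold in between."""
    return days_elapsed >= interval

def days_until_next_supervisor_visit(average_gap=14):
    """Variable interval: the next surprise visit comes after an
    unpredictable gap that averages average_gap days."""
    return random.expovariate(1.0 / average_gap)

# Five unpredictable visit gaps averaging roughly two weeks apart.
print([round(days_until_next_supervisor_visit(), 1) for _ in range(5)])
```

Because the salesman cannot predict the gaps, steady engagement is the only way to be "caught" behaving, which is why variable interval schedules produce the more regular responding described above.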