Schedules of Reinforcement Worksheet

Identify the simple schedule of reinforcement being described in each of the following examples.

a. Jessi is trying to get her little brother to wash his hands before he eats dinner. Every time he washes his hands, Jessi allows her brother to eat his dessert before he eats dinner.
b. You usually get a text from your best friend about once every hour.
c. You have to work 40 hours each week to get paid at your job.
d. A new snack machine in the dorm sells candy for one dollar, but it only takes quarters.
e. You get paid $20 per hour for stuffing envelopes, but only if you stuff at least 200 per hour.
f. You have been teaching your dog to sit. You give her a treat on average every three times she sits on command.
g. Your generous grandmother gives you a $100 bill on your birthday.
h. Captain Murphy gets fed every six hours, but she has to meow after six hours to let her owners know that she is hungry.
i. The apple tree in your yard sheds apples every time it rains, but it only rains about once a month. After it rains, you can gather and eat as many apples as you want.
Correct Answers and Explanations:
Let’s go through each example and identify the type of schedule of reinforcement. The simple schedules are fixed ratio (FR), variable ratio (VR), fixed interval (FI), and variable interval (VI). Here’s how they apply to the examples provided:
a. Fixed Ratio (FR)
Explanation: Jessi reinforces her brother’s behavior by allowing him to eat dessert every time he washes his hands. This is a Fixed Ratio schedule because reinforcement (dessert) occurs after a set number of responses. Since every single hand-wash is reinforced, it is specifically an FR-1 schedule, also known as continuous reinforcement.
b. Variable Interval (VI)
Explanation: Getting a text from your best friend about once every hour is a Variable Interval schedule. While the reinforcement (text) is relatively regular, it’s not set to a specific time; the text arrives on average every hour but can vary from hour to hour. This is typical of Variable Interval because the reinforcement comes at unpredictable times, based on an average.
c. Fixed Interval (FI)
Explanation: Getting paid for working 40 hours each week is a Fixed Interval schedule. The paycheck (reinforcement) arrives after a fixed period of time (one week), not after a set count of discrete responses; a regular weekly or biweekly paycheck is the textbook example of fixed-interval reinforcement.
d. Fixed Ratio (FR)
Explanation: The snack machine delivers candy (the reinforcement) only after a fixed number of responses: inserting four quarters to make one dollar. Because the required count never changes, this is a Fixed Ratio schedule (FR-4).
e. Fixed Ratio (FR)
Explanation: You get paid only if you stuff at least 200 envelopes per hour. Payment depends on completing a set number of responses (200 envelopes), so this is a Fixed Ratio schedule.
f. Variable Ratio (VR)
Explanation: The treat is given after an average of three sits. This is a Variable Ratio schedule because reinforcement is delivered after a number of responses that varies around an average (here, three), rather than after a fixed count.
g. Fixed Interval (FI)
Explanation: Your grandmother gives you a $100 bill on your birthday. This is a Fixed Interval schedule because reinforcement occurs after a fixed period of time: the reward arrives at the same point each year.
h. Fixed Interval (FI)
Explanation: Captain Murphy being fed every six hours is a Fixed Interval schedule. The first response (meowing) after a set period of time (six hours) is reinforced with food, which is the defining feature of a fixed interval.
i. Variable Interval (VI)
Explanation: The apple tree shedding apples after it rains fits the Variable Interval schedule. Rain falls at unpredictable times, averaging about once a month, so the reinforcement (apples) becomes available after a variable amount of time.
Summary
In these examples, the schedule of reinforcement is determined by two things: whether reinforcement depends on the number of responses (a ratio schedule) or on the passage of time (an interval schedule), and whether that requirement is fixed or varies around an average. The basic schedules can be summarized as follows:
- Fixed Ratio (FR): Reinforcement is given after a fixed number of responses.
- Variable Ratio (VR): Reinforcement is given after an unpredictable number of responses.
- Fixed Interval (FI): Reinforcement is given after a fixed amount of time has passed.
- Variable Interval (VI): Reinforcement is given after an unpredictable amount of time has passed.
These schedules affect how quickly behaviors are learned and how well they are maintained, with variable schedules typically producing behavior that is more resistant to extinction.
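As an illustration (not part of the worksheet itself), the four decision rules above can be sketched as small predicate functions. All function names and parameters here are hypothetical, chosen only to mirror the definitions in the summary:

```python
import random

# Hypothetical sketch of the four simple schedules as decision rules.
# Each function answers: does this particular response earn reinforcement?

def fixed_ratio(n, response_count):
    # FR-n: reinforce every n-th response (e.g. FR-4 for four quarters).
    return response_count > 0 and response_count % n == 0

def variable_ratio(n):
    # VR-n: reinforce on average every n responses, so each response
    # independently has a 1/n chance of paying off (e.g. VR-3 dog treats).
    return random.random() < 1.0 / n

def fixed_interval(t, time_since_last_reinforcer):
    # FI-t: reinforce the first response made after t time units have
    # passed (e.g. the first meow after six hours).
    return time_since_last_reinforcer >= t

def variable_interval(time_since_last_reinforcer, current_threshold):
    # VI: reinforce the first response after a threshold drawn around an
    # average; the caller would redraw current_threshold after each
    # reinforcer, e.g. random.uniform(0, 2 * mean_interval).
    return time_since_last_reinforcer >= current_threshold

# Example: a dog on a VR-3 schedule sits 3000 times; roughly a third
# of the sits should earn a treat.
random.seed(0)
treats = sum(variable_ratio(3) for _ in range(3000))
print(treats)  # roughly 1000
```

The ratio functions count responses while the interval functions compare elapsed time, which is exactly the distinction the summary draws between the two schedule families.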