B.9 Identify and distinguish among simple schedules of reinforcement.
Schedules of reinforcement in behavior analysis are the rules that determine when reinforcement is delivered following a behavior. There are several types of schedules, each with its own characteristics and effects on behavior. Let’s define and provide examples of the most common simple schedules of reinforcement:
Continuous Reinforcement Schedule
A continuous reinforcement schedule reinforces a behavior every time it occurs: each instance of the target behavior is followed by reinforcement.
Example: Giving a child a piece of candy every time they finish their homework. Every completed homework assignment is immediately followed by a reward, reinforcing the behavior of completing homework.
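The rule is simple enough to sketch in a few lines of code (a hypothetical illustration; the function name is invented for this example):

```python
def crf_schedule(responses):
    # Continuous reinforcement (CRF): every occurrence of the target
    # behavior is followed by reinforcement, with no exceptions.
    return [(response, "reinforced") for response in responses]

# Each completed homework assignment earns candy.
print(crf_schedule(["homework 1", "homework 2", "homework 3"]))
```

Because every response is reinforced, the output pairs each assignment with reinforcement.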
Fixed Ratio (FR) Reinforcement Schedule
A fixed ratio schedule delivers reinforcement after a fixed number of responses; the required number of responses remains constant. For instance, an FR 5 schedule reinforces every fifth response.
Example: Paying a salesperson a commission for every fifth product they sell. After every five products sold, the salesperson receives a commission, reinforcing their selling behavior.
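The commission example can be sketched as a simple modulo check (a hypothetical illustration, not an implementation from the source):

```python
def fr_reinforced(response_number, ratio=5):
    # Fixed ratio (FR 5): reinforcement follows every fifth response;
    # the requirement never changes.
    return response_number % ratio == 0

# Sales 1 through 15: commissions are paid on sales 5, 10, and 15.
commissions = [n for n in range(1, 16) if fr_reinforced(n)]
print(commissions)  # [5, 10, 15]
```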
Variable Ratio (VR) Reinforcement Schedule
A variable ratio schedule delivers reinforcement after a number of responses that varies around an average. This variability in the response requirement is what distinguishes it from the fixed ratio schedule.
Example: Playing a slot machine at a casino. Each pull of the lever has an unpredictable chance of resulting in a win. The reinforcement (winning) occurs on average every tenth pull, but the exact number of pulls required for a win can vary.
Fixed Interval (FI) Reinforcement Schedule
A fixed interval schedule delivers reinforcement for the first response after a fixed amount of time has elapsed since the last reinforcement. The time interval remains constant.
Example: Checking for email that arrives on a regular schedule. If new messages are delivered every 30 minutes, only the first inbox check after each 30-minute interval is reinforced with new mail; checks made before the interval has elapsed go unreinforced.
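The email example can be sketched as follows (a hypothetical illustration; times are minutes, and the function name is invented):

```python
def fi_reinforced_checks(check_times, interval=30):
    # Fixed interval (FI 30): only the first response made after 30
    # minutes have elapsed since the last reinforcement is reinforced.
    reinforced = []
    last_reinforcement = 0
    for t in check_times:
        if t - last_reinforcement >= interval:
            reinforced.append(t)
            last_reinforcement = t
    return reinforced

# Checks at minutes 10, 25, 35, 50, and 70: only 35 and 70 pay off.
print(fi_reinforced_checks([10, 25, 35, 50, 70]))  # [35, 70]
```

Note that the checks at minutes 10, 25, and 50 fall inside an interval and go unreinforced, which is why FI schedules tend to produce pauses after each reinforcement.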
Variable Interval (VI) Reinforcement Schedule
A variable interval schedule delivers reinforcement for the first response after an amount of time that varies around an average has elapsed since the last reinforcement. This variability in the time interval distinguishes it from the fixed interval schedule.
Example: Checking social media for updates. Notifications or new posts appear at unpredictable times, with an average interval of 20 minutes, but the exact timing can vary.
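The varying waits of a VI 20 schedule can be modeled with random draws around a 20-minute mean. This is a sketch under the assumption that an exponential distribution is an acceptable model of the variability (a common modeling choice, not something specified by the source):

```python
import random

def vi_wait_times(mean_interval=20, n=1000, seed=1):
    # Variable interval (VI 20): the wait before reinforcement next
    # becomes available varies unpredictably around a 20-minute mean.
    # An exponential draw is one simple way to model that variability.
    rng = random.Random(seed)
    return [rng.expovariate(1 / mean_interval) for _ in range(n)]

waits = vi_wait_times()
print(round(sum(waits) / len(waits), 1))  # close to the 20-minute average
```

Individual waits range from well under a minute to much longer than 20 minutes, which is why responding on VI schedules tends to be steady: a check might pay off at any moment.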
These examples illustrate the simple schedules of reinforcement and their effects on behavior. The choice of schedule depends on the specific behavior being targeted and the desired outcome. Each schedule produces characteristic rates and patterns of responding: ratio schedules generally generate higher response rates than interval schedules, and variable schedules produce steadier responding than fixed ones.