Chapter 6 Test

Name: ___    Class: ___    Date: ___    ID: A

Possible Questions Ch 6: Learning and Behavior

Multiple Choice
Identify the letter of the choice that best completes the statement or answers the question.

1. In ___ conditioning, behavior comes under the control of its consequences.
a. operant  b. classical  c. instrumental  d. both a and c

2. In ___ conditioning, it is what comes ___ the behavior that is critical.
a. operant; after  b. classical; after  c. instrumental; before  d. both b and c

3. Goal-oriented is to automatic as ___ behavior is to ___ behavior.
a. operant; elicited  b. elicited; operant  c. conditioned; unconditioned  d. unconditioned; conditioned

4. In the textbook discussions of operant conditioning, the term "consequence" refers to
a. how we feel about what is happening to us.  b. the change in the probability of the behavior as a result of applying the reinforcer or punisher.  c. the event that follows the behavior and is contingent upon it.  d. the rate of behavior.

5. Antecedent is to ___ conditioning as consequence is to ___ conditioning.
a. classical; operant  b. operant; Pavlovian  c. instrumental; respondent  d. both b and c

6. The first psychologist to systematically investigate the effect of consequences on the strength of a behavior was
a. Skinner.  b. Pavlov.  c. Tolman.  d. Thorndike.

7. Thorndike found that cats learned to escape from a puzzle box
a. gradually.  b. suddenly.  c. with insight.  d. both b and c

8. The original law of effect stated that behaviors leading to a(n) ___ are ___.
a. satisfactory state of affairs; stamped in  b. reinforcer; stamped in  c. positive reinforcer; strengthened  d. unconditioned stimulus; stamped out

9. According to Thorndike's law of effect, behaviors leading to a(n) ___ state of affairs are stamped in, while behaviors leading to a(n) ___ state of affairs are stamped out.
a. annoying; satisfactory  b. satisfactory; annoying  c. irregular; regular  d. regular; irregular

10. Freud believed that humans are motivated to seek pleasure and avoid pain. This notion accords most closely with ___ definition of ___.
a. Thorndike's; the law of effect  b. Skinner's; operant conditioning  c. Thorndike's; operant conditioning  d. Skinner's; the law of effect

11. With his puzzle box experiments, Thorndike discovered that learning is usually a(n) ___ process.
a. sudden  b. unpredictable  c. stressful  d. gradual

12. When first setting out to investigate the behavior of animals, Skinner had originally thought that all behavior could be explained in terms of
a. thoughts and feelings.  b. reflexes.  c. operants.  d. fixed action patterns.

13. Skinner's development of the operant conditioning chamber was partly motivated by his desire to find a procedure that yielded ___ patterns of behavior.
a. inflexible  b. reflexive  c. regular  d. irregular

14. In a standard Skinner box, a ___ earns food by ___.
a. rat; pressing a lever  b. rat; running in a wheel  c. pigeon; pressing a lever  d. pigeon; flapping its wings

15. Skinner's operant conditioning procedure is known as a free operant procedure because the rat
a. is put on a free feeding schedule before the experiment starts.  b. is free to enter and leave the chamber.  c. is free to move about the chamber.  d. freely controls the rate at which it responds for food.

16. Skinner's operant conditioning procedure became known as a(n) ___ procedure because ___.
a. instrumental; the consequences are free  b. free operant; the animal is free to control its response rate  c. instrumental; the animal is free to enter or leave the chamber  d. adjunctive; the experimenter is free to observe the rat's behavior

17. In one variant of a Skinner box, a pigeon earns food by
a. flapping its wings.  b. turning circles.  c. pecking a response key.  d. pressing a lever.

18. Rat is to ___ as pigeon is to ___.
a. lever press; key peck  b. key peck; lever press  c. turning circles; lever press  d. lever press; turning circles

19. Which of the following most closely parallels what happens in a Skinner box?
a. You are in your apartment with nothing to do except raid the refrigerator.  b. You are at home watching television and raiding the refrigerator.  c. You are in a hospital with nothing to do. Meals are served at fixed times during the day.  d. You are in a hospital with lots to do. Meals are served at fixed times during the day.

20. Skinner divided behaviors into two categories:
a. operant and instrumental.  b. conditioned and unconditioned.  c. primary and secondary.  d. operant and respondent.

21. Skinner's restatement of Thorndike's law of effect is
a. less mentalistic.  b. more mentalistic.  c. less precise.  d. both a and c

22. The basic components of the operant conditioning process include
a. a response that produces a certain consequence.  b. a consequence that strengthens or weakens the response.  c. a preceding stimulus that signals the availability of the consequence.  d. all of the above

23. The three components of the operant conditioning process include
a. a response that is automatically elicited by a preceding stimulus.  b. a consequence that strengthens or weakens the response.  c. a preceding stimulus that elicits the response.  d. all of the above

24. Properly speaking, operant behavior is said to be ___ by ___.
a. emitted; the organism  b. elicited; the organism  c. emitted; stimuli  d. elicited; stimuli

25. The operant response is properly described as a(n)
a. emitted behavior.  b. contrived behavior.  c. covert behavior.  d. elicited behavior.

26. Emitted is to elicited as ___ conditioning is to ___ conditioning.
a. classical; operant  b. respondent; classical  c. operant; classical  d. instrumental; operant

27. The behavior of lever pressing for food is said to be
a. elicited by the rat.  b. emitted by the rat.  c. elicited by the food.  d. emitted by the food.

28. Operant behaviors are usually defined as a
a. class of behaviors that are topographically similar.
b.
class of behaviors that lead to a certain consequence.
c. specific behavior that leads to a certain consequence.  d. specific behavior that leads to a class of consequences.

29. Behaviorists have found it useful to define operant behaviors as a
a. specific response.  b. covert stimulus.  c. class of responses.  d. unconditioned stimulus.

30. Which of the following conditions must be met for a response to be considered an operant?
a. Its occurrence results in the delivery of a certain consequence.  b. The consequence affects the future probability of the response.  c. The response is elicited by the antecedent stimulus.  d. a and b only

31. Properly speaking, when we give a dog a treat for sitting on command, we are attempting to reinforce
a. the dog.  b. the behavior.  c. the command.  d. our relationship with the dog.

32. Properly speaking, when we praise a child for following instructions, we are attempting to reinforce
a. the child.  b. the instructions.  c. the behavior of following instructions.  d. both b and c

33. From an operant conditioning perspective, chocolate is a reinforcer if it
a. strengthens the behavior that follows it.  b. strengthens the behavior that precedes it.  c. elicits salivation.  d. both b and c

34. Consequence is to process as ___ is to ___.
a. reinforcer; punisher  b. reinforcement; punishment  c. punisher; punishment  d. reinforcement; reinforcer

35. Procedure is to ___ as consequence is to ___.
a. reinforcer; reinforcement  b. reinforcement; punisher  c. punishment; reinforcement  d. reinforcer; punisher

36. The term ___ refers to a process or procedure.
a. reinforcer  b. reinforcement  c. punisher  d. both a and c

37. Suppose a rat presses a lever and receives a food pellet. As a result, it is more likely to press the lever in the future. In this example, the food is functioning as a ___ for lever pressing.
a. reinforcer  b. discriminative stimulus  c. punisher  d. punishment

38. A spanking is a punisher if it
a. follows a behavior.  b. precedes a behavior.  c. decreases the probability of a behavior.  d. both a and c

39. Suppose a rat runs in a wheel and receives a food pellet. The subsequent increase in wheel running as a result of the food delivery is an example of
a. an establishing operation.  b. reinforcement.  c. a reinforcer.  d. punishment.

40. Sam received a traffic fine for speeding the other day. The traffic fine is a ___ for Sam's behavior of speeding.
a. reinforcer  b. punisher  c. conditioned stimulus  d. none of the above; further information is needed to determine the answer

41. When Leena finished her homework, her mother gave her some apple pie. This is obviously an example of
a. positive reinforcement.  b. negative reinforcement.  c. positive punishment.  d. It is impossible to know given the information provided.

42. A dog is given a treat each time it comes when called, and as a result no longer comes when called. The ___ is an example of ___.
a. treat; negative reinforcement  b. treat; a punisher  c. decrease in behavior; a punisher  d. treat; punishment

43. Maria gives her canary some food each time it flutters its wings. The food is a
a. punisher.  b. reinforcer.  c. discriminative stimulus.  d. none of the above; further information is needed to determine the answer

44. Properly speaking, reinforcers and punishers are defined entirely by
a. their intensity.  b. the probability of their occurrence.  c. their effect on behavior.  d. the extent to which they are perceived as pleasant versus unpleasant.

45. Reinforcers and punishers are entirely defined by
a. their hedonic value.  b. the manner in which they influence behavior.  c. the extent to which they are appetitive or aversive.  d. both a and c

46. Reinforcers are ___ the kinds of events that we consider pleasant.
a. often but not always  b. always  c. rarely  d. never

47.
If a mother kisses her child whenever he breaks a dish and, as a result, he breaks fewer dishes in the future, the kissing would by definition be a
a. punisher.  b. reinforcement.  c. reinforcer.  d. punishment.

48. An electric shock is a reinforcer if it
a. follows a behavior.  b. precedes a behavior.  c. increases the probability of a behavior.  d. both a and c

49. The withdrawal of reinforcement for a behavior is called
a. extinction.  b. inhibition.  c. dishabituation.  d. negative punishment.

50. The dog no longer receives food for begging and therefore stops begging. This is an example of
a. blocking.  b. punishment.  c. reinforcement.  d. extinction.

51. An SD is a stimulus that
a. increases the probability of a certain behavior.  b. signals that a reinforcer is now available for the behavior.  c. decreases the probability of a behavior.  d. both a and b

52. When Hai visits his parents, he whines a lot about how unappreciated he is at work. It seems likely that his parents are ___ for whining.
a. discriminative stimuli  b. reinforcers  c. reinforcement  d. conditioned stimuli

53. A(n) ___ is a stimulus that "sets the occasion for" a behavior.
a. CS  b. SD  c. SR  d. SΔ

54. A(n) ___ stimulus serves as a signal that a response will be followed by a reinforcer.
a. operant  b. discriminative  c. conditioned  d. appetitive

55. A restaurant sign can be viewed as a(n) ___ for entering the restaurant and getting a hamburger.
a. SD  b. US  c. SR  d. CS

56. A simple way of thinking about the three-term contingency is that you (in correct order)
a. notice something, get something, and do something.  b. do something, notice something, and get something.  c. get something, notice something, and do something.  d. notice something, do something, and get something.

57. In the three-term contingency, the antecedent is the
a. reinforcer.  b. operant response.  c. discriminative stimulus.  d. conditioned stimulus.

58. In correct order, the three-term contingency consists of
a. antecedent, consequence, and behavior.  b. antecedent, behavior, and consequence.  c. consequence, behavior, and antecedent.  d. behavior, antecedent, and consequence.

59. A stimulus which signals that a response will be punished is a(n)
a. conditioned stimulus for punishment.  b. unconditioned stimulus for punishment.  c. discriminative stimulus.  d. discriminative stimulus for punishment.

60. The statement "Don't you dare try it!" would for most people be a(n)
a. discriminative stimulus for reinforcement.  b. unconditioned stimulus for fear.  c. discriminative stimulus for punishment.  d. discriminative stimulus for fear.

61. Unlike classically conditioned behavior, operant behavior is
a. typically seen as voluntary and flexible.  b. said to be elicited by the stimulus.  c. both a and b  d. neither a nor b

62. Unlike classical conditioning, operant conditioning involves a(n)
a. S-S-R sequence.  b. behavior that is a function of what comes before it.  c. both a and b  d. neither a nor b

63. To determine if operant conditioning is involved, the most critical question to ask is whether the occurrence of the behavior is mostly a function of
a. the stimulus that precedes it.  b. the stimulus that follows it.  c. the person.  d. the environment.

64. A contingency of reinforcement means that
a. a response is followed by a reinforcer.  b. a reinforcer is followed by a response.  c. a response is elicited by a reinforcer.  d. a response is elicited by an SD.

65. When combined with the terms reinforcement or punishment, the word positive means
a. something that is appetitive.  b. something that is subtle.  c. something is added or presented.  d. both a and b

66. When combined with the terms reinforcement or punishment, the word negative means
a. something that is good.  b. something that is intense.  c. something that is unpleasant.  d. something is subtracted or withdrawn.

67. With respect to the four types of contingencies, add is to subtract as ___ is to ___.
a. desire; hate
b.
positive; negative
c. negative; positive  d. hate; desire

68. Increase is to decrease as ___ is to ___.
a. reinforcement; punishment  b. punishment; reinforcement  c. antecedent; consequence  d. consequence; antecedent

69. The term ___ refers to the presentation of a stimulus following a response which then leads to an increase in the future strength of that response.
a. positive reinforcement  b. negative reinforcement  c. positive punishment  d. negative punishment

70. The term positive reinforcement refers to the ___ of a stimulus following a response which then leads to a(n) ___ in the future strength of that response.
a. removal; increase  b. presentation; decrease  c. presentation; increase  d. removal; decrease

71. The pigeon pecks a response key and receives food. As a result, the probability of key pecking increases. This is an example of
a. positive reinforcement.  b. negative reinforcement.  c. negative punishment.  d. positive punishment.

72. Andre praises his young daughter for being assertive, after which she becomes even more assertive. This is an example of
a. negative reinforcement.  b. positive reinforcement.  c. negative punishment.  d. positive punishment.

73. John yells at his dog whenever it barks. As a result, the dog begins barking even more frequently. This is an example of
a. positive punishment.  b. negative reinforcement.  c. negative punishment.  d. positive reinforcement.

74. Paula spanks her child when he breaks a dish. As a result, he breaks dishes even more frequently. This is an example of
a. negative punishment.  b. negative reinforcement.  c. positive reinforcement.  d. positive punishment.

75. The term ___ refers to the removal of a stimulus following a response which then leads to an increase in the future strength of that response.
a. positive reinforcement  b. negative reinforcement  c. positive punishment  d. negative punishment

76. The term negative reinforcement refers to the ___ of a stimulus following a response which then leads to a(n) ___ in the future strength of that response.
a. removal; increase  b. presentation; decrease  c. presentation; increase  d. removal; decrease

77. When I banged on the heating pipe, it stopped making a noise. The next time I heard that noise, I immediately banged on the pipe. This seems to be an example of
a. positive reinforcement.  b. negative reinforcement.  c. positive punishment.  d. negative punishment.

78. Jason's mother tells him: "If you clean your room, you won't have to do the dishes." What type of contingency is she attempting to apply?
a. positive reinforcement  b. negative reinforcement  c. positive punishment  d. negative punishment

79. "I'll do anything to avoid housework." This statement speaks to the power of
a. positive reinforcement.  b. negative reinforcement.  c. positive punishment.  d. negative punishment.

80. A(n) ___ response occurs before the aversive stimulus is presented and thereby prevents its delivery.
a. escape  b. avoidance  c. reflexive  d. primary

81. A(n) ___ response results in the termination of an aversive stimulus.
a. escape  b. avoidance  c. reflexive  d. primary

82. Putting on a heavy parka before going out into the cold is an example of a(n) ___ response, while putting it on after you go outside and become cold is an example of a(n) ___ response.
a. operant; reflexive  b. avoidance; escape  c. escape; avoidance  d. reflexive; operant

83. The term ___ refers to the presentation of a stimulus following a response which then leads to a decrease in the future strength of that response.
a. positive reinforcement  b. negative reinforcement  c. positive punishment  d. negative punishment

84. The term positive punishment refers to the ___ of a stimulus following a response which then leads to a(n) ___ in the future strength of that response.
a. removal; increase  b. presentation; decrease  c. presentation; increase  d. removal; decrease

85.
Jim compliments his secretary on her sexy new outfit when she offers to bring him coffee one morning. She never again offers to bring him coffee. Out of the following, this is an example of which type of process?
a. positive reinforcement  b. negative reinforcement  c. positive punishment  d. negative punishment

86. When Pedro punched his sister, she punched him back. He never again punched her. This seems to be an example of what process?
a. positive punishment  b. negative reinforcement  c. positive reinforcement  d. negative punishment

87. When Pedro teased his sister, she hugged him. He never again teased her. This seems to be an example of what process?
a. positive reinforcement  b. negative reinforcement  c. positive punishment  d. negative punishment

88. A stimulus that can serve as a negative reinforcer can probably also serve as a
a. negative punisher.  b. positive punisher.  c. positive reinforcer.  d. unconditioned reinforcer.

89. Most people would be least likely to volunteer for an experiment on
a. positive reinforcement.  b. appetitive conditioning.  c. positive punishment.  d. negative punishment.

90. The term ___ refers to the removal of a stimulus following a response which then leads to a decrease in the future strength of that response.
a. positive reinforcement  b. negative reinforcement  c. positive punishment  d. negative punishment

91. The term negative punishment refers to the ___ of a stimulus following a response which then leads to a(n) ___ in the future strength of that response.
a. removal; increase  b. removal; decrease  c. presentation; increase  d. presentation; decrease

92. Melissa stayed out past her curfew and subsequently lost car privileges for a week. As a result, she never again stayed out past her curfew. This example best illustrates the process of
a. positive reinforcement.  b. negative reinforcement.  c. positive punishment.  d. negative punishment.

93. Felix swore at his girlfriend during an argument one day, after which she wouldn't talk to him for a week. As a result, he became much less likely to swear at her. This is best described as an example of
a. positive reinforcement.  b. negative reinforcement.  c. positive punishment.  d. negative punishment.

94. The use of punishment can be quite seductive in that its delivery is often followed by ___ for the person who delivered it.
a. immediate positive reinforcement  b. immediate negative reinforcement  c. delayed positive punishment  d. delayed negative punishment

95. When Sean doesn't cry, he doesn't get an extra helping of dessert. As a result, he always cries at the dinner table. This is best interpreted as an example of
a. positive reinforcement.  b. negative punishment.  c. positive punishment.  d. extinction.

96. "I don't think about an upcoming exam so that I won't get anxious." This pattern of behavior probably evolved as a function of what process?
a. positive reinforcement  b. extinction  c. positive punishment  d. negative punishment

97. Jason's mother tells him: "If you don't clean your room, you won't get to watch television." This can be classified as what type of contingency?
a. positive reinforcement  b. extinction  c. positive punishment  d. negative reinforcement

98. In general, the more ___ the reinforcer, the stronger its effect on behavior.
a. immediate  b. delayed  c. negative  d. positive

99. It is difficult to eat a healthy diet because the reinforcers for healthy eating are often ___, while the reinforcers for eating junk food are ___.
a. extrinsic; intrinsic  b. intrinsic; extrinsic  c. immediate; delayed  d. delayed; immediate

100. It is difficult to study on the weekend because the reinforcers for studying are often ___, while the reinforcers for having fun are ___.
a. immediate; delayed  b. delayed; immediate  c. primary; secondary  d. secondary; primary
101. A(n) ___ reinforcer is one that has become a reinforcer because it is associated with some other reinforcer.
a. primary  b. unconditioned  c. secondary  d. both a and b

102. A primary reinforcer is one that
a. is innately reinforcing.  b. has become associated with another reinforcer.  c. has become associated with many other reinforcers.  d. is immediate rather than delayed.

103. Innate is to learned as ___ reinforcer is to ___ reinforcer.
a. secondary; primary  b. primary; secondary  c. intrinsic; extrinsic  d. extrinsic; intrinsic

104. Events that are innately reinforcing are called
a. extrinsic reinforcers.  b. primary reinforcers.  c. secondary reinforcers.  d. generalized reinforcers.

105. A primary reinforcer is also called a(n) ___ reinforcer, while a secondary reinforcer is also called a(n) ___ reinforcer.
a. conditioned; unconditioned  b. generalized; nongeneralized  c. nongeneralized; generalized  d. unconditioned; conditioned

106. A secondary reinforcer can also be called a(n) ___ reinforcer.
a. conditioned  b. unconditioned  c. generalized unconditioned  d. intrinsic

107. A primary reinforcer can also be called a(n) ___ reinforcer.
a. conditioned  b. unconditioned  c. generalized conditioned  d. intrinsic

108. Food usually functions as a(n) ___ reinforcer while a light that has been paired with food functions as a(n) ___ reinforcer.
a. generalized; discriminative  b. extrinsic; intrinsic  c. primary; secondary  d. secondary; primary

109. The rat's home cage is strongly associated with food, water, warmth, and safety. As a result, the opportunity to enter the home cage can likely function as a(n) ___ reinforcer.
a. primary  b. preconditioned  c. generalized  d. unconditioned

110. For a professional athlete, admiration from fans is best described as a(n)
a. generalized reinforcer.  b. conditioned reinforcer.  c. primary reinforcer.  d. artificial reinforcer.

111. Money and social attention are common examples of ___ reinforcers.
a. primary  b. secondary  c. unconditioned  d. generalized

112. Behaviors that have been strongly associated with reinforcement can themselves become ___ reinforcers.
a. primary  b. secondary  c. intrinsic  d. discriminated

113. According to the theory of ___, hard work can sometimes function as a secondary reinforcer.
a. learned helplessness  b. acquired industriousness  c. learned industriousness  d. learned acquisitiveness

114. Research has shown that rats that have been reinforced for emitting forceful lever presses will subsequently
a. run faster down an alleyway to obtain food.  b. run more slowly down an alleyway to obtain food.  c. show a generalized tendency to be lazy.  d. both b and c

115. Research has shown that students who have been reinforced for solving complex math problems will subsequently
a. write essays of lower quality.  b. write essays of higher quality.  c. show a decrease in math ability.  d. both a and c

116. Jack works extremely hard at whatever task he is assigned. According to learned industriousness theory, working hard likely serves as a ___ for Jack.
a. conditioned reinforcer  b. secondary reinforcer  c. primary reinforcer  d. both a and b

117. Behaviors performed for their own sake are said to be
a. intrinsically motivated.  b. extrinsically motivated.  c. extrinsically reinforced.  d. innately motivated.

118. Behaviors that are motivated by some added incentive are said to be ___ motivated.
a. intrinsically  b. extrinsically  c. hedonically  d. extraneously

119. Extrinsic rewards are likely to lower intrinsic interest in a task when they are
a. expected.  b. verbal.  c. delivered contingent upon high quality performance.  d. all of the above

120. Extrinsic rewards are likely to raise intrinsic interest when they are
a. nonverbal.  b. tangible.  c. given for high quality performance.  d. both a and b

121. Each time Jana learns a new piece on the piano and can play it without error, she gets to have her favorite dessert for dinner. Chances are that Jana's interest in playing the piano will likely
a. decrease.
b.
increase.
c. remain unchanged.  d. either a or c

122. Extrinsic rewards are less likely to damage intrinsic interest when they are
a. expected.  b. verbal.  c. delivered contingent upon mere performance of the activity.  d. all of the above

123. Jackie praises her daughter each time she does a fine job on her math homework. As a result, her daughter is likely to become
a. more interested in math.  b. less interested in math.  c. focused upon receiving praise from her mother.  d. both b and c

124. Wendy promises to give her son a cookie for each hour that he studies math. As a result, her son could well become
a. more interested in math.  b. less interested in math.  c. less interested in cookies.  d. resistant to her instructions.

125. Suzie notices that her daughter Nina loves to play piano. Suzie decides to encourage her further by promising to pay her a dollar for every extra hour of piano practice in the evening. Chances are that Nina's intrinsic interest in playing the piano will likely
a. decrease.  b. increase.  c. remain unchanged.  d. both b and c

126. A reinforcer that has been deliberately arranged to modify a behavior and is not a common aspect of a certain situation is called a(n) ___ reinforcer.
a. natural  b. artificial  c. contrived  d. both b and c

127. A(n) ___ reinforcer is one that has been deliberately arranged to modify a behavior and is not a natural aspect of a certain situation.
a. artificial  b. extrinsic  c. intrinsic  d. both a and c

128. Intrinsic reinforcers
a. are always natural reinforcers.  b. are always artificial reinforcers.  c. can be either natural or artificial reinforcers.  d. can be neither artificial nor natural reinforcers.

129. Money
a. is always an artificial reinforcer.  b. can be either a natural or artificial reinforcer.  c. can be neither a natural nor artificial reinforcer.  d. is always an intrinsic reinforcer.

130. Seeing a movie is a(n) ___ reinforcer for going to a theatre.
a. artificial  b. natural  c. primary  d. negative

131. Being paid to study is a(n) ___ reinforcer for studying.
a. natural  b. extrinsic  c. artificial  d. both b and c

132. When contrived reinforcers are used in a clinical setting,
a. it is important to consistently maintain them within that setting.  b. the attempt will often be made to withdraw them over time.  c. it is hoped that the behavior will eventually become trapped in the natural contingencies in that environment.  d. both b and c

133. Enjoying yourself at a party is a(n) ___ reinforcer for going to the party.
a. contrived  b. natural  c. extrinsic  d. both a and b

134. The process of reinforcing gradual approximations to a new behavior is known as
a. chaining.  b. shaping.  c. graduated reinforcement.  d. fading.

135. Shaping is the
a. reinforcement of new operant behavior.  b. gradual reinforcement of new operant behavior.  c. reinforcement of gradual approximations to a new behavior.  d. creation of new behavior through gradual reinforcement.

136. Which of the following is an example of shaping?
a. Reinforcing the behavior of lever pressing.  b. Reinforcing gradual approximations to lever pressing.  c. Gradual reinforcement for lever pressing.  d. Reinforcing the rat for gradual approximations to lever pressing.

137. The advantages of using a click or whistle as a secondary reinforcer during shaping include
a. it can be delivered immediately following the correct behavior.  b. the animal will not satiate upon it.  c. both a and b  d. neither a nor b

138. The sound of a click can be an effective tool for shaping after it has been paired with ___, thereby making it a ___.
a. food; secondary reinforcer  b. shock; primary punisher  c. food; primary reinforcer  d. shock; secondary punisher

139. At the zoo one day, you notice a zookeeper coaxing a camel into a pen by blowing a whistle. It is probably the case that the whistle has been paired with ___, and is now functioning as a(n) ___.
a. shock; punisher  b. food; unconditioned stimulus  c. food; secondary reinforcer  d. shock; primary punisher

140. Over time, we are likely to become more and more efficient at washing the dishes. This is mostly the result of
a. primary punishment.  b. positive punishment.  c. chaining.  d. shaping.

141. Over time, Jim gradually becomes more and more efficient at cleaning his apartment. This improvement is most likely an example of what type of process?
a. intermittent reinforcement  b. stimulus control  c. shaping  d. an FR schedule of reinforcement

142. For a male betta splendens, the sight of another male can act as a
a. releasing stimulus.  b. positive reinforcer.  c. both a and b  d. neither a nor b

143. For a male betta splendens, potential reinforcers include
a. the sight of another male.  b. food.  c. a cold stream of water.  d. both a and b
This test prep was uploaded on 04/03/2008 for the course PSY 320, taught by Professor S. Quinn during the Spring '08 term at Salve Regina.
