Name: ___   Class: ___   Date: ___   ID: A

Possible Questions Ch 6: Learning and Behavior

Multiple Choice
Identify the letter of the choice that best completes the statement or answers the question.

1. In ___ conditioning, behavior comes under the control of its consequences.
a. operant
b. classical
c. instrumental
d. both a and c

2. In ___ conditioning, it is what comes ___ the behavior that is critical.
a. operant; after
b. classical; after
c. instrumental; before
d. both b and c

3. Goal-oriented is to automatic as ___ behavior is to ___ behavior.
a. operant; elicited
b. elicited; operant
c. conditioned; unconditioned
d. unconditioned; conditioned

4. In the textbook discussions of operant conditioning, the term "consequence" refers to
a. how we feel about what is happening to us.
b. the change in the probability of the behavior as a result of applying the reinforcer or punisher.
c. the event that follows the behavior and is contingent upon it.
d. the rate of behavior.

5. Antecedent is to ___ conditioning as consequence is to ___ conditioning.
a. classical; operant
b. operant; Pavlovian
c. instrumental; respondent
d. both b and c

6. The first psychologist to systematically investigate the effect of consequences on the strength of a behavior was
a. Skinner.
b. Pavlov.
c. Tolman.
d. Thorndike.

7. Thorndike found that cats learned to escape from a puzzle box
a. gradually.
b. suddenly.
c. with insight.
d. both b and c

8. The original law of effect stated that behaviors leading to a(n) ___ are ___.
a. satisfactory state of affairs; stamped in
b. reinforcer; stamped in
c. positive reinforcer; strengthened
d. unconditioned stimulus; stamped out

9. According to Thorndike's law of effect, behaviors leading to a(n) ___ state of affairs are stamped in, while behaviors leading to a(n) ___ state of affairs are stamped out.
a. annoying; satisfactory
b. satisfactory; annoying
c. irregular; regular
d. regular; irregular

10. Freud believed that humans are motivated to seek pleasure and avoid pain. This notion accords most closely with ___ definition of ___.
a. Thorndike's; the law of effect
b. Skinner's; operant conditioning
c. Thorndike's; operant conditioning
d. Skinner's; the law of effect

11. With his puzzle box experiments, Thorndike discovered that learning is usually a(n) ___ process.
a. sudden
b. unpredictable
c. stressful
d. gradual

12. When first setting out to investigate the behavior of animals, Skinner had originally thought that all behavior could be explained in terms of
a. thoughts and feelings.
b. reflexes.
c. operants.
d. fixed action patterns.

13. Skinner's development of the operant conditioning chamber was partly motivated by his desire to find a procedure that yielded ___ patterns of behavior.
a. inflexible
b. reflexive
c. regular
d. irregular

14. In a standard Skinner box, a ___ earns food by ___.
a. rat; pressing a lever
b. rat; running in a wheel
c. pigeon; pressing a lever
d. pigeon; flapping its wings

15. Skinner's operant conditioning procedure is known as a free operant procedure because the rat
a. is put on a free feeding schedule before the experiment starts.
b. is free to enter and leave the chamber.
c. is free to move about the chamber.
d. freely controls the rate at which it responds for food.
16. Skinner's operant conditioning procedure became known as a(n) ___ procedure because ___.
a. instrumental; the consequences are free
b. free operant; the animal is free to control its response rate
c. instrumental; the animal is free to enter or leave the chamber
d. adjunctive; the experimenter is free to observe the rat's behavior

17. In one variant of a Skinner box, a pigeon earns food by
a. flapping its wings.
b. turning circles.
c. pecking a response key.
d. pressing a lever.

18. Rat is to ___ as pigeon is to ___.
a. lever press; key peck
b. key peck; lever press
c. turning circles; lever press
d. lever press; turning circles

19. Which of the following most closely parallels what happens in a Skinner box?
a. You are in your apartment with nothing to do except raid the refrigerator.
b. You are at home watching television and raiding the refrigerator.
c. You are in a hospital with nothing to do. Meals are served at fixed times during the day.
d. You are in a hospital with lots to do. Meals are served at fixed times during the day.

20. Skinner divided behaviors into two categories:
a. operant and instrumental.
b. conditioned and unconditioned.
c. primary and secondary.
d. operant and respondent.

21. Skinner's restatement of Thorndike's law of effect is
a. less mentalistic.
b. more mentalistic.
c. less precise.
d. both a and c

22. The basic components of the operant conditioning process include
a. a response that produces a certain consequence.
b. a consequence that strengthens or weakens the response.
c. a preceding stimulus that signals the availability of the consequence.
d. all of the above

23. The three components of the operant conditioning process include
a. a response that is automatically elicited by a preceding stimulus.
b. a consequence that strengthens or weakens the response.
c. a preceding stimulus that elicits the response.
d. all of the above

24. Properly speaking, operant behavior is said to be ___ by ___.
a. emitted; the organism
b. elicited; the organism
c. emitted; stimuli
d. elicited; stimuli

25. The operant response is properly described as a(n)
a. emitted behavior.
b. contrived behavior.
c. covert behavior.
d. elicited behavior.

26. Emitted is to elicited as ___ conditioning is to ___ conditioning.
a. classical; operant
b. respondent; classical
c. operant; classical
d. instrumental; operant

27. The behavior of lever pressing for food is said to be
a. elicited by the rat.
b. emitted by the rat.
c. elicited by the food.
d. emitted by the food.

28. Operant behaviors are usually defined as a
a. class of behaviors that are topographically similar.
b. class of behaviors that lead to a certain consequence.
c. specific behavior that leads to a certain consequence.
d. specific behavior that leads to a class of consequences.

29. Behaviorists have found it useful to define operant behaviors as a
a. specific response.
b. covert stimulus.
c. class of responses.
d. unconditioned stimulus.

30. Which of the following conditions must be met for a response to be considered an operant?
a. Its occurrence results in the delivery of a certain consequence.
b. The consequence affects the future probability of the response.
c. The response is elicited by the antecedent stimulus.
d. a and b only

31. Properly speaking, when we give a dog a treat for sitting on command, we are attempting to reinforce
a. the dog.
b. the behavior.
c. the command.
d. our relationship with the dog.
32. Properly speaking, when we praise a child for following instructions, we are attempting to reinforce
a. the child.
b. the instructions.
c. the behavior of following instructions.
d. both b and c

33. From an operant conditioning perspective, chocolate is a reinforcer if it
a. strengthens the behavior that follows it.
b. strengthens the behavior that precedes it.
c. elicits salivation.
d. both b and c

34. Consequence is to process as ___ is to ___.
a. reinforcer; punisher
b. reinforcement; punishment
c. punisher; punishment
d. reinforcement; reinforcer

35. Procedure is to ___ as consequence is to ___.
a. reinforcer; reinforcement
b. reinforcement; punisher
c. punishment; reinforcement
d. reinforcer; punisher

36. The term ___ refers to a process or procedure.
a. reinforcer
b. reinforcement
c. punisher
d. both a and c

37. Suppose a rat presses a lever and receives a food pellet. As a result, it is more likely to press the lever in the future. In this example, the food is functioning as a ___ for lever pressing.
a. reinforcer
b. discriminative stimulus
c. punisher
d. punishment

38. A spanking is a punisher if it
a. follows a behavior.
b. precedes a behavior.
c. decreases the probability of a behavior.
d. both a and c

39. Suppose a rat runs in a wheel and receives a food pellet. The subsequent increase in wheel running as a result of the food delivery is an example of
a. an establishing operation.
b. reinforcement.
c. a reinforcer.
d. punishment.

40. Sam received a traffic fine for speeding the other day. The traffic fine is a ___ for Sam's behavior of speeding.
a. reinforcer
b. punisher
c. conditioned stimulus
d. none of the above; further information is needed to determine the answer

41. When Leena finished her homework, her mother gave her some apple pie. This is obviously an example of
a. positive reinforcement.
b. negative reinforcement.
c. positive punishment.
d. It is impossible to know given the information provided.

42. A dog is given a treat each time it comes when called, and as a result no longer comes when called. The ___ is an example of ___.
a. treat; negative reinforcement
b. treat; a punisher
c. decrease in behavior; a punisher
d. treat; punishment

43. Maria gives her canary some food each time it flutters its wings. The food is a
a. punisher.
b. reinforcer.
c. discriminative stimulus.
d. none of the above; further information is needed to determine the answer

44. Properly speaking, reinforcers and punishers are defined entirely by
a. their intensity.
b. the probability of their occurrence.
c. their effect on behavior.
d. the extent to which they are perceived as pleasant versus unpleasant.

45. Reinforcers and punishers are entirely defined by
a. their hedonic value.
b. the manner in which they influence behavior.
c. the extent to which they are appetitive or aversive.
d. both a and c

46. Reinforcers are ___ the kinds of events that we consider pleasant.
a. often but not always
b. always
c. rarely
d. never

47. If a mother kisses her child whenever he breaks a dish and, as a result, he breaks fewer dishes in the future, the kissing would by definition be a
a. punisher.
b. reinforcement.
c. reinforcer.
d. punishment.

48. An electric shock is a reinforcer if it
a. follows a behavior.
b. precedes a behavior.
c. increases the probability of a behavior.
d. both a and c

49. The withdrawal of reinforcement for a behavior is called
a. extinction.
b. inhibition.
c. dishabituation.
d. negative punishment.

50. The dog no longer receives food for begging and therefore stops begging.
This is an example of
a. blocking.
b. punishment.
c. reinforcement.
d. extinction.

51. An SD is a stimulus that
a. increases the probability of a certain behavior.
b. signals that a reinforcer is now available for the behavior.
c. decreases the probability of a behavior.
d. both a and b

52. When Hai visits his parents, he whines a lot about how unappreciated he is at work. It seems likely that his parents are ___ for whining.
a. discriminative stimuli
b. reinforcers
c. reinforcement
d. conditioned stimuli

53. A(n) ___ is a stimulus that "sets the occasion for" a behavior.
a. CS
b. SD
c. SR
d. S∆

54. A(n) ___ stimulus serves as a signal that a response will be followed by a reinforcer.
a. operant
b. discriminative
c. conditioned
d. appetitive

55. A restaurant sign can be viewed as a(n) ___ for entering the restaurant and getting a hamburger.
a. SD
b. US
c. SR
d. CS

56. A simple way of thinking about the three-term contingency is that you (in correct order)
a. notice something, get something, and do something.
b. do something, notice something, and get something.
c. get something, notice something, and do something.
d. notice something, do something, and get something.

57. In the three-term contingency, the antecedent is the
a. reinforcer.
b. operant response.
c. discriminative stimulus.
d. conditioned stimulus.

58. In correct order, the three-term contingency consists of
a. antecedent, consequence, and behavior.
b. antecedent, behavior, and consequence.
c. consequence, behavior, and antecedent.
d. behavior, antecedent, and consequence.

59. A stimulus which signals that a response will be punished is a(n)
a. conditioned stimulus for punishment.
b. unconditioned stimulus for punishment.
c. discriminative stimulus.
d. discriminative stimulus for punishment.

60. The statement, "Don't you dare try it!", would for most people be a(n)
a. discriminative stimulus for reinforcement.
b. unconditioned stimulus for fear.
c. discriminative stimulus for punishment.
d. discriminative stimulus for fear.

61. Unlike classically conditioned behavior, operant behavior is
a. typically seen as voluntary and flexible.
b. said to be elicited by the stimulus.
c. both a and b
d. neither a nor b

62. Unlike classical conditioning, operant conditioning involves a(n)
a. S-S-R sequence.
b. is a function of what comes before it.
c. both a and b
d. neither a nor b

63. To determine if operant conditioning is involved, the most critical question to ask is whether the occurrence of the behavior is mostly a function of
a. the stimulus that precedes it.
b. the stimulus that follows it.
c. the person.
d. the environment.

64. A contingency of reinforcement means that
a. a response is followed by a reinforcer.
b. a reinforcer is followed by a response.
c. a response is elicited by a reinforcer.
d. a response is elicited by an SD.

65. When combined with the terms reinforcement or punishment, the word positive means
a. something that is appetitive.
b. something that is subtle.
c. something is added or presented.
d. both a and b

66. When combined with the terms reinforcement or punishment, the word negative means
a. something that is good.
b. something that is intense.
c. something that is unpleasant.
d. something is subtracted or withdrawn.

67. With respect to the four types of contingencies, add is to subtract as ___ is to ___.
a. desire; hate
b. positive; negative
c. negative; positive
d. hate; desire

68. Increase is to decrease as ___ is to ___.
a. reinforcement; punishment
b. punishment; reinforcement
c. antecedent; consequence
d. consequence; antecedent

69. The term ___ refers to the presentation of a stimulus following a response, which then leads to an increase in the future strength of that response.
a. positive reinforcement
b. negative reinforcement
c. positive punishment
d. negative punishment

70. The term positive reinforcement refers to the ___ of a stimulus following a response, which then leads to a(n) ___ in the future strength of that response.
a. removal; increase
b. presentation; decrease
c. presentation; increase
d. removal; decrease

71. The pigeon pecks a response key and receives food. As a result, the probability of key pecking increases. This is an example of
a. positive reinforcement.
b. negative reinforcement.
c. negative punishment.
d. positive punishment.

72. Andre praises his young daughter for being assertive, after which she becomes even more assertive. This is an example of
a. negative reinforcement.
b. positive reinforcement.
c. negative punishment.
d. positive punishment.

73. John yells at his dog whenever it barks. As a result, the dog begins barking even more frequently. This is an example of
a. positive punishment.
b. negative reinforcement.
c. negative punishment.
d. positive reinforcement.

74. Paula spanks her child when he breaks a dish. As a result, he breaks dishes even more frequently. This is an example of
a. negative punishment.
b. negative reinforcement.
c. positive reinforcement.
d. positive punishment.

75. The term ___ refers to the removal of a stimulus following a response, which then leads to an increase in the future strength of that response.
a. positive reinforcement
b. negative reinforcement
c. positive punishment
d. negative punishment

76. The term negative reinforcement refers to the ___ of a stimulus following a response, which then leads to a(n) ___ in the future strength of that response.
a. removal; increase
b. presentation; decrease
c. presentation; increase
d. removal; decrease

77. When I banged on the heating pipe, it stopped making a noise. The next time I heard that noise, I immediately banged on the pipe. This seems to be an example of
a. positive reinforcement.
b. negative reinforcement.
c. positive punishment.
d. negative punishment.

78. Jason's mother tells him: "If you clean your room, you won't have to do the dishes." What type of contingency is she attempting to apply?
a. positive reinforcement
b. negative reinforcement
c. positive punishment
d. negative punishment

79. "I'll do anything to avoid housework." This statement speaks to the power of
a. positive reinforcement.
b. negative reinforcement.
c. positive punishment.
d. negative punishment.

80. A(n) ___ response occurs before the aversive stimulus is presented and thereby prevents its delivery.
a. escape
b. avoidance
c. reflexive
d. primary

81. A(n) ___ response results in the termination of an aversive stimulus.
a. escape
b. avoidance
c. reflexive
d. primary

82. Putting on a heavy parka before going out into the cold is an example of a(n) ___ response, while putting it on after you go outside and become cold is an example of a(n) ___ response.
a. operant; reflexive
b. avoidance; escape
c. escape; avoidance
d. reflexive; operant

83. The term ___ refers to the presentation of a stimulus following a response, which then leads to a decrease in the future strength of that response.
a. positive reinforcement
b. negative reinforcement
c. positive punishment
d. negative punishment
84. The term positive punishment refers to the ___ of a stimulus following a response, which then leads to a(n) ___ in the future strength of that response.
a. removal; increase
b. presentation; decrease
c. presentation; increase
d. removal; decrease

85. Jim compliments his secretary on her sexy new outfit when she offers to bring him coffee one morning. She never again offers to bring him coffee. Out of the following, this is an example of which type of process?
a. positive reinforcement
b. negative reinforcement
c. positive punishment
d. negative punishment

86. When Pedro punched his sister, she punched him back. He never again punched her. This seems to be an example of what process?
a. positive punishment
b. negative reinforcement
c. positive reinforcement
d. negative punishment

87. When Pedro teased his sister, she hugged him. He never again teased her. This seems to be an example of what process?
a. positive reinforcement
b. negative reinforcement
c. positive punishment
d. negative punishment

88. A stimulus that can serve as a negative reinforcer can probably also serve as a
a. negative punisher.
b. positive punisher.
c. positive reinforcer.
d. unconditioned reinforcer.

89. Most people would be least likely to volunteer for an experiment on
a. positive reinforcement.
b. appetitive conditioning.
c. positive punishment.
d. negative punishment.

90. The term ___ refers to the removal of a stimulus following a response, which then leads to a decrease in the future strength of that response.
a. positive reinforcement
b. negative reinforcement
c. positive punishment
d. negative punishment

91. The term negative punishment refers to the ___ of a stimulus following a response, which then leads to a(n) ___ in the future strength of that response.
a. removal; increase
b. removal; decrease
c. presentation; increase
d. presentation; decrease

92. Melissa stayed out past her curfew and subsequently lost car privileges for a week. As a result, she never again stayed out past her curfew. This example best illustrates the process of
a. positive reinforcement.
b. negative reinforcement.
c. positive punishment.
d. negative punishment.

94. Felix swore at his girlfriend during an argument one day, after which she wouldn't talk to him for a week. As a result, he became much less likely to swear at her. This is best described as an example of
a. positive reinforcement.
b. negative reinforcement.
c. positive punishment.
d. negative punishment.

95. The use of punishment can be quite seductive in that its delivery is often followed by ___ for the person who delivered it.
a. immediate positive reinforcement
b. immediate negative reinforcement
c. delayed positive punishment
d. delayed...