Problem-Solving
Problem-solving refers to the mental activity people undertake to reach a goal. Educational psychologist James Greeno identified three kinds of problems.
- The first, called inducing structure, involves teasing out relations among numbers, words, symbols, or ideas. Examples include solving math problems and completing analogies.
- The second type of problem involves arrangement: organizing parts into a whole, as in unscrambling letters to form a word or piecing together a jigsaw puzzle.
- The third type of problem requires transformation: the solver carries out or creates a sequence of changes to arrive at a solution. One example is the classic riddle in which a farmer must transport a fox, a chicken, and a sack of corn across a river, taking only one at a time. Left unsupervised, the chicken will eat the corn, and the fox will eat the chicken. Solving the riddle requires working through a series of steps, and it requires the problem solver to recognize that some items must cross the river more than once (a search-based sketch of the riddle follows this list).
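One way to make the transformation structure concrete is to treat each arrangement of farmer and items as a state and search for a safe sequence of crossings. The following Python sketch is illustrative only; the state encoding and helper names such as `is_safe` are my own, not part of the original riddle.

```python
from collections import deque

ITEMS = {"fox", "chicken", "corn"}

def is_safe(bank, farmer_present):
    # A bank is unsafe only when the farmer is away and predator meets prey.
    if farmer_present:
        return True
    return not ({"fox", "chicken"} <= bank or {"chicken", "corn"} <= bank)

def solve():
    start = (frozenset(ITEMS), "left")   # everything begins on the left bank
    goal = (frozenset(), "right")        # everything ferried to the right bank
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        (left, farmer), path = queue.popleft()
        if (left, farmer) == goal:
            return path
        here = left if farmer == "left" else ITEMS - left
        for cargo in [None, *here]:      # cross alone or with one item
            new_left = set(left)
            if cargo is not None:
                (new_left.remove if farmer == "left" else new_left.add)(cargo)
            new_farmer = "right" if farmer == "left" else "left"
            state = (frozenset(new_left), new_farmer)
            right = ITEMS - state[0]
            if (is_safe(state[0], new_farmer == "left")
                    and is_safe(right, new_farmer == "right")
                    and state not in seen):
                seen.add(state)
                queue.append((state, path + [cargo or "nothing"]))

print(solve())
# e.g. ['chicken', 'nothing', 'fox', 'chicken', 'corn', 'nothing', 'chicken']
```

Notice that the shortest solution carries the chicken across twice, which is exactly the insight the riddle demands.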
Sometimes problems are mixed or fall outside Greeno's categorization altogether.
There are many barriers to effective problem-solving. Irrelevant information can distract people from a solution when they fail to recognize it as irrelevant. Functional fixedness refers to the inability to recognize that an object can be used in nontraditional ways, such as using a wrench as a paperweight or to prop open a door. Rigid thinking (a fixed perspective that is stubbornly maintained) also inhibits problem-solving.
A trial-and-error approach, in which people try possibilities more or less at random, can solve many problems. However, an algorithm, a step-by-step procedure guided by logic, is often more efficient; a toy comparison follows this paragraph. Unlike humans, computers can run complex algorithms at lightning speed, which is one reason they can beat the world's best human chess players.
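As a toy illustration of that contrast (my example, not one from the text), compare randomly guessing a hidden number with a binary search that halves the candidate range on every step:

```python
import random

SECRET = 734  # hypothetical hidden number between 1 and 1000

def trial_and_error():
    # Randomly try candidates until one happens to be correct.
    tries = 0
    while True:
        tries += 1
        if random.randint(1, 1000) == SECRET:
            return tries

def binary_search():
    # Halve the remaining range on each guess, guided by feedback.
    low, high, tries = 1, 1000, 0
    while True:
        tries += 1
        guess = (low + high) // 2
        if guess == SECRET:
            return tries
        if guess < SECRET:
            low = guess + 1
        else:
            high = guess - 1

print(trial_and_error())  # on average about 1,000 random tries
print(binary_search())    # at most 10 guesses for a range of 1,000
```

Both approaches eventually succeed, but the algorithm's guess count grows logarithmically with the size of the search space while random trial and error grows linearly.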
Heuristics and Biases
A heuristic is a mental shortcut that simplifies decisions by replacing a difficult question with an easier one. Heuristics are innate, or hardwired into the human brain. They often work well for quick, simple decisions, but they can also lead to biased decision-making.
The affect heuristic refers to a mental shortcut in which a person makes a decision based on emotions rather than a reasoned evaluation of potential risks and rewards. For example, a medical procedure described as having a 3 in 10 chance of causing serious side effects feels risky, while the same procedure described as having a 7 in 10 chance of being free of serious side effects feels safe. The odds are identical (a 7 in 10 chance of no side effects is exactly a 3 in 10 chance of side effects), but the first framing triggers fear, making people less likely to choose the procedure.
The availability heuristic involves relying on the most easily retrieved information rather than on all available information. For example, people assume there are more English words that begin with k than words that have k as the third letter, simply because words that begin with k are far easier to call to mind; in fact, there are more English words with k in the third position (a quick word-list check appears below). Intense media coverage of shark attacks may lead beachgoers to fear sharks more than the much greater risk of drowning. Companies depend on the availability heuristic when they saturate markets with advertising in the hope that their product will be the first to come to mind.
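A quick way to check the k-words claim is to count both kinds of words in a dictionary file. This sketch assumes a Unix-style word list at /usr/share/dict/words; the path, and the exact counts, will vary by system.

```python
# Count k-initial words versus third-position-k words in a word list.
# Assumes a Unix-style dictionary file; path and counts vary by system.
with open("/usr/share/dict/words") as f:
    words = {w.strip().lower() for w in f if w.strip()}

starts_with_k = sum(w.startswith("k") for w in words)
third_letter_k = sum(len(w) >= 3 and w[2] == "k" for w in words)

print(f"begin with k: {starts_with_k}, k in third position: {third_letter_k}")
```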
The representativeness heuristic involves making assumptions about individual persons or things based on mental prototypes. Stereotyping makes use of this heuristic. For example, most people would not expect a construction worker to be an opera fan, since people associate appreciation of "high art" with the upper classes.
Biases are also part of people's schemas and can skew their pictures of the world. The fundamental attribution error occurs when someone overweights character traits and underweights context and environment when judging the causes of another person's behavior. For example, people may assume a driver ran a red light because he is a reckless jerk, overlooking the fact that the sun was glaring in his eyes.
The actor-observer effect occurs when people attribute their own behavior to context but attribute the behavior of others to personality traits. For example, a student may believe they failed a test because it was unreasonably hard but think their annoying roommate failed due to laziness or stupidity. The self-serving bias involves attributing success to personal traits but failure to outside forces. For example, someone might believe they passed a test because they are smart but believe they failed it because the instructor doesn't like them.
Hindsight bias refers to the perception of having known all along what the outcome of a particular event would be. For example, after an underdog team wins a game, its fans may claim that they saw the victory coming. Hindsight bias may even distort people's memory of earlier predictions: a fan may truly believe that they'd expected their team to win all along, even if they'd passed up tickets to the game a week earlier because they didn't want to watch their team get crushed. Confirmation bias occurs when a person considers only evidence that supports their existing view. For example, someone with a strong policy opinion may discount research showing that the policy is unwise by assuming the researchers are biased or incompetent.
Decision-Making Approaches
Problem-solving and decision-making go hand in hand, and sometimes they are practically synonymous. Decision-making involves evaluating alternatives and choosing a course of action. When people do not use heuristics or intuition to make decisions, they rely on reasoning. Deductive reasoning moves from general premises to a specific conclusion that must be true if the premises are true. For example, all birds have feathers and cardinals are birds, so cardinals must have feathers. Inductive reasoning begins with observations and ends with a generalization or hypothesis that fits them. For example, a person who breaks out in hives each time they eat shellfish will likely conclude that they are allergic to shellfish. Induction underlies the scientific approach, but in everyday life people often generalize from limited evidence, confining their observations to things that confirm preexisting biases. Some findings in cognitive neuroscience suggest that decisions are made by the unconscious mind before they appear in the conscious mind, which calls into question the idea that people are the conscious authors of their own lives.
Psychologists have identified two types of thinking. During fast thinking (system 1 thinking), the mind uses instinct, intuition, schemas, and heuristics to reach rapid conclusions. For example, a pedestrian sees a car careening down the street and dives out of the way. In theory, human beings employ slow thinking (system 2 thinking), or careful, analytical reasoning, to make complex decisions. In practice, people make many logical errors because system 1 thinking often guides or overrides system 2 thinking. For example, people are often fooled by the way a problem is framed: consumers are more likely to buy products labeled 90 percent fat-free than products labeled as containing 10 percent fat. The two labels convey the same information, but the positive spin of the first framing is much more appealing to a decision-maker.
Economist Herbert Simon suggested that "bounded rationality" limits people's decision-making. Because people are constrained by both their cognitive abilities and their knowledge, they settle for "good enough" decisions. Behavioral economist Richard Thaler later argued that willpower is also bounded, partly because people pay more attention to present concerns than to future ones.