An opinion which is difficult to change

George had a small son and played golf. According to one version of the packet, Frank was a successful firefighter who, on the test, almost always went with the safest option. The students were then asked to describe their own beliefs: what sort of attitude toward risk did they think a successful firefighter would have?

The Stanford studies became famous. Thousands of subsequent experiments have confirmed and elaborated on this finding. Rarely has this insight seemed more relevant than it does right now.

Still, an essential puzzle remains: How did we come to be this way? Mercier, who works at a French research institute in Lyon, and Sperber, now based at the Central European University, in Budapest, point out that reason is an evolved trait, like bipedalism or three-color vision. It emerged on the savannas of Africa, and has to be understood in that context.

Reason developed not to enable us to solve abstract, logical problems or even to help us draw conclusions from unfamiliar data; rather, it developed to resolve the problems posed by living in collaborative groups. For any individual, freeloading is always the best course of action.

One of the most famous of these was conducted, again, at Stanford. For this experiment, researchers rounded up a group of students who had opposing opinions about capital punishment. Half the students were in favor of it and thought that it deterred crime; the other half were against it and thought that it had no effect on crime.

The students were asked to respond to two studies. One provided data in support of the deterrence argument, and the other provided data that called it into question. Both studies—you guessed it—were made up, and had been designed to present what were, objectively speaking, equally compelling statistics.

At the end of the experiment, the students were asked once again about their views.

A recent experiment performed by Mercier and some European colleagues neatly demonstrates this asymmetry. Participants were asked to answer a series of simple reasoning problems. They were then asked to explain their responses and were given a chance to modify them if they identified mistakes. The majority were satisfied with their original choices; fewer than fifteen per cent changed their minds in step two.

Once again, they were given the chance to change their responses. About half the participants realized what was going on. Among the other half, people suddenly became a lot more critical. This lopsidedness, according to Mercier and Sperber, reflects the task that reason evolved to perform: to prevent us from getting screwed by the other members of our group.

There was little advantage in reasoning clearly, while much was to be gained from winning arguments. Nor did our forebears have to contend with fabricated studies, or fake news, or Twitter.

Steven Sloman, a professor at Brown, and Philip Fernbach, a professor at the University of Colorado, are also cognitive scientists.

They, too, believe sociability is the key to how the human mind functions or, perhaps more pertinently, malfunctions.

Why is it so difficult to make people change their minds? Written by Maria Cohut, Ph.D.



