In a debate, some people may be not simply wrong but confidently, flat-out wrong, and for psychological reasons.
A study published Wednesday in the journal PLOS One suggests the root of the problem is thinking you know all you need to know to make an informed decision, even when you don’t.
Angus Fletcher, an English professor at Ohio State University and co-author of the study, stated, “Our brains are overconfident that they can arrive at a reasonable conclusion with very little information.”
Fletcher and two psychologists set out to examine how people judge other people or situations based on how confident they are in the information they have, even when that information is incomplete. “People make snap judgments,” he remarked.
The researchers recruited almost 1,300 participants with an average age of around 40. All of them read a fictional story about a school that was losing water because a nearby aquifer was drying up.
About 500 people read a version of the story advocating that the school merge with another institution; it gave three reasons in favor of the move and one neutral point.
Another 500 read a version with three reasons in favor of staying separate, plus the same neutral point.
The remaining roughly 300 participants, the control group, read a balanced story that presented all seven arguments: three for merging, three for staying separate, and one neutral.
After they finished reading, the researchers asked the participants what they thought the school should do and how confident they were in their ability to make that decision.
According to the surveys, most respondents were far more likely to agree with the argument they had read, whether it supported merging or staying separate, and they often said they were confident they had enough information to support that position. Those who had read only one point of view were also more likely to say they were confident in their opinion than those in the control group who read both sides.
Half of the participants in each group were then asked to read the article from the opposing side, which contradicted the one they had read earlier.
Even people who had been sure of their views after reading arguments for only one course of action were frequently open to changing their minds once they had all the information. Respondents also said that seeing the full picture made them feel less capable of forming an opinion on the matter.
According to Fletcher, the research shows that people don’t always consider whether they have all the information available. “We thought that people would really stick to their original judgments even when they received information that contradicted those judgments,” Fletcher said. However, it turns out that if they learned something that seemed plausible to them, they were willing to totally change their minds.
The researchers did point out that the results might not hold true in circumstances where people have preconceived notions about a subject, which is frequently the case in politics.
According to Fletcher, “people are more receptive to new ideas and more willing to change their minds than we think.” But that same flexibility doesn’t extend to “long-standing differences, like political beliefs.”
Todd Rogers, a behavioral scientist at the Harvard Kennedy School of Government, compared the results to the “invisible gorilla” study, which demonstrated the psychological phenomenon known as “inattentional blindness”: when a person focused on one thing fails to notice something obvious.
According to Rogers, “this study uses information to capture that.” “It appears that there is a cognitive propensity to fail to recognize that the information we have is insufficient.”
According to Barry Schwartz, a psychologist and professor emeritus of social theory and social action at Swarthmore College in Pennsylvania, the study also bears similarities to a psychological phenomenon known as the “illusion of explanatory depth,” in which people overestimate their knowledge of a given subject.
The idea is that if you ask people whether they understand how a toilet works, most will probably say yes. But when asked to explain how one actually works, they soon realize that all they really know is how to flush it by pressing a lever.
The issue, according to Schwartz, is not only that people are mistaken, but that they are so certain of their errors.
The remedy, he added, is “being curious and being humble.”
Both Schwartz and the researchers found it interesting and reassuring that participants who were later given additional information were willing to change their beliefs, as long as the new information made sense to them.
According to Schwartz, “there is a glimmer of hope that, despite people’s convictions about what they know, they are amenable to change after seeing new information.”