People think they already know everything they need to make decisions

    Karlston

    • 1 comment
    • 237 views
    • 5 minutes



    When given partial info, most people felt confident they knew all they needed to.

    The world is full of people who have excessive confidence in their own abilities. This is famously captured by the Dunning-Kruger effect, which describes how people who lack expertise in something necessarily lack the knowledge needed to recognize their own limits. Now, a different set of researchers has come out with what might be viewed as a corollary to Dunning-Kruger: people have a strong tendency to believe that they always have enough data to make an informed decision, regardless of what information they actually have.


    The work, done by Hunter Gehlbach, Carly Robinson, and Angus Fletcher, is based on an experiment in which they intentionally gave people only partial, biased information, finding that people never seemed to consider they might only have a partial picture. "Because people assume they have adequate information, they enter judgment and decision-making processes with less humility and more confidence than they might if they were worrying whether they knew the whole story or not," they write. The good news? When given the full picture, most people are willing to change their opinions.

    Ignorant but confident

    The basic setup of the experiment is very straightforward. The researchers developed a scenario where an ongoing water shortage was forcing a school district to consider closing one of its schools and merging its students into another existing school. They then wrote an article that described the situation and contained seven different pieces of information: three that favored merging, three that disfavored it, and one that was neutral. Just over half of the control group that read the full article favored merging the two schools.


    The experimental groups were given an edited version of the article, one where all the facts in favor of one of the options were deleted. In other words, some of them read a version containing three facts favoring merging plus one neutral bit of information; others read three facts that favored keeping both schools open plus the neutral info.


    After reading, half of the experimental group was given a survey that asked whether they felt they had enough information to make a decision, how confident they were in that decision, and whether they expected most people to agree with their choice. Statistically, it was impossible to distinguish these people from those in the control group. They believed they had received all the information needed to make a decision and felt just as strongly as the control group that most people would agree with the choice they made. Those who had received the pro-merger version of the article were even more confident in their decision than the controls.


    The obvious difference was the decisions they made. In the group that had read the article biased in favor of merging the schools, nearly 90 percent favored the merger. In the group that had read the article that was biased by including only information in favor of keeping the schools separate, less than a quarter favored the merger.


    The other half of the experimental population wasn't given the survey immediately. Instead, they were given the article they hadn't yet read, the one that favored the position opposite to that of the article they were initially given. You can view this group as doing the same reading as the control group, just doing it in two stages rather than in a single go. In any case, this group's responses looked a lot like the control group's, with people roughly evenly split between merger and separation. They also became less confident in their decisions.

    It’s not too late to change your mind

    There is one bit of good news about this. When initially forming hypotheses about the behavior they expected to see, Gehlbach, Robinson, and Fletcher suggested that people would remain committed to their initial opinions even after being exposed to a more complete picture. However, there was no evidence of this sort of stubbornness in these experiments. Instead, once people were given all the potential pros and cons of the options, they acted as if they had that information the whole time.


    But that shouldn't obscure the fact that there's a strong cognitive bias at play here. As Gehlbach, Robinson, and Fletcher write, people who assume they have adequate information enter judgment and decision-making with less humility and more confidence than they would if they were worried about not knowing the whole story.


    This is especially problematic in the current media environment. Many outlets have been created with the clear intent of exposing their viewers to only a partial view of the facts—or, in a number of cases, the apparent intent of spreading misinformation. The new work clearly indicates that these efforts can have a powerful effect on beliefs, even if accurate information is available from various sources.


    PLOS ONE, 2024. DOI: 10.1371/journal.pone.0310216


    Source


    RIP Matrix | Farewell my friend  :sadbye:


    Hope you enjoyed this news post.

    Thank you for appreciating my time and effort posting news every day for many years.

    2023: Over 5,800 news posts | 2024 (till end of September): 4,292 news posts



    Recommended Comments

    And sadly this mentality is being exploited more and more every day, all over the Western world, with talking heads on Facebook, Twitter, and YouTube dropping just enough info (in this case, three facts) to convince people that they now have everything they need to know about an issue, then twisting the narrative, finger-pointing at someone we should all "blame," and having the nerve to cry about everyone being divided.

    Logic, philosophy, and reason need to start making a comeback, so people can spot and ignore this shit already.




