Google Finds ‘Inoculating’ People Against Misinformation Helps Blunt Its Power

British researchers and a team from Google found that teaching people how to spot misinformation made them more skeptical of it.

    In the fight against online misinformation, falsehoods have key advantages: They crop up fast and spread at the speed of electrons, and there is a lag period before fact checkers can debunk them.


    So researchers at Google, the University of Cambridge and the University of Bristol tested a different approach that tries to undermine misinformation before people see it. They call it “pre-bunking.”


    The researchers found that psychologically “inoculating” internet users against lies and conspiracy theories — by pre-emptively showing them videos about the tactics behind misinformation — made people more skeptical of falsehoods afterward, according to an academic paper published in the journal Science Advances on Wednesday. But effective educational tools still may not be enough to reach people with hardened political beliefs, the researchers found.


    Since Russia spread disinformation on Facebook during the 2016 election, major technology companies have struggled to balance concerns about censorship with fighting online lies and conspiracy theories. Despite an array of attempts by the companies to address the problem, it is still largely up to users to differentiate between fact and fiction.


The strategies and tools being deployed during the midterm vote in the United States this year by Facebook, TikTok and other companies often resemble tactics developed to deal with misinformation in past elections: partnerships with fact-checking groups, warning labels, portals with vetted explainers, as well as post removal and user bans.


    Social media platforms have made attempts to pre-bunk before, though those efforts have done little to slow the spread of false information. Most have also not been as detailed — or as entertaining — as the videos used in the studies by the researchers.


    Twitter said this month that it would try to “enable healthy civic conversation” during the midterm elections in part by reviving pop-up warnings, which it used during the 2020 election. Warnings, written in multiple languages, will appear as prompts placed atop users’ feeds and in searches for certain topics.


    The new paper details seven experiments with almost 30,000 total participants. The researchers bought YouTube ad space to show users in the United States 90-second animated videos aiming to teach them about propaganda tropes and manipulation techniques. A million adults watched one of the ads for 30 seconds or longer.


To help them spot lies, the users were taught about tactics such as scapegoating and deliberate incoherence, the use of conflicting explanations to assert that something is true. Researchers tested some participants within 24 hours of seeing a pre-bunk video and found a 5 percent increase in their ability to recognize misinformation techniques.


    One video opens with a mournful piano tune and a little girl grasping a teddy bear, as a narrator says, “What happens next will make you tear up.” Then the narrator explains that emotional content compels people to pay more attention than they otherwise would, and that fear-mongering and appeals to outrage are keys to spreading moral and political ideas on social media.


    The video offers examples, such as headlines that describe a “horrific” accident instead of a “serious” one, before reminding viewers that if something they see makes them angry, “someone may be pulling your strings.”


    Beth Goldberg, one of the paper’s authors and the head of research and development at Jigsaw, a technology incubator within Google, said in an interview that pre-bunking leaned into people’s innate desire to not be duped.


    “This is one of the few misinformation interventions that I’ve seen at least that has worked not just across the conspiratorial spectrum but across the political spectrum,” Ms. Goldberg said.


    Jigsaw will start a pre-bunking ad campaign on YouTube, Facebook, Twitter and TikTok at the end of August for users in Poland, Slovakia and the Czech Republic, meant to head off fear-mongering about Ukrainian refugees who entered those countries after Russia invaded Ukraine. It will be done in concert with local fact checkers, academics and disinformation experts.


The researchers don’t have plans for similar pre-bunking videos ahead of the midterm elections in the United States, but they hope other tech companies and civil society groups will use their research as a template for addressing misinformation.


However, pre-bunking is not a silver bullet. The tactic was not effective on people with extreme views, such as white supremacists, Ms. Goldberg said. She added that elections were tricky to pre-bunk because people had such entrenched beliefs. The effects of pre-bunking last only a few days to a month.


    Groups focused on information literacy and fact-checking have employed various pre-bunking strategies, such as a misinformation-identifying curriculum delivered over two weeks of texts, or lists of bullet points with tips such as “identify the author” and “check your biases.” Online games with names like Cranky Uncle, Harmony Square, Troll Factory and Go Viral try to build players’ cognitive resistance to bot armies, emotional manipulation, science denial and vaccine falsehoods.


    A study conducted in 2020 by researchers at the University of Cambridge and at Uppsala University in Sweden found that people who played the online game Bad News learned to recognize common misinformation strategies across cultures. Players in the simulation were tasked with amassing as many followers as possible and maintaining credibility while they spread fake news.


    The researchers wrote that pre-bunking worked like medical immunization: “Pre-emptively warning and exposing people to weakened doses of misinformation can cultivate ‘mental antibodies’ against fake news.”


Tech companies, academics and nongovernmental organizations fighting misinformation have the disadvantage of never knowing what lie will spread next. But Prof. Stephan Lewandowsky of the University of Bristol, a co-author of Wednesday’s paper, said propaganda and lies were predictable, nearly always created from the same playbook.


“Fact checkers can only rebut a fraction of the falsehoods circulating online,” Professor Lewandowsky said in a statement. “We need to teach people to recognize the misinformation playbook, so they understand when they are being misled.”

Source

[Note: Registration or email address is required.]

