YouTube Bans All Anti-Vaxx Content in Sweeping Bid to Suppress Misinformation


    YouTube is banning all anti-vaccine content on its platform, including misinformation about approved vaccines for common illnesses in addition to COVID-19, the company said Wednesday.

     

The Google-owned social media platform will remove any video that describes well-known vaccines approved by federal health officials as harmful, it said in a blog post first reported by The Washington Post.

     

    That includes content claiming vaccines can cause autism, cancer, infertility, or can allow the recipient of the vaccine to be tracked via microchip.

     

YouTube had previously banned false information about the coronavirus vaccines in October 2020. The company said it will still allow discussion of vaccine policies, new vaccine trials, and personal accounts of receiving a vaccine.

     

    A YouTube spokesperson also confirmed to Insider that the company will remove the accounts of high-profile anti-vaxxers like Robert F. Kennedy Jr., the nephew of former President John F. Kennedy, and anti-vaccine activist and author Joseph Mercola.

     

Kennedy Jr. was one of 12 people a recent report identified as the most prolific spreaders of COVID-19 disinformation online.

     

    Wednesday's expansion of rules related to vaccine content marks a major change in how the company handles content on its service.

     

"Developing robust policies takes time," Matt Halprin, YouTube's vice president of global trust and safety, told the Post. "We wanted to launch a policy that is comprehensive, enforceable with consistency and adequately addresses the challenge."

     

    YouTube and other social media companies have long taken a hands-off approach to moderating content.

     

    But pressure has increased from regulators and the general public in recent years, especially amid the pandemic and 2020 presidential election, for platforms to more actively police disinformation on their websites.

     

    Facebook and Twitter have also moved to limit the spread of COVID-19 vaccine misinformation online.

     

Still, false content has leaked through: private groups devoted to discussing and taking unproven COVID-19 treatments such as ivermectin, a drug commonly used to deworm horses, proliferated, Insider reported in early September.

     

    Companies also began cracking down on former President Donald Trump's false statements in 2020, thrusting the topic of social media platforms' content moderation into an ongoing political war.

     

    This article was originally published by Business Insider.

     
