
YouTube just banned supremacist content, and thousands of channels are about to be removed


steven36


YouTube is trying to reduce the prevalence of extremist content on the platform

 


 

 

YouTube is changing its community guidelines to ban videos promoting the superiority of any group as a justification for discrimination against others based on their age, gender, race, caste, religion, sexual orientation, or veteran status, the company said today. The move, which will result in the removal of all videos promoting Nazism and other discriminatory ideologies, is expected to result in the removal of thousands of channels across YouTube.

 

“The openness of YouTube’s platform has helped creativity and access to information thrive,” the company said in a blog post. “It’s our responsibility to protect that, and prevent our platform from being used to incite hatred, harassment, discrimination and violence.”

 

The changes announced on Wednesday attempt to improve YouTube's content moderation in three ways. First, the ban on supremacists will remove Nazis and other extremists who advocate segregation or exclusion based on age, gender, race, religion, sexual orientation, or veteran status. In addition to those categories, YouTube is adding caste, which has significant implications in India, and “well-documented violent events,” such as the Sandy Hook elementary school shooting and 9/11. Users are no longer allowed to post videos saying those events did not happen, YouTube said.

 

Second, YouTube said it would expand efforts announced in January to reduce the spread of what it calls “borderline content and harmful misinformation.” The policy, which applies to videos that flirt with violating the community guidelines but ultimately fall short, aims to limit the promotion of those videos through recommendations. YouTube said the policy, which affects videos including flat-earthers and peddlers of phony miracle cures, had already decreased the number of views that borderline videos receive by 50 percent. In the future, the company said, it will recommend videos from more authoritative sources, like top news channels, in its “next watch” panel.

 

Finally, YouTube said it would restrict channels from monetizing their videos if they are found to “repeatedly brush up against our hate speech policies.” Those channels will not be able to run ads or use Super Chat, which lets channel subscribers pay creators directly for extra chat features. The last change comes after BuzzFeed reported that the paid commenting system had been used to fund creators of videos featuring racism and hate speech.

 

In 2017, YouTube took a step toward reducing the visibility of extremists on the platform when it began placing warnings in front of some videos. But it has come under continued scrutiny for the way that it recruits followers for racists and bigots by promoting their work through recommendation algorithms and prominent placement in search results. In April, Bloomberg reported that videos made by far-right creators represented one of the most popular sections of YouTube, along with music, sports, and video games.

 

At the same time, YouTube and its parent company, Alphabet, are under growing political pressure to rein in the bad actors on the platform. The Christchurch attacks in March led to widespread criticism of YouTube and other platforms for failing to immediately identify and remove videos of the shooting, and several countries have proposed laws designed to force tech companies to act more quickly. Meanwhile, The New York Times found this week that YouTube algorithms were recommending videos featuring children in bathing suits to people who had previously watched sexually themed content — effectively generating playlists for pedophiles.

 

YouTube did not disclose the names of any channels that are expected to be affected by the change. The company declined to comment on a current controversy surrounding my Vox colleague Carlos Maza, who has repeatedly been harassed on the basis of his race and sexual orientation by prominent right-wing commentator Steven Crowder. (After I spoke with the company, it responded to Maza that it plans to take no action against Crowder’s channel.)

 

Still, the move is likely to trigger panic among right-wing YouTube channels. In the United States, conservatives have promoted the idea that YouTube and other platforms discriminate against them. Despite the fact that there is no evidence of systematic bias, Republicans have held several hearings over the past year on the subject. Today’s move from YouTube is likely to generate a fresh round of outrage, along with warnings that we are on the slippery slope toward totalitarianism.

 

Of course, as the Maza case has shown, YouTube doesn’t always enforce its own rules. It’s one thing to make a policy, and it’s another to ensure that a global workforce of underpaid contractors accurately understands and applies it. It will be fascinating to see how the new policy, which prohibits “videos alleging that a group is superior in order to justify ... segregation or exclusion,” will affect discussion of immigration on YouTube. The company says that political debates about the pros and cons of immigration are still allowed, but a video saying that “Muslims are diseased and shouldn’t be allowed to migrate to Europe” will be banned.

 

The changed policy goes into effect today, YouTube said, and enforcement will “ramp up” over the next several days.

 

Source



TheEmpathicEar

It's about time. After what happened in Charlottesville, they should have been all over this.



Moved from Technology News.

 

(Not really Technology, nor Security & Privacy. Better here.)



A dark day for freedom. This basically means it will be deleting anyone who is to the right of centre-left. Whilst many actual extremists ARE harmful, we all know this will be aimed at the likes of Sargon, Rebel Media, Lauren Southern, Avi Yemini and Paul Joseph Watson, who are liberals and so dare to speak out against leftist intersectionality.

The leftists who would welcome this ought to imagine if it were the likes of leftists who cry "racism!" at every moment being banned for "making people feel that there is racism." You would rightly think that was wrong. So is this.



They wish to be 'seen' to be doing the right thing. Pootube D.G.A.S., quite frankly. It's Google, after all; they care about nothing but their ad revenue profit.



13 hours ago, xpkRAKE said:

They wish to be 'seen' to be doing the right thing. Pootube D.G.A.S., quite frankly. It's Google, after all; they care about nothing but their ad revenue profit.

Yes, you're right, and I find it funny that people think Google has anything to do with freedom when they're nothing but surveillance capitalism. Google has been banning supremacist content for years, just not on a mass scale like this, so it's not like we didn't see it coming. That doesn't mean it goes away; it just means it moves to other sites that don't ban it. Same as when they banned others like Evalion before: all they did was make her famous. So sometimes banning users has the opposite effect. YouTube and Google are so crazy they even censor themselves.

 

It's just like when they gave away free movies and replaced the R-rated versions with made-for-TV versions. Welcome to YouTube China. :lmao:

https://www.youtube.com/movies?list_id=PLHPTxTxtC0ibVZrT2_WKWUl2SAxsKuKwx

 

While I don't agree with YouTube killing freedom of speech on their platform, I couldn't care less what they do, because Google is a private tech company. They're not owned by any government, so they have the freedom to run their websites as they see fit, just like you have the freedom to go to some other video site that still allows supremacist garbage if you want. If the government steps in and passes laws on how people should run their websites, that will be a dark day for freedom. But for now, if you don't like something a site does, you can just get your videos and news from another site.

 

16 hours ago, TheEmpathicEar said:

It's about time. After what happened in Charlottesville, they should have been all over this.

What YouTube does has no bearing on what goes on offline, nor does it stop rallies like this. In the USA they give supremacists permits to hold rallies, which overall were peaceful from the 1980s through the 2000s. You act like this is something new, but violent rallies used to happen all the time back in the 1950s through the 1970s. And these are just the meetings people see; this doesn't count the thousands of meetings they hold where the public is not invited. If they take supremacist groups' freedom of speech away, those groups will just organize and attack the public like they've done before. Next thing you know, they will be burning a cross in your yard again if they can't speak up legally. Big Tech companies sweeping it under the rug like it doesn't exist doesn't make it go away, and the EU passing laws to have the internet purged of hate speech won't make it go away. It just shows the world has lost touch with reality and doesn't know the difference between an online problem and an offline problem. These groups are trained in guerrilla warfare; they have machine guns and explosives, so they could start a war if they wanted. Most of these groups recruit from the prison systems. They don't need the internet to exist; they were around hundreds of years before the internet even. :rofl:

 

Here is an example of why what YouTube does online doesn't matter in the USA:

 

White Supremacists Keep Beating The Federal Government In Court

https://www.huffpost.com/entry/rise-above-movement-dismissed_n_5cf6d903e4b036433477536b

 

Quote

 

A Los Angeles judge decreed that members of a white supremacist group were protected by constitutional free speech despite their advocacy for violence.


Before this, they claimed Reddit banning these types of subs worked:

https://arstechnica.com/science/2017/09/reddits-campaign-against-hate-speech-worked/

 

They're so full of :shit: they have it coming out their ears. Those users just gang up on other social media platforms that don't care; it looks like half of them went to YouTube, even. It's just like in the workplace: employers dictate what their employees can say, so no one at work can talk about this kind of stuff, and if they do, they get fired. But as soon as they punch the clock and leave, they talk about whatever they want. So it only stops it on the company's property; it doesn't stop it on someone else's property. Next thing you know, they will go to PeerTube or something where you can start your own instance. As far as I know, people have never had any freedom when it comes to other people's property unless the owner wants to give it to them. All places have rules, and rules are subject to change at any time. If you don't like a place's rules, you can always leave, so why would you think the internet would be any different than real life? All it takes is a resentment and a group of people to start a new website, but you may not like their rules either.

 

Reddit is so full of trolls that I don't even sign in much there anymore, and all the extremists from the left and the right are now over on other sites that are kind of like Reddit. Reddit is a very unfriendly site. If someone disagrees with you, they will all attack you and downvote your post, which will make your post get hidden in a spoiler. So you're not allowed to have an opinion that differs from the sub's, and they will talk to the mods and have you banned. It's like if you go into the Linux sub and say, "Boy, I hate Linux": they will attack you and have you banned, because that's hate speech. 🤣


