nsane.forums Posted May 29, 2013

Under pressure from women's rights groups, Facebook pledges to make changes.

Facebook has announced plans to renew its effort toward monitoring, and where appropriate, removing gender-related hate speech from its users, per a post on the company’s Facebook Safety page Tuesday. In its most recent battle, Facebook appears to be trying to differentiate what is “cruel and insensitive” from what is “distasteful humor” in order to answer complaints from groups including Women, Action, and the Media.

WAM wrote an open letter to Facebook on May 21 asserting that the company seems to apply its hate speech mandates unevenly when that hate speech is gender-based. The group cites several Facebook fan pages, including “Fly Kicking Sluts in the Uterus” and “Raping your Girlfriend,” which have now been removed but were presumably present at the time of WAM’s writing.

WAM claims that pages like these and others that constitute hate speech toward women are allowed to exist while similar hate speech pages based on religion, race, and sexual orientation are quickly moderated. WAM cites hateful images or content that get a media spotlight as the exception:

“You have also acted inconsistently with regards to your policy on banning images, in many cases refusing to remove offensive rape and domestic violence pictures when reported by members of the public, but deleting them as soon as journalists mention them in articles, which sends the strong message that you are more concerned with acting on a case-by-case basis to protect your reputation than effecting systemic change and taking a clear public stance against the dangerous tolerance of rape and domestic violence.”

Facebook explicitly mentions WAM in its response and acknowledges that its rules on hate speech may be unevenly applied when the content is gender-based.
“In recent days, it has become clear that our systems to identify and remove hate speech have failed to work as effectively as we would like, particularly around issues of gender-based hate,” Facebook wrote.

The company dances around the issue of defining hate speech versus insensitive humor at length, and without any real conclusion, within the post. “Humor” is cited twice as a confounding factor in what is and is not hate speech. The confusion over that distinction is one the Internet has been painfully aware of lately, with uproar over a rape joke told by Daniel Tosh that sparked discussion of jokes about rape versus jokes about rape culture, as well as criticism of the broadly rape-themed comedy of Sam Morril.

“We work hard to remove hate speech quickly, however there are instances of offensive content, including distasteful humor, that are not hate speech according to our definition,” Facebook says. “In these cases, we work to apply fair, thoughtful, and scalable policies.”

Facebook most clearly applies discipline, it says, when the hate speech is oriented toward action: for instance, a page used to plan hate crimes. But the company noted this standard is not evenly applied, and a lot of non-gender-based discriminatory content gets removed even if it is not specifically organizing action and is just non-specific, knuckle-dragging hatred. The post states that recently, gender-related hate speech has been flagged but not removed in a timely fashion. Other times, “content that should be removed has not been or has been evaluated using outdated criteria.”

Going forward, Facebook states that it will review and update the guidelines its user operations team uses to identify hate speech. The company will also tweak its training to reflect new standards for what constitutes hate speech.
Facebook noted that it is testing a feature that requires a user who claims a “hate speech” page is actually merely “cruel and insensitive humor” to associate it with their “authentic identity” in order for it to remain on Facebook.

“These are complicated challenges and raise complex issues,” Facebook said of the problems its woman-hating users have raised. The company will work with WAM and Everyday Sexism to “identify resources or highlight areas of particular concern for inclusion in the training” of employees in charge of targeting and removing the offending content.

View: Original Article