This week, Facebook earned a lot of good press when they finally decided to adjust their community standards. Previously, photos of breastfeeding women would get deleted after one or two people reported them as inappropriate, while entire Facebook pages promoting violence against women and rape went unchallenged, no matter how many people reported them. When the Women's Media Center confronted Facebook on this hypocritical interpretation of their terms of service and organized an advertiser boycott of the company, many were shocked to see Facebook actually respond and announce a major change in how the site deals with problematic content. Facebook revealed that they would no longer censor photos of breastfeeding mothers, and that they would be more active in deleting content that promotes or makes light of violence against women.
Is this new policy actually working, though? Does Facebook have a system in place that allows site users to report problematic content and see it deleted?
Unfortunately, it became clear yesterday that Facebook has not figured out how to respond to content that violates its community standards. It also became clear that the only real way to get Facebook to remove violent sexual content is to make the masses aware of that violent sexual content.
On Tuesday, Liz Boltz Ranfeld, a writer and English professor, stumbled across a now-deleted page for the red light district of Sonagachi in Kolkata, India, while working on a series of posts about her recent volunteer work there. Shocked to find a “community” page that openly promoted prostitution, had graphic photographs of prostitutes engaged in sexual acts, and included some photos of women who appeared to be underage, she sent a message to a small group of friends asking them to report the page. Slay Belle, who is an editor and author for Persephone Magazine (where Liz is also a contributor) and a contributor for Jezebel's Powder Room, suggested bringing the page to the attention of a larger group of mutual friends and acquaintances.
Within a few hours, everyone who had reported the Facebook page received the same message from a Facebook staffer or robot called Viki. The message explained that Facebook could not verify that the Sonagachi community page violated the site's community standards. No one who received this message could believe what they were reading. If pictures that showed a prostitute performing a blow job didn't violate Facebook's TOS, what did?
Disheartened, Liz and Slay both turned to their respective social media outlets to tell others about Facebook’s inaction. Liz posted an article to her personal blog, and Slay posted to the Powder Room blog at Jezebel.com. They encouraged their friends and readers to spread the word that Facebook was ignoring blatant pornographic material and sex trafficking right there on its own site. In addition, Slay, Liz, and many others reached out through Twitter to Women's Media Center, Feministing, Jezebel, and individuals such as Jessica Valenti and Hugo Schwyzer, asking them to promote the response that Facebook had been sending out to people who reported the page.
This is where the story takes a turn. Most people, when reporting a Facebook page they find problematic, don't get to spread the word much further than their immediate circle of Facebook friends. Most people aren't plugged into a network of feminist social media junkies who immediately take up the cause.
Persephone Magazine reran Liz's blog post and sent the story out to its followers on social media. Several Tumblr posts appeared complaining about the “Viki” emails. The story was picked up by a number of feminist bloggers. Hugo Schwyzer retweeted it to his followers, as did WAM. And still the page remained up, and Facebook users who reported the page or the individual images were getting flagged as “spam” reporters. A Change.org petition was started. Liz's post was picked up by several Christian anti-sex-trafficking blogs and Oral Roberts University students. All the while, the hundreds of people who reported the page continued to get the same message from Viki: the Sonagachi community page was not in violation of the community standards.
It didn't matter that hundreds of Facebook users were providing feedback that linked directly to photos of women performing oral sex on men, women being advertised with prices for a night of sex, a photo series of a teenager losing her virginity, a toddler-aged child in a sex pose with a young girl, or fans posting questions about what kind of women they could pay to have sex with.
So when did Facebook decide to finally take down the page?
Within minutes of the coverage hitting these two large media outlets, the Facebook page — which hadn't violated any community standards! — disappeared, with nary a comment from Facebook or Viki about it.
Clearly, the individuals or algorithm responsible for responding to Facebook reports didn't care one bit about naked women, children engaged in sexual behavior, or human sex trafficking until some big names in social media got involved.
In that sense, the media campaign was a success. Facebook did what Slay, Liz, and their friends wanted: it removed a page that openly engaged in sex trafficking. While the removal of the Sonagachi community page from Facebook is a small victory, the story of how it came to be removed is a disheartening one.
Although there is no way for us to know how many people reported the Sonagachi page to Facebook, we estimate several hundred or more, based on blog traffic, links, Twitter, and Facebook conversations.
None of that made a difference. Only when Facebook faced bad PR and backlash from real names in social media did they follow their own terms of service and remove the Sonagachi Facebook page.
By the time Liz stumbled across the page, it had been live for more than a year, and pimps advertising prostitutes had been active on it just a day before. And unfortunately, there are countless more pages like it. As readers and commenters began tracking down and reporting other violation-ridden Sonagachi Facebook pages (NSFW!), the story began to repeat itself. Facebook's Viki sent more messages to every person who reported any of the pages, all of which contain violations similar to the original's. Viki's message remained the same: these pages could not be determined to violate Facebook's terms of service.
Does Facebook need every single problematic image or site to be broadcast by major social media players for them to take action?
Must their reputation be at stake for them to remove pages that anyone can see are in clear violation of the community standards?
Have they no protocols in place to have a human being check a page that has been reported hundreds of times within a few hours? If their bots can flag photos of breast cancer survivors or breastfeeding mothers as potential pornography, why can't they identify actual pornography?
In a tweet, the Women's Media Center expressed that yes, it will take time for Facebook to right the wrongs in its reporting system. The Sonagachi Facebook page debacle, though, demonstrates that there is still a very, very long way to go.
Update 6/6/13: We've received an email from Facebook's Public Policy Communications team. They state:
To be clear, this [page] was not removed as a result of the blog post, the content absolutely violated our polices however initial reports were not actioned correctly because of a technical error that has been fixed.
This post was co-written by Slay Belle and Liz Boltz Ranfeld. It is being cross-posted to Jezebel’s Powder Room blog and Liz’s personal blog.