
Facebook explains how it reviews content after The New York Times's claims of ‘errors and biases’


After The New York Times claimed in a report that some of Facebook's moderation rulebooks contain ‘numerous gaps, biases and outright errors’, the social networking giant has come to its own defense in a blog post, clarifying that its methods and policies for reviewing and moderating content on the site are not ‘secret’ and are ‘carefully considered’.

The NYT claims to have accessed around 1,400 pages of the rulebooks that moderators use to monitor posts on the platform and to tackle issues such as extremism and hate speech in different countries. According to the article, highly complex issues are distilled into simple yes-or-no rules, which ultimately leads to errors in how content is moderated.

The article also cites examples. In India, moderators were reportedly told to take down comments critical of religion in the country. In Myanmar, a prominent extremist group was allowed to remain on the platform for months because of a paperwork error, one that Facebook has admitted.

The NYT article read:

“The Facebook employees who meet to set the guidelines, mostly young engineers and lawyers, try to distill highly complex issues into simple yes-or-no rules.”

“Those moderators, at times relying on Google Translate, have mere seconds to recall countless rules and apply them to the hundreds of posts that dash across their screens each day,” it further added.

The article says that the rules governing what the platform's 2 billion users are allowed to say are discussed ‘over breakfast’ every Tuesday morning and are then circulated to over 7,500 moderators globally.

In response, Facebook said in its blog post that the gathering ‘over breakfast’ is in fact a global forum attended by experts from around the world with deep knowledge of relevant laws, online safety, counter-terrorism, operations, public policy, communications, product, and diversity. It added that the meeting also includes human rights experts.

Facebook also said that it has around 15,000 content reviewers around the world, who are provided with training and support resources rather than relying on Google Translate. According to the company, it reviews content in more than 50 languages.

Source: New York Times

Author

Daniel Jack

For Daniel, journalism is a way of life. He lives and breathes art and anything even remotely related to it. Politics, cinema, books, music, and fashion are all part of his lifestyle.
