Facebook admits its mistakes: "We get things wrong with content moderation"

The company's head of Global Policy Management explains in The Guardian that striking the right balance on the platform is complicated

This week, the British newspaper The Guardian revealed how Facebook handles sensitive content and how its moderators act according to the guidelines set by the Menlo Park, California, company.

In response to the article's impact, Monika Bickert, Facebook's Head of Global Policy Management, has replied in the same newspaper, admitting the company's mistakes: "but we take our safety role seriously," she explains.

"On a typical day, more than a billion people use Facebook, they share content in dozens of languages: all kinds of content, from photos and status updates to live videos, says Bickert.

A volume of content for which, as The Guardian published last Sunday, moderators often have just 10 seconds to decide whether a piece of content is objectionable. The leaked documents revealed that last summer moderators handled more than 4,500 self-harm reports in a two-week period, while this year's figures cited 5,400 in another two-week span.

The company led by Mark Zuckerberg announced in April that it would hire 3,000 new reviewers to deal with this type of content, following incidents such as murders, torture, and sexual assaults that were broadcast on the social network and watched by hundreds of people.

Bickert acknowledges: "We get things wrong, and we are constantly working to make sure these cases happen less and less often. We put a lot of effort into trying to find the right answers, even when there aren't any."
