Facebook is responding to consumers who have asked the company to be more transparent about the inner workings of its products. The social media giant has long had “Community Standards” that govern which content is allowed on Facebook and what gets taken down, but it was not until three weeks ago that Facebook publicly released the internal enforcement guidelines related to those standards.
Today Facebook released a preliminary “Community Standards Enforcement Report” that explains which content is and is not allowed on the site. It breaks down the prohibited content into the following categories: Graphic Violence, Adult Nudity and Sexual Activity, Terrorist Propaganda, Hate Speech, Spam, and Fake Accounts.
The report explains that Facebook finds and takes action on more than 90% of violating content before users flag it themselves. Because the latest report is still preliminary, it does not include exact statistics, noting instead that the “metrics are in development”.
Mark Zuckerberg discussed much of what appears in this latest report at the F8 developers conference earlier this month. One of the key takeaways is that artificial intelligence will play a major role in cleaning up Facebook, but the technology is still in its early phases.
Read the full post here: Facebook Publishes Enforcement Numbers for the First Time