Facebook, which has been taking steps toward greater transparency for some time, has released its latest Community Standards Enforcement Report. The report shows that the company's algorithms and moderators manage to catch many violations, such as hate speech, bullying, and harassment, before users report them.
According to the report, Facebook is able to enforce its standards proactively in many categories, automatically detecting 90 to 99 percent of violations before they are reported. When it comes to bullying, however, Facebook proactively caught only 14 percent of the 2.6 million reported harassment incidents. For hate speech, the company flagged 65.4 percent of the 4 million pieces of reported content.
Fake accounts stand out as the most frequently violated community standard: in the first quarter of this year, Facebook detected and deleted 2.19 billion fake accounts. The company also removed around 1.76 billion pieces of spam content in the same quarter.
Turning to child exploitation and violence, the company removed 5.4 million pieces of content involving child nudity or sexual exploitation, along with 33.6 million pieces of violent content.
Alongside the report, Facebook has published the contents of a panel discussion on its moderation practices held at Yale Law School. The company, which cites the proposals presented at the panel, may implement these suggestions in the future. Recommendations include introducing new metrics to support enforcement of community standards, making Facebook users more aware of policy changes, and identifying where users disagree with Facebook's decisions.