Facebook Acknowledges Hate Speech Moderation Technology Lagging
Facebook's content moderation technology for hate speech lags behind its systems for flagging adult and violent content, the company said Tuesday. In Q1, Facebook took down 21 million pieces of adult content, took down or applied warnings to about 3.5 million pieces of violent content, and removed 2.5 million pieces of hate speech. Only about 38 percent of the hate speech was flagged by Facebook's technology before users reported it, the platform said, compared with about 96 percent of adult content and 86 percent of violent content.