Afternoon Voice

Facebook adds more human moderators to keep your posts in check

AP Photo/dapd, Joerg Koch, File

In its latest blog post, Facebook revealed that it has added more human moderators to review content for hate speech.

Facebook is doubling its team of content reviewers to 7,500 to keep up with the thousands of posts shared on the platform each minute, Engadget reported.

These content reviewers undergo weeks of intensive training before they begin evaluating posts. Facebook also uses artificial intelligence to detect and remove unwanted content on its platform.