This week, Facebook released its Community Standards Enforcement Report, which details the content the platform removed and the actions it took against material users posted that broke Facebook's rules.
Facebook claims it has improved how quickly content related to drugs and firearms can be detected, which allows it to remove this type of material sooner.
Facebook also claimed it has improved its capacity to detect and remove content related to child exploitation and nudity on the platform. A statement from Facebook:
“In Q3 2019, we removed about 11.6 million pieces of content, up from Q1 2019 when we removed about 5.8 million. Over the last four quarters, we proactively detected over 99% of the content we remove for violating this policy.”
The platform also said it has been successful in removing content related to self-harm and suicide. Instagram has focused heavily on this issue over the past month, which makes sense given how popular Instagram is among younger users. That is why Instagram announced it will ban all images depicting self-harm on its platform.