The latest ‘enforcement report’ has been published by social media giant Facebook, which highlights the number of posts and accounts the platform acted against during the period between October 2018 and March 2019.

You might recall that the news recently mentioned a new regulator in the pipeline that will monitor social media giants such as Facebook, Twitter and Instagram, along with websites such as YouTube. The call for this new regulator came as rising numbers of fake news stories and pieces of inappropriate and illegal content were reported to be circulating on social media and streaming sites.

Facebook has reported that during the six-month period mentioned, a record number of over three billion fake accounts were removed from the platform, along with over seven million ‘hate speech’ posts.

It has been suggested that the sheer size of Facebook makes it impossible to monitor all posts and content; however, Chief Executive Mark Zuckerberg has responded to calls for the platform to be broken up.

Last week, he said: “I don’t think that the remedy of breaking up the company is going to address the problem. The success of the company has allowed us to fund these efforts at a massive level. I think the amount of our budget that goes toward our safety systems, I believe is greater than Twitter’s whole revenue this year.”

Facebook has also announced that it will now report on the number of posts removed for selling ‘regulated goods’ – items such as guns and drugs. During the six-month period, over one million posts had action taken against them for selling guns.

The platform has also estimated how often inappropriate and disturbing content – including violence, terrorist propaganda and child sex abuse imagery – was seen by users. The report stated that for every 10,000 pieces of content viewed, fewer than 14 contained nudity, approximately 25 contained graphic or violent material, and fewer than 3 contained child abuse imagery or terrorist propaganda. Facebook also stated that around 5% of the platform’s monthly active users were fake accounts.

With plans for a social media regulator still being discussed, authorities and the public are encouraging social media giants such as Facebook not to wait for a new regulator to be put in place, but to step up control over their platforms now. Parents whose children have witnessed inappropriate content or been targeted by dangerous individuals online have said that social media networks need to do more to protect children while they are online.