UK gov threatens Facebook with fines if it doesn't take down extremist material


The UK government has threatened to fine social media companies if they do not remove extremist material quickly once it is reported to them.

A recent UK government report by the Home Affairs Select Committee has criticised the top social media companies, such as Twitter, Facebook and Google, for not responding quickly enough to requests to take down extremist or terrorist material, hate speech, and child sexual abuse images.

The report suggests that social media companies should be forced to contribute towards the cost of policing content on their sites, and that it is unacceptable for the companies to expect users to report content at zero cost to the firms. It also recommended meaningful fines when content is reported to them but not removed within a strict timeframe. This follows a similar proposal in Germany, which could see companies fined up to €52 million.

The committee's chairwoman, Yvette Cooper, said: “Social media companies’ failure to deal with illegal and dangerous material online is a disgrace.” The committee cited examples of antisemitic and anti-Islamic groups on Facebook that were not removed because Facebook said they did not constitute hate crime, and antisemitic videos on YouTube that were allowed to remain because they did not break YouTube’s T&Cs.

The UK children's charity the NSPCC has also called for fines on social media companies that fail to protect children, saying their child protection measures are not up to scratch.

The UK government seems to forget that these social networks do not just host content written by and for a British audience; in the USA, for example, you can legally say plenty of things that would be illegal to post in Great Britain. The British police already have the power to block access to content that is illegal in the UK, so they could use that as a method to cut off access to material on social networking sites that violates UK law but does not necessarily go against Facebook's T&Cs.

If there is a threat of fines for not removing content quickly enough, the social media companies will simply automate removal, so that content is taken down as soon as it is reported, to avoid the threat of fines. This will result in a lot of legitimate content being removed, because no one will have checked its legitimacy before the takedown.
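To make that incentive concrete, here is a minimal, purely hypothetical Python sketch; the deadline, function name and flags are all invented for illustration, not taken from any real platform. It shows why a strict statutory deadline pushes a platform towards removing reported content sight unseen whenever human review cannot be guaranteed in time:

```python
from datetime import datetime, timedelta

# Assumed statutory deadline: remove reported content within 24 hours or be fined.
REMOVAL_DEADLINE = timedelta(hours=24)

def handle_report(post_id, reported_at=None, reviewer_available=False):
    """Decide what to do with a reported post under a fines-for-delay regime."""
    reported_at = reported_at or datetime.utcnow()
    if reviewer_available:
        # Ideal path: a human moderator judges the post before the deadline.
        return ("queued_for_review", reported_at + REMOVAL_DEADLINE)
    # Fine-avoidance path: take the post down immediately, unreviewed.
    # Legitimate content goes too, because nothing has checked its legitimacy.
    return ("removed_unreviewed", reported_at)
```

At the volumes these sites handle, `reviewer_available` would be false for most reports, so the second branch becomes the default behaviour.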

The UK government's argument is that the social media sites make a lot of money and so can afford to do more to police content. But any new law would surely affect not only the big players like Facebook and Twitter but also the smaller social networks, which don't have the budgets of the large multinationals and couldn't afford the extra staff needed to moderate the vast amount of user-generated content posted online every day. There are also social networks that operate entirely from overseas and so sit outside UK jurisdiction; they would not have to comply with any timescale rules that might be brought in.

The comments from the NSPCC also seem unworkable. Say, for example, the NSPCC wanted fines for social media sites that allow over-18s to contact a child. There is no way the sites can verify the identity of the hundreds of millions of users active on them, so the social media sites would simply change their terms to say you have to be over 18 to join.

While there is no doubt that there are problems with some of the content posted to social media, I think having users report it and a moderator then check it is the only way it can be done while keeping the platform usable.