Britain’s ‘duty of care’ regulator will hurt us, social networks warn

Google, Facebook and Twitter said the UK’s new rules were a potential threat

Three of Silicon Valley’s biggest companies have warned investors that Britain’s plan for a new social media regulator could hit profits because of extra fines and compliance costs.

Google, Facebook and Twitter have all identified the UK’s new rules as a potential threat in their 2019 annual reports, citing new penalties for failing to remove harmful content as a risk to profitability.

The plans, announced last year after a sustained campaign by The Telegraph, would give social media companies a legal duty of care over their users and hand fresh powers to a regulator to make executives personally liable if they fail to take “reasonable” steps to protect users from harm.

Facebook’s annual report said: “Such legislation has in the past, and may in the future, require us to change our products or business practices, increase our compliance costs, or otherwise impact our operations or our ability to provide services in certain geographies.”

It also named authorities in Australia, France and Singapore, as well as the UK, as drafting “legislation imposing penalties for failure to remove content or follow certain processes”.

Twitter and Google, the owner of YouTube, used similar language, saying British laws could “adversely affect” their business and financial results.

Google also cited new German laws, which it said could expose it to “significant” fines in future.

The UK’s plans are part of a wave of increased scrutiny and regulation across the world which has forced technology companies to spend far more on content moderation than ever before.

They come after intense criticism of the use of social media sites to spread disinformation and extremist or paedophile material.

The use of Facebook to live-stream an attack that killed 51 worshippers at two mosques in Christchurch, New Zealand, sparked an outcry last March.

Last month, about $40bn (£30.5bn) was wiped from the market value of Facebook after the company revealed that its spending on day-to-day operations had almost doubled year on year.

Stricter rules could force technology firms to hire and train extra moderators and legal experts, build new technology to detect and remove harmful content, and create new internal software for moderators to use.

When Germany passed a similar law in 2017, both Facebook and Twitter had to increase their staffing levels to manage it. Facebook was later fined €2m (£1.7m) by the German justice ministry for making its complaints system too hard to use.

Kate Klonick, a law professor at St John’s University in New York City who has spent time embedded with Facebook’s moderators, said the cost of complying with UK law alone would be “not insignificant” because the country is a “high value customer”.

The requirement to comply with other laws in different countries is likely to further add to the cost.

“It’s making the companies nervous,” she said. “I think they’re looking down the road and saying, ‘if we give in on this, then we have no justification for not giving in when Pakistan does the exact same thing’.”

“When you put monetary policies in place, you could incentivise a company to over-censor by penalising them for not taking things down; or, if they get penalised for taking something down, they'll over-correct and leave up a lot of harmful content.”

She said that severe fines levied frequently could become a de facto tax, and that high enough penalties might lead tech firms to stop operating in a country altogether, although she added that this would be unlikely for a market of the UK’s size and wealth.

Jennifer Grygiel, a communications professor at Syracuse University in New York state who studies social networks, said that US tech giants had been able to "skirt a large portion of their operating budget" by failing to fully moderate their services and that making them fully safe could require huge investments.

They said: "Honestly, they probably aren't disclosing enough risk. Manpower is one of the largest pieces of overhead for any type of company... I still don't think that we've seen adequate numbers on content moderation.

"When you look at the headcount, it's not surprising that the UK is following countries like Germany that know that they need to essentially legislate the essentially accountability into these corporations.

"They are not properly motivated to provide the very infrastructure that is needed to make sure that these are safe communication platforms and ethical ones. There should be more public outcry about this."

Asked for Facebook’s view on the Government’s plans, Rebecca Stimson, its UK head of public policy, said: “Facebook has long called for new regulations to set high standards across the internet. New rules are needed so that we have a more common approach across platforms and private companies aren’t making so many important decisions alone.”

She added that Facebook takes online safety “extremely seriously”, with 35,000 of its roughly 45,000 workers now dedicated to safety, and said the company is looking forward to “carrying on the discussion” with Parliament and the Government.

Google did not respond to questions about its stance, while Twitter deferred to the Internet Association, a trade group which represents all three companies.

Daniel Dyball, the association’s UK executive director, said: “Internet companies are committed to working with government to achieve our shared goals of keeping people safe online and ensuring that the internet continues to deliver benefits to the economy and society.

“We will continue our constructive engagement on the issues of concern that are still under review – for example the scope of regulation, treatment of legal but potentially harmful content, and enforcement powers – and look forward to the full consultation response in the spring.”
