British politicians have called for new legislation that will force social media firms to fulfil a duty of care to their users.
Parliament’s Science and Technology Committee published a report Thursday saying social media companies must be legally required to protect users’ health and wellbeing, citing a lack of regulation around harmful content and online bullying.
Legislation recommended in the report included mandatory transparency reporting, robust processes for handling alerts about harmful content, and the use of technology such as artificial intelligence to identify harmful content and behaviors.
Lawmakers also called for a new regulator to scrutinize firms’ efforts to minimize harm caused by their platforms. The regulator would be able to take enforcement action against companies that failed to comply with the code of practice, which would be underpinned by a “strong sanctions regime.”
One of the recommended sanctions was making company directors financially liable for breaches of the legislation.
Evidence received by the committee linked social media exposure to health issues including damaged sleep patterns, poor body image, bullying, and grooming.
Norman Lamb, chair of the Science and Technology Committee, said in a press release: “The government must act to put an end to the current approach to regulation. We must see an independent, statutory regulator established as soon as possible, one which has the full support of the government to take strong and effective actions against companies who do not comply.”
Daniel Dyball, executive director of the Internet Association U.K. — of which Google and Facebook are members — told CNBC via email that the industry recognized more needed to be done to make services safe for all users.
“Our members employ content reviewers and deploy state-of-the-art technology to identify and remove inappropriate content, often before anyone else sees it. We’re continuing to work with the government on its forthcoming White Paper on internet safety, which we believe can be an important step forward,” he said.
A Twitter spokesperson said in a statement emailed to CNBC: “Improving the health of the public conversation online remains our number one priority. In 2018 alone, we introduced more than 70 changes to product, policy and processes to achieve a healthier, safer Twitter.”
A spokesperson for the U.K. Department of Culture, Media and Sport told CNBC via email: “We have heard calls for an internet regulator and to place a statutory ‘duty of care’ on platforms and are seriously considering all options. Social media companies clearly need to do more to ensure they are not promoting harmful content to vulnerable people.”
They added that the department would be drawing up a White Paper — a proposal for legislation — that would set out what social media firms’ responsibilities should be and what should happen if they are not met.
U.K. children’s charity the NSPCC is campaigning for tough regulation and wants to see companies made to publish reports every six months outlining the scale of risks on their sites. British newspaper The Telegraph is also campaigning to impose a duty of care on social media businesses.
The U.K.’s Royal Society for Public Health said in a statement on Thursday that it “fully supported calls for a statutory code of practice underpinned by a regulatory regime.”
However, Antony Walker, deputy CEO of trade association TechUK, told CNBC via email that it would be difficult to find a solution that addressed risks without being too restrictive.
“In its widest form a duty of care could require platforms to monitor all speech on their platforms in breach of other fundamental rights,” he said on Thursday. “Solutions must be found that are effective and proportionate, taking into account the very real differences between content that is illegal and content that is legal but might be harmful to some people in some contexts.”