Brexit could make it easier to ensure online media companies take more legal responsibility for curbing “persistent, vile and shocking abuse” suffered by politicians and public figures, the official ethics watchdog has said.
The Committee on Standards in Public Life urged ministers to legislate to shift liability for illegal content on to social media and other internet companies to tackle an “intensely hostile online environment”.
The committee said Facebook, Twitter and Google “are not simply platforms for the content that others post” because they play a role in shaping what users see, and so “must take more responsibility for illegal material”.
Facebook and Google take more than £6bn a year of the £10bn spent on digital advertising in the UK as well as more than 80 per cent of all growth.
This has led news publishers to complain that the two companies provide unfair competition by profiting from the distribution of news content without paying for it or taking any legal responsibility for it.
Earlier this year Press Gazette launched its Duopoly campaign to highlight concerns that the giants are destroying journalism by squeezing news publishers out of business.
The committee’s recommendation comes as Facebook has said it will no longer record its non-US revenue through Ireland, in a major change to where it pays tax.
From next year taxes will be paid in the country where advertising profits are earned, it said.
The committee said the companies are not currently liable “largely” because of a European Union directive that treats them as “hosts” of online content, but that Theresa May’s commitment to leaving the single market means the Government can introduce new laws to make them responsible.
The report on intimidation in public life, commissioned by the Prime Minister, said social media was “the most significant factor” driving harassment, abuse and intimidation of 2017 General Election candidates, which included threats of violence and sexual violence, as well as damage to property.
“Some have felt the need to disengage entirely from social media because of the abuse they face, and it has put off others who may wish to stand for public office,” the report said.
“Not enough has been done. The committee is deeply concerned about the limited engagement of the social media companies in tackling these issues.”
Committee chairman Lord Bew said the “increasing scale and intensity of this issue demands a serious response”.
“We are not alone in believing that more must be done to combat online behaviour in particular and we have been persuaded that the time has come for the Government to legislate to shift the liability for illegal content online towards social media companies,” he said.
The committee was also “deeply concerned” about the failure of Facebook, Twitter and Google to collect data on their processes for reporting and taking down illegal content.
“Their lack of transparency is part of the problem,” the report said.
“None of these companies would tell us if they collect this data, and they do not set targets for the time taken for reported content to be taken off the platform. This seems extraordinary when their business is data-driven in all other aspects.”
The committee urged online companies to put in place automated techniques to identify intimidatory content, while the Government should set up a “trusted flagger” social media reporting team during general elections so abuse and intimidation could be dealt with more quickly.
Lord Bew said: “This level of vile and threatening behaviour, albeit by a minority of people, against those standing for public office is unacceptable in a healthy democracy.
“We cannot get to a point where people are put off standing, retreat from debate, and even fear for their lives as a result of their engagement in politics.
“This is not about protecting elites or stifling debate, it is about ensuring we have a vigorous democracy in which participants engage in a responsible way which recognises others’ rights to participate and to hold different points of view.”
The report also called for “greater energy and action” from political parties, Parliament, the police and traditional media, as well as MPs and candidates themselves, warning: “This is all the more important in the light of recent allegations of sexual harassment and bullying in Parliament which will have shaken public confidence in politicians.”
Among other recommendations, ministers should consider introducing a new offence in electoral law of intimidating parliamentary candidates and party campaigners.
And police forces should be given better guidance and training on the context in which MPs and candidates operate and the nature of social media technologies.
The committee said leaders of political parties should “always call out intimidatory behaviour” even if it is perpetrated by those on the fringes of their party.
Parties must also produce joint codes of conduct on abuse and intimidation during election campaigns by December 2018, which will be jointly enforced.
The committee said it was also concerned about the impact on the diversity of a representative democracy and said parties have an “important responsibility” to support female, black and minority ethnic and LGBT candidates.
Twitter’s UK head of public policy Nick Pickles said: “Abuse and harassment – no matter the victim – have no place on Twitter.
“As the report notes, our team uses technology to proactively find abusive content and provides users with a single report that they can email to the police.”
Action was being taken on ten times the number of accounts each day compared with last year, he said, with restrictions or suspensions placed on “thousands more” abusive accounts.
“We remain committed to playing our part in the electoral process and working with political parties to support candidates, as well as working with the police and parliamentary authorities to facilitate their vital work,” Pickles added.