The Digital, Culture, Media and Sport Committee has published its final report on Disinformation and 'fake news'. The report calls for:
Compulsory Code of Ethics for tech companies overseen by independent regulator
Regulator given powers to launch legal action against companies breaching code
Reform of current electoral communications laws and rules on overseas involvement in UK elections
Social media companies obliged to take down known sources of harmful content, including proven sources of disinformation
Damian Collins MP, Chair of the DCMS Committee said:
"Our inquiry over the last year has identified three big threats to our society. The
challenge for the year ahead is to start to fix them; we cannot delay any longer.
"Democracy is at risk from the malicious and relentless targeting of citizens with disinformation and personalised 'dark adverts' from
unidentifiable sources, delivered through the major social media platforms we use every day. Much of this is directed from agencies working in foreign countries, including Russia.
"The big tech companies are failing in the
duty of care they owe to their users to act against harmful content, and to respect their data privacy rights.
"Companies like Facebook exercise massive market power which enables them to make money by bullying the smaller
technology companies and developers who rely on this platform to reach their customers.
"These are issues that the major tech companies are well aware of, yet continually fail to address. The guiding principle of the 'move
fast and break things' culture often seems to be that it is better to apologise than ask permission.
"We need a radical shift in the balance of power between the platforms and the people. The age of inadequate self-regulation
must come to an end. The rights of the citizen need to be established in statute, by requiring the tech companies to adhere to a code of conduct written into law by Parliament, and overseen by an independent regulator.
"We also have to accept that our electoral regulations are hopelessly out of date for the internet age. We need reform so that the same principles of transparency of political communications apply online, just as they do in the real world. More needs to be
done to require major donors to clearly establish the source of their funds.
"Much of the evidence we have scrutinised during our inquiry has focused on the business practices of Facebook, before, during and after the Cambridge Analytica data breach scandal.
"We believe that in its evidence to the Committee, Facebook has often deliberately sought to frustrate our work, by giving incomplete, disingenuous and at times misleading answers to our questions.
"Even if Mark Zuckerberg doesn't believe he is accountable to the UK Parliament, he is accountable to the billions of Facebook users across the world. Evidence uncovered by my Committee shows he still has questions to
answer yet he's continued to duck them, refusing to respond to our invitations directly or sending representatives who don't have the right information. Mark Zuckerberg continually fails to show the levels of leadership and personal responsibility that
should be expected from someone who sits at the top of one of the world's biggest companies.
"We also repeat our call to the Government to make a statement about how many investigations are currently being carried out into
Russian interference in UK politics. We want to find out what the impact of disinformation and voter manipulation was on past elections, including the UK Referendum in 2016, and are calling on the Government to launch an independent investigation."
This Final Report on Disinformation and 'Fake News' repeats a number of recommendations from the interim report published last summer. The Committee calls for the Government
to reconsider a number of recommendations to which it did not respond and to include concrete proposals for action in its forthcoming White Paper on online harms.
Independent regulation of social media companies
The Report repeats a recommendation from the Interim Report for clear legal liabilities to be established for tech companies to act against harmful or illegal content on their sites, and calls for a compulsory Code of Ethics defining what constitutes
harmful content. An independent regulator should be responsible for monitoring tech companies, backed by statutory powers to launch legal action against companies in breach of the code.
Companies failing obligations on harmful or
illegal content would face hefty fines. MPs conclude: "Social media companies cannot hide behind the claim of being merely a 'platform' and maintain that they have no responsibility themselves in regulating the content of their sites."
The Report's recommendation chimes with recent statements by Ministers indicating the Government is prepared to regulate social media companies following the death of teenager Molly Russell. The Committee hopes to see firm
recommendations for legislation in the White Paper to create a regulatory system for online content that is as effective as that for offline content.
It repeats its recommendation for new independent regulation to be funded by a
levy on tech companies operating in the UK.
Data use and data targeting
The Report highlights Facebook documents obtained by the Committee and published in December 2018 relating to a Californian
court case brought by app developer Six4Three. Through scrutiny of internal Facebook emails between 2011 and 2015, the Report finds evidence to indicate that the company was willing to: override its users' privacy settings in order to transfer data to
some app developers; charge some developers high prices in advertising in exchange for data; and starve some developers, such as Six4Three, of that data, contributing to the loss of their businesses. MPs conclude: "It is evident that
Facebook intentionally and knowingly violated both data privacy and anti-competition laws."
It recommends that the ICO carries out a detailed investigation into the practices of the Facebook platform, its use of users' and
users' friends' data, and the use of 'reciprocity' in the sharing of data. The CMA (Competition and Markets Authority) should conduct a comprehensive audit of the advertising market on social media and investigate whether Facebook has been involved in anti-competitive practices.
MPs note that Facebook, in particular, is unwilling to be accountable to regulators around the world: "By choosing not to appear before the Committee and by choosing not to respond personally to
any of our invitations, Mark Zuckerberg has shown contempt towards both our Committee and the 'International Grand Committee', involving members from nine legislatures from around the world."