Publication Date: July 2018
Research and Editorial Team: The Digital, Culture, Media and Sport Committee of the House of Commons

The term "fake news" has been widely used since 2016, but it has no agreed definition. It has been used for fabricated, manipulated, imposter, or misleading content, for “false context or connection” (e.g. article headlines that do not reflect the content), for satire or parody, and even (notably by Donald Trump) for content that is simply disliked or disagreed with. The Committee therefore recommends that the British government reject the term “fake news” and instead adopt agreed definitions of the words “misinformation” and “disinformation”.

Content standards, including rules relating to accuracy and impartiality, should be set for online content, based on those already existing for television and radio. More research on misinformation and disinformation is needed. Fact-checking tools, based on shared standards, should be developed to allow people to see the level of verification of a website at a glance. Extensive cooperation is needed to address the problem.

Online platforms claim to be neutral, but they are not: while they may not produce content themselves, by designing the algorithm they decide what appears in our news feeds. The binary choice between platform and publisher does not work well for tech companies like Facebook, so the Committee recommends that a new category be created. Their business model also raises questions under competition law.

Facebook should take responsibility for the way its platform is used. “Free Basics”, a partnership between Facebook and other companies granting free access to certain internet services in developing countries, has contributed to inciting hatred against the Rohingya Muslim minority in Burma. In Germany, since January 2018, the Network Enforcement Act (NetzDG) has obliged tech companies to remove hate speech within 24 hours: the provision has been criticised, but as a result one in six Facebook moderators now works in Germany. The Committee criticises Facebook for its lack of transparency: the algorithm should be audited and the number of fake accounts should be determined. It should also be easier for users to control and protect their data.

Companies like Cambridge Analytica and SCL Group have misused data from Facebook, assessing personality from Facebook likes and surveys for political microtargeting. While they have been suspended from Facebook and gone into administration, other companies are carrying out very similar work, sometimes with complex ownership structures and exactly the same individuals. SCL Elections has worked to influence elections all over the world (including Argentina, Malta, Nigeria, St Kitts and Nevis, and Trinidad and Tobago) using unethical, if not illegal, methods, while at the same time working for the UK, US, and other allied governments.

Electoral law should be updated to account for online campaigning. The Electoral Commission recommends that online campaign material should legally be required to carry a digital imprint identifying the source, as is already required for physical material. The Committee recommends that the rules also cover online adverts that use political terminology even if not sponsored by a specific political party. Microtargeted messages are not publicly viewable (they have been called "dark ads"). The Committee recommends the creation of a public register of political advertising.

The British government has publicly accused Russia of meddling in elections and planting ‘fake news’ in an attempt to ‘weaponise information’ and sow discord in the West. For example, Russia Today and Sputnik News (both funded by the Russian government) supported the Brexit campaign, and the Saint Petersburg-based Internet Research Agency (IRA) spent money on adverts in the UK. Disinformation can be considered a form of unconventional warfare. According to the Committee, Facebook has not done enough research on the problem: “there is a disconnect between the Government’s expressed concerns about foreign interference in elections, and tech companies’ intractability in recognising the issue”.

Arron Banks, co-founder and donor of the Leave.EU campaign, had meetings with Russian officials in the run-up to the EU Referendum. He, together with Andy Wigmore (director of communications for the Leave.EU campaign), is accused of having misled the Committee about the number of meetings that took place with the Russian Embassy and of having walked out of evidence sessions to avoid scrutiny of the content of those discussions. Banks made the largest donation to a political campaign in British history, but he could not give a clear answer about where the money for his donations came from. There is evidence that he discussed gold and diamond deals with Russian Embassy contacts.

There have also been allegations of Russian interference in the 2017 referendum on the independence of Catalonia, aimed at provoking conflict and spreading propaganda beneficial to those seeking independence from Spain.

Digital literacy should become part of the school curriculum. There should also be a public awareness initiative covering social media, the protection of personal data, and online political campaigning.

Tags: Fake news and disinformation, United Kingdom, Russia, Malta, Spain, Social media, Privacy, Big data, Media funding, Asia, Americas

The content of this article can be used according to the terms of Creative Commons: Attribution-NonCommercial 4.0 International (CC BY-NC 4.0). To do so, use the wording "this article was originally published on the Resource Centre on Media Freedom in Europe", including a direct active link to the original article page.