Access Now, Civil Liberties Union for Europe (Liberties), and European Digital Rights (EDRi) published a joint report discussing the Report of the High Level Expert Group (HLEG) on Fake News and Online Disinformation and related policy documents.
In Section 1, they discuss the HLEG’s definition of disinformation, which includes “all forms of false, inaccurate, or misleading information designed, presented and promoted to intentionally cause public harm or for profit”, excluding online illegal content and satire/parody.
They criticise it for various reasons:
- it is too broad;
- false, inaccurate, and misleading are alternative criteria ("or"), but they should be conjunctive criteria ("and");
- offline illegal content should also be excluded;
- including profit-seeking intent as a criterion is risky;
- the necessity to explicitly exclude satire and parody shows the weakness of the definition.
They would thus rewrite it this way: "Disinformation includes false, inaccurate, and misleading information designed, presented and promoted intentionally in ways that cause demonstrable and significant public harm. It does not cover issues arising from the creation and dissemination online of illegal content".
They argue that measures should be based on more research and benchmarking.
In Section 2, they criticise some proposed solutions: fact-checking, artificial intelligence, EU vs Disinfo, and limiting anonymity.
Fact-checking is a questionable solution, especially if it is institutionalised and not independent. There are also dangers in involving online platforms, as they would become arbiters of truth.
Artificial intelligence and other emerging technologies are inadequate solutions and risk infringing human rights.
The EU vs Disinfo project, run by the East StratCom Task Force, collects examples of Russian disinformation but lacks a clear audience and purpose. It should be reconsidered.
Anonymity and pseudonymity are sometimes considered part of the problem, but addressing them must not undermine the right to privacy and freedom of expression.
In Section 3, they discuss meaningful solutions: addressing the online manipulation business model, preventing the misuse of personal data in elections, and promoting media and information literacy.
The online manipulation business model could be addressed by firmly observing data protection and privacy legislation, and thus reconfiguring the priorities of online companies.
In particular, the misuse of personal data in elections should be prevented, which would reduce the effectiveness of micro-targeting campaigns.
Media and information literacy is important and should address not only young people but also adults.