This article is a follow-up to EYE online and is inspired by the activity "Dealing with news: How do we know what is true in this chaotic mediascape?". EYE online aimed to compensate for the postponement of the European Youth Event 2020 by proposing online activities to young Europeans in the framework of the EuropeansAgainstCovid19 EU campaign. All the activities of EYE online can be watched here.
Fifty years ago this year, Aleksandr Solzhenitsyn received the Nobel Prize for Literature. In his acceptance speech, delivered by letter from the stultified USSR, he railed against omnipresent falsehoods and the ‘fog of lies’ that led, he argued, to an inertia in which societies could no longer discern truth from fiction.
A hundred years ago next year, CP Scott, the legendary editor of the renowned British newspaper The Guardian, set out in an essay the notion that can be considered the defining core of journalism: ‘comment is free, but facts are sacred.’
Separating fact from falsehood has, for all people at all times, been a vital part of life. Neither community nor decision-making can exist without a consensus on the nature and location of truth. Online networks, however, permit the creation of a flood of false information, overwhelming our ability to find the truth, maintain accountability and safeguard our democratic societies. The 2016 American elections brought this home, and while they were far from the first to be tossed on a sea of dubious online ‘facts’ (the LSE academic Peter Pomerantsev argues that the current situation has spread from Russia since the 1990s), urgent action must be taken if democracy is to be upheld. This has been particularly the case during the Covid-19 crisis.
Discussion at the EYE: What can be done?
When the European Youth Event was moved online for 2020 due to the coronavirus restrictions, disinformation was naturally included among the subjects to be discussed. On the panel for the discussion, which took place on April 16, were Juliane von Reppert-Bismarck, the founder of the non-profit organisation Lie Detectors; Jesse Evers, of the Dutch anti-disinformation organisation DROG; and Irena Joveva, previously a journalist in Slovenia for eight years and now an MEP serving on the Committee on Culture and Education.
The discussion focused in large part on the role of media literacy, key in combating the spread of online disinformation. Juliane von Reppert-Bismarck, whose organisation specialises in teaching children, in particular, to evaluate news sources, sums up the benefits of the approach very pithily:
it works by ‘prebunking, rather than debunking.’
Juliane von Reppert-Bismarck
People resist pressure to change their mind, meaning that it is hard, even with evidence, to persuade someone to accept that a piece of disinformation they had believed in is in fact wrong. Nobody wants to admit to being duped. Teaching people how to evaluate a piece of information before they are exposed to it, by contrast, works well. Jesse Evers of DROG agrees with her: the questions people need to be taught to ask, he suggests, concern ‘the intent of the message’: ‘Is the intent to instill fear, or to polarise? Where is this source coming from? Can I clarify, can I find other sources?’
If young people can be taught to carefully consider the identity and intentions of news stories’ writers, they are less likely to accept and spread further fake information. This can have an impact beyond the individuals taught by media literacy-building organisations like Lie Detectors. The word ‘virality’, when speaking of online spread, is aptly chosen: just as individuals choosing to stay at home can slow the spread of Covid-19 by cutting contacts within the network of infection, so individuals questioning news stories and deciding not to pass them on can reduce the spread of fake news. It’s no surprise that von Reppert-Bismarck is calling for media literacy training to be included in school curricula.
Much can be done at the policy level, but there are difficult questions to navigate in order to do so. Where does the line lie between under-informed but legitimate expression of opinion, and harmful disinformation? Whose job should it be to decide? If it’s an EU body, as both Juliane von Reppert-Bismarck and Jesse Evers note during the discussion, there will be painful questions of personal liberty within the EU, and knock-on effects outside. Liberties are a key element of the discussion, but Evers suggests that disinformation-producers will ‘have won [the conversation] if it becomes a discussion on the freedom of speech’. ‘In the end,’ he argues, ‘disinformation is not, or is not only, a discussion on the freedom of speech’: speech rights cannot be permitted to give carte blanche to those hoping to dishonestly sway opinion. Von Reppert-Bismarck observes that many outside the EU worry that the EU will choose to ‘lead the way’ by creating legislation to crack down on disinformation efforts, in turn justifying repression of dissenting voices in less liberal countries.
One possible alternative, she suggests, could be the use of antitrust law against the large online platforms. The online giants would be considered public utilities, with possible natural monopolies to be either broken up for the public good, or regulated as more than simple companies. Facebook is already pre-empting this to some extent with the recent creation (still awaiting implementation) of an Oversight Board intended to give independent judgments on content decisions. Among its members are the former prime minister of Denmark Helle Thorning-Schmidt and Nobel peace laureate Tawakkol Karman; but as Reuters Institute chair Alan Rusbridger, another member of the board, noted in a blog post, it is at risk of being perceived as a fig leaf. Given the continued absence of extensive legislation on the thorny issue, however, and with few people keen for the social media platforms’ power to be further enhanced by the allocation to them of a formal role of freedom of speech adjudicator, an independent board is an important intermediary step.
What is the EU currently doing, and what is it planning to do?
At the start of June the European Digital Media Observatory (EDMO), a pan-European effort led by the European University Institute in Florence, was opened; the rapid turnaround following a call for proposals in October 2019 shows how seriously the matter is taken. The initiative will create an online platform supporting a multidisciplinary group of researchers, fact-checkers and other contributors to the effort against online disinformation, carrying out investigations into the methods used by disinformation-spreading actors in order to understand how to reduce their effects. This is a major step: the EDMO will be a central hub in a new field, and will soon be joined by national and regional clusters of researchers, funded by a EUR 9 million grant to be launched later this year.
The creation of the EDMO continues a wide-ranging anti-disinformation approach developed following lengthy discussions in late 2018. By June 2019 a world-leading code of practice had been agreed, now adhered to by a long list of technology companies including Google, Facebook, Twitter and TikTok. The result has been billions of actions taken against disinformation and policy breaches. There is still more to be done. One key step, proposed by academics such as Pomerantsev and envisaged by the EU, will be to allow all users to see who, or what, is sharing the things they see online: whether they are a bot rather than a real person, for instance, and whether they are affiliated with a particular political party, government or PR organisation. The lesson of media literacy classes (the quick assessment of online writers’ trustworthiness and intentions) will then be far easier to act upon.
In the meantime, as the panellists at the EYE discussion were quick to emphasise, individuals can check things that seem suspicious thanks to a host of easily searchable fact-checking organisations. Most people already find themselves having to do this often: the most recent Eurobarometer, the annual EU-wide poll of citizens’ experiences and attitudes, found that 71% of respondents encountered news they recognised as fake at least several times a month. They can also, as UNESCO suggests, join the fact-checking movement themselves.
This topic was discussed in the EYE Online session 'Dealing with news: How do we know what is true in this chaotic mediascape?'. Hosted by Niko Efstathiou, the session featured three speakers: Irena Joveva, of the European Parliament; Jesse Evers of DROG; and Juliane von Reppert-Bismarck, the founder of Lie Detectors.
The European Parliament's support for the production of this publication does not constitute an endorsement of the contents, which reflect the views only of the authors, and the Parliament cannot be held responsible for any use which may be made of the information contained therein.