France

Forum to discuss combatting disinformation (FR0106)

Overview

At-a-Glance

Action Plan: France Action Plan 2021-2023

Action Plan Cycle: 2021

Status:

Institutions

Lead Institution: French Audiovisual Board (CSA)

Support Institution(s):

Policy Areas

Digital Governance, Disinformation/Misinformation, Public Participation

IRM Review

IRM Report: France Action Plan Review 2021-2023

Early Results: Pending IRM Review

Design

Verifiable: Yes

Relevant to OGP Values: No

Ambition (see definition): Low

Implementation

Completion: Pending IRM Review

Description

What is the public problem that the commitment will address? The opportunities that the digital information space offers to citizens in terms of expression, information and exchange of skills and knowledge come with risks, including the intentional or inadvertent spread of misinformation. Misinformation that is likely to disturb public order or elections represents, in particular, a critical social and democratic issue in that it can jeopardise the health and safety of citizens (e.g. COVID-19 misinformation) or influence how they choose to vote (e.g. misinformation disseminated with the intention of manipulating voters and influencing the outcome of an election). Economic misinformation can also affect the smooth functioning of society.

The presence of misinformation online is a major problem in itself; however, the risks it represents are likely to be significantly greater when it is disseminated widely and quickly (even if findings differ on its actual impact). The proliferation of misinformation has been made possible in particular by online platforms, especially social media sites, which allow any user to share and rapidly disseminate content to a potentially huge audience. Social media users come across disinformation and may even have the impression that this type of content flourishes particularly easily on such platforms. Nevertheless, whether such content is in fact overrepresented and more viral than other content, a question debated in the existing literature, should be explored in order to understand and objectively substantiate the causes of this observation and, where appropriate, its reality, taking into account differences in each platform’s model. Likewise, the factors leading to this phenomenon should be clearly identified and better understood, as a whole and in their diversity, by considering the role of different players (e.g. operators, audiences, influential users, economic and political stakeholders, etc.).

What is the commitment? With the adoption of Act 2018-1202 of 22 December 2018 on combatting the manipulation of information, major online platforms must cooperate in the fight against misinformation, which involves committing resources to this end. The Act gives the CSA authority to monitor compliance with this obligation, to issue recommendations to the platforms in question and to oversee the implementation of resources by ensuring their actual use and effectiveness. The CSA does not itself play a role in countering disinformation activities on online platforms. However, to assess the measures put in place, it must become familiar with, characterise and understand disinformation activities, particularly their virality. To that end, it has resources in its capacity as a regulator (assessment of the resources deployed, dialogue with and requests directed at platforms, internal research, and studies carried out by research institutions, all of which expand its knowledge of these phenomena). Given the complexity of the phenomenon, due in large part to the variety of factors, networks and stakeholders involved, as well as the fields, skills and resources required, it is unrealistic to try to solve the problem through a single player. This is compounded by the fact that the generic term “misinformation” encompasses a wide range of practices and content that does not fit neatly into one category and requires distinctions to be made. For example, the act of disseminating information of questionable accuracy is not to be lumped together with a deliberate attempt to mislead, even though these two realities may overlap. The virality of information can itself be very different in nature and scale. These objectives can only be achieved through an array of discussions and initiatives involving stakeholders as well as the academic community and civil society, on an international scale. To kick-start these joint discussions, the CSA has suggested hosting and taking part in a multilateral dialogue with civil society and research institutions, aimed at:

- identifying lines of inquiry, hypotheses and research topics that should be explored further
- identifying resources, barriers and constraints to factor into this research (such as accessing and using data in compliance with personal data protection rules) and means of action
- discussing how to define, characterise and objectify the phenomena of misinformation and disinformation, as well as their virality globally
- understanding the factors contributing to this virality by looking at the problem from an international perspective
- proposing solutions to combat the dissemination of misinformation/disinformation
- identifying and sharing tools, resources and techniques developed by participants or third parties to study and/or counter these phenomena, and that help civil society and citizens to address these issues and use these tools, where applicable

How will the commitment contribute to solving the public problem? Initially, in the second half of 2021, the CSA will undertake a preliminary phase in which the problem will be defined and developed by identifying the important questions, drawing on work already completed, a discussion paper and an annotated review of the existing literature. This document will provide a basis for discussion ahead of the launch of the dialogue at the first meeting of the forum. The CSA will be assisted in this endeavour by its Expert Committee on Online Disinformation. Concurrently, it will identify stakeholders from the academic community and civil society likely to be interested in participating in the forum. The second step will begin with the first meeting of the forum, set to take place in early 2022. The forum will hold regular meetings, perhaps on a twice-yearly basis, to take stock of the work and discussions completed by participants, to compare approaches, the issues identified and the findings, and to characterise the constraints faced in working towards solutions. In addition, these meetings do not preclude, but instead encourage, the development of partnerships between participants on the basis of the research avenues identified during the forum. In the long term, other players could join this forum.

Why is this commitment relevant to OGP values? This commitment is intended to develop transparency by helping the academic community and civil society to share the concerns and activities of their members. It is part of the CSA’s continuous efforts to improve the transparency of government data, such as reporting on its website the airtime broadcast media dedicate to pluralism, and to produce research and analyses that increase understanding of these phenomena and raise awareness among citizens. This commitment also has a citizen component in that it aims to improve the conditions of public debate in the digital information space by defining problems so as to better identify and potentially solve them.

IRM Midterm Status Summary

Action Plan Review


Commitment 56. Fight disinformation

● Verifiable: Yes

● Does it have an open government lens? No

● Potential for results: Unclear

