How to End Infodemics: Working Group on Infodemics - Policy Framework

Summary

"These times show more than ever that information is power, and when lies spread faster than facts, all human endeavor is threatened. It's an existential moment for democracy and journalism. This is a concrete step forward to find systemic global solutions." - Maria Ressa

This report, published by the Forum on Information and Democracy, identifies structural challenges that contribute to the threat of misinformation and offers a wide-ranging set of recommendations for addressing a phenomenon that threatens democracies and human rights, including the right to health. These structural challenges relate to: the lack of transparency of online platforms; the way that content, including reliable news and information, is moderated on these platforms; and the virality of disinformation shared on private messaging apps.

As Maria Ressa, co-chair of the steering committee of the working group on infodemics, explains in the foreword, "Social media, once an enabler, is now the destroyer, building division - 'us against them' thinking - into the design of their platforms. It's not a coincidence that divisive leaders perform best on social media. Facebook is now the world's largest distributor of news. Except there's a catch: lies laced with anger and hate spread faster and further than the boring facts of news. They create a bandwagon effect of artificial consensus - for the lie. You repeat a lie a million times, it becomes a fact. Without facts, you can't have truth. Without truth, you can't have trust. Without these, democracy as we know it is dead."

Led by a coalition of media and civil society organisations, the Forum on Information and Democracy is a non-profit entity whose mandate is to implement democratic principles in the global information and communication space. The goal is to guarantee freedom of opinion and expression and to strengthen democracies, as outlined in the International Declaration on Information and Democracy. Launched in 2019, the Forum on Information and Democracy created a working group on infodemics in June 2020 to devise a "regulatory framework" to respond to the information chaos on online platforms and social media. This report is the result of five months of work by this working group on infodemics. It identifies four structural challenges and proposes concrete solutions for each of them. Each chapter explains the issue in detail, outlines steps that are required, and offers 250 recommendations for states, service providers and journalism communities. The issues fall into the following four categories:

  1. Platform transparency - Access to the qualitative and quantitative data of the leading digital platforms, as well as to their algorithms, is a prerequisite for evaluating them. Transparency requirements must, therefore, be imposed on such platforms in order to be able to determine whether they are respecting their responsibilities in these areas and, in general, with regard to their business models and algorithmic choices.
  2. Meta-regulation of content moderation - To safeguard the democratic value of online spaces for all, content moderation must be governed by a set of baseline principles (meta-regulation) that protect democratic values and uphold the human rights and dignity of all persons without discrimination. A human rights approach to content moderation would avoid haphazard decision-making by digital platforms and guard against arbitrary requests from states to remove content.
  3. Platform design and promotion of reliable news and information - The COVID-19 pandemic has demonstrated the need to reverse the escalation of sensational content and rumour by promoting reliable news and information in a structured manner. Mechanisms and policies for promoting authenticity, reliability, and findability of content are yet to be determined, based on established criteria.
  4. Mixed private and public spaces on closed messaging services - The virality of disinformation shared on messaging apps is reinforced by the use of groups that sometimes have thousands of members. It is important to define minimal rules for messaging apps that exploit the possibilities of the online public domain while complying with international standards on freedom of opinion and expression.

The following 12 main recommendations of the working group are highlighted in the report:

  • Public regulation is needed to impose transparency requirements on online service providers.
    • Transparency requirements should relate to all platforms' core functions in the public information ecosystem: content moderation, content ranking, content targeting, and social influence building.
    • Regulators in charge of enforcing transparency requirements should have strong democratic oversight and audit processes.
    • Sanctions for non-compliance could include large fines, mandatory publicity in the form of banners, liability of the CEO, and administrative sanctions such as closing access to a country's market.
  • A new model of meta-regulation with regard to content moderation is required.
    • Platforms should follow a set of Human Rights Principles for Content Moderation based on international human rights law: legality, necessity and proportionality, legitimacy, equality, and non-discrimination.
    • Platforms should assume the same kinds of obligation in terms of pluralism that broadcasters have in the different jurisdictions where they operate. An example would be the voluntary fairness doctrine.
    • Platforms should expand the number of moderators and spend a minimum percentage of their income to improve the quality of content review - particularly in at-risk countries.
  • New approaches to the design of platforms have to be initiated.
    • Safety and quality standards of digital architecture and software engineering should be enforced by a Digital Standards Enforcement Agency. The Forum on Information and Democracy could launch a feasibility study on how such an agency would operate.
    • Conflicts of interest of platforms should be prohibited in order to avoid the information and communication space being governed or influenced by commercial, political, or any other interests.
    • A co-regulatory framework for the promotion of public interest journalistic content should be defined, based on self-regulatory standards such as the Journalism Trust Initiative (see Related Summary below); friction to slow down the spread of potentially harmful viral content should be added.
  • Safeguards should be established in closed messaging services when they begin to function as public spaces.
    • Measures that limit the virality of misleading content should be implemented through limitations of some functionalities, opt-in features to receive group messages, and measures to combat bulk messaging and automated behaviour.
    • Online service providers should be required to better inform users regarding the origin of the messages they receive, especially by labelling those that have been forwarded.
    • Mechanisms for users to report illegal content, and appeal mechanisms for users who have been banned from services, should be reinforced.
Source

Forum on Information and Democracy website on August 8, 2022. Image credit: Forum on Information and Democracy