26 Recommendations on Content Governance: A Guide for Lawmakers, Regulators, and Company Policy Makers

Access Now
"Digital platforms theoretically give everyone the opportunity to connect, but repressive governments can interfere with that capacity through legislation that endangers people's rights."
These recommendations on content governance have been published by Access Now, an organisation that seeks to defend and extend the digital rights of users at risk around the world. The document aims to reach decision-makers worldwide with the goal of putting human rights at the forefront of debates about content governance. As stated, "Doing so is the only pathway for creating a digital future that reinforces shared ideals of freedom, openness, and democratic values, with the potential for returning power to the users." The recommendations are intended to be human-rights- and user-centric. Because the relevant actors and technologies differ across countries and regions, they are not designed as one-size-fits-all prescriptions; instead, they serve as a minimum baseline for content governance policies that safeguard human rights.
As explained in the paper, "the internet has given us an essential tool to exercise human rights, including access to information and freedom of opinion and expression, among others. Services that act as intermediaries for the flow of information, especially platforms such as social media services and search engines, play an important role in this. Digital platforms theoretically give everyone the opportunity to connect, but repressive governments can interfere with that capacity through legislation that endangers people's rights. At the same time, the rules that platforms use to govern content and user activity - typically developed unilaterally - are often designed and applied in ways that are at odds with freedom of expression, privacy, and other fundamental rights. This, in turn, can enable new forms of exploitation, both by private and public actors. The actions that platforms and governments take in this area, and those they fail to carry out, can harm societies and vulnerable populations in particular." For example, laws, policies, and content moderation practices can affect journalists, activists, and human rights defenders who depend on free access to information. They can also harm members of oppressed or marginalised groups, such as women, religious or ethnic minority groups, people of colour, and the lesbian, gay, bisexual, transgender, and queer (LGBTQ) community, who depend on the internet to have their voices heard.
The paper starts by looking at how content regulations are made and enforced. It defines the three main governance structures used to govern content today: state regulation, enforced by governments; self-regulation, exercised by platforms via content moderation or curation; and co-regulation, undertaken by governments and platforms together through mandatory or voluntary agreements.
An exploration of how content governance decisions affect human rights follows. As explained, "Content governance that is incompatible with basic human rights rules and principles imperils free expression, access to information, freedom of opinion, association, privacy, and other human rights, and it impacts different populations in diverse ways." The paper highlights the human rights risks associated with each type of governance structure and offers recommendations to address those risks. For example, with regard to the state, the paper describes the danger of hasty regulations imposed without sufficient evidence and information: "Any state regulation addressing online societal phenomena such as disinformation, hate speech, or terrorist content must always be grounded in solid evidence." It warns that regulations pushing for speed and quantity of content removal may generate over-compliance by online platforms, resulting in illegitimate takedowns of user-generated content. Where possible, the paper recommends concrete baseline policies to address key issues such as intermediary liability, automated measures, and self-regulation decisions, among others.
In the third section of the document, recommendations for each of the types of governance - state regulation, self-regulation and co-regulation - are offered. A few are highlighted below:
- State regulation:
- Abide by strict democratic principles - A formal legal instrument must contain protective safeguards established through a democratic process that respects the principles of multistakeholderism and transparency. Any restrictions must be proportionate to their legitimate aim.
- Enact safe harbours and liability exemptions - Intermediaries should be protected from liability for third-party content by a safe harbour regime; however, Access Now opposes full immunity. Rules that protect intermediaries must enable ways to address the spread of illegal content.
- Self-regulation:
- Evaluate impact - Platforms should perform participatory and periodic public evaluations of content moderation and curation decisions, including proactively sharing information with researchers and civil society.
- Be transparent - All content moderation and curation criteria, rules, sanctions, and exceptions should be clear, specific, predictable, and clearly communicated to users in advance. This includes obtaining valid and informed consent from users regarding the rules that govern their activities on the platform.
- Co-regulation:
- Adopt participatory, clear, and transparent legal frameworks - In order to enable the necessary accountability mechanisms, co-regulatory models should be grounded in a binding legal framework that state actors adopt to provide safeguards for users.
- Don't shift or blur the responsibilities of actors - Governments should not permit or encourage private actors to decide upon the legality or restriction of user-generated content.
Access Now website, May 28 2020.