Side-stepping Rights: Regulating Speech by Contract

Summary

"While social media companies have had a positive effect on freedom of expression, they have also come to hold enormous power over what information we can access."

In this policy brief, ARTICLE 19 examines the compliance of dominant social media platforms - Facebook, Twitter, and YouTube (owned by Google) - with international freedom of expression standards, and offers recommendations on what companies should do to demonstrate their commitment to protecting freedom of expression. ARTICLE 19 argues that, although social media companies are in principle free to restrict content on the basis of freedom of contract, they should respect human rights, including the rights to freedom of expression, privacy, and due process.

As stated in the brief, "While freedom of expression has generally enjoyed high levels of protection on social media platforms, they have increasingly had to address the human rights concerns of the various communities they seek to attract on their platforms. They are also under constant pressure from governments to remove content deemed harmful or illegal under respective national laws. Online censorship is therefore increasingly privatised. This raises serious questions for the protection of freedom of expression online." These questions include:

  • What free speech standards should social media companies respect?
  • Given that social media companies are effectively services provided by private companies, can they be required to comply with international standards on freedom of expression?
  • Does the quasi-public nature of some of these online spaces call for a different type of regulation?
  • What are the minimum procedural safeguards companies should respect to ensure strong protection of freedom of expression?

This policy brief seeks to answer these and other questions in light of international standards on freedom of expression. It is divided into five parts.

First, the brief sets out the applicable international standards for the protection of freedom of expression online, particularly as they relate to social media companies. Second, it outlines the key issues that arise from the privatisation of speech regulation (i.e., the regulation of speech by contract). ARTICLE 19 finds that this privatisation raises serious concerns for the protection of freedom of expression, in particular: a lack of transparency and accountability, a lack of procedural safeguards, a lack of remedy for the wrongful removal of content, unfair contract terms, lower free speech standards, and circumvention of the rule of law.

Third, to demonstrate the problems outlined with content removal, ARTICLE 19 analyses selected aspects of the terms of service of the dominant global social media companies: Google, YouTube, Twitter, and Facebook. When users join a social media platform, they agree to the company's terms of service, which determine the types of content the company deems acceptable (or not). Since a detailed analysis of the entire terms of service is beyond the scope of this brief, it examines content restrictions in the areas most often found to be problematic - in particular, hate speech, "terrorist" and "extremist" content, so-called "fake news", and privacy- and morality-based restrictions on content. The brief also examines procedural issues related to the removal of content on the basis of the respective terms of service.

Fourth, the brief examines the various policy options available to regulate social media platforms, which include: regulation by the state; co-regulation, often involving private regulation bodies supported by the state; and self-regulation, which relies on voluntary compliance. Rather than state regulation or co-regulation, ARTICLE 19 favours strengthened self-regulation through new, independent self-regulatory mechanisms.

Lastly, ARTICLE 19 makes recommendations that are designed to help ensure that social media companies respect basic human rights standards. In brief:

Recommendations to states:

  • States should adopt laws that shield social media companies from liability for third-party content and refrain from adopting laws that would make them subject to broadcasting regulatory authorities or other similar public authorities.
  • States should refrain from putting undue extra-legal pressure on social media companies to remove content.
  • States should provide for a right to an effective remedy for violations of freedom of expression by social media companies.

Recommendations to social media companies:

  • Companies should ensure that their terms of service are sufficiently clear, accessible, and in line with international standards on freedom of expression and privacy. They should also provide more detailed examples or case studies of the way in which their community standards are applied in practice.
  • Companies should be more transparent about their decision-making processes, including the tools they use to moderate content, such as algorithms and trusted-flagger schemes.
  • Companies should ensure that sanctions for non-compliance with their terms of service are proportionate.
  • Companies should put in place internal complaints mechanisms, including for the wrongful removal of content or other restrictions on their users' freedom of expression.
  • Companies should collaborate with other stakeholders to develop new, independent self-regulatory mechanisms.
  • Companies should resist government and court orders in breach of international standards on freedom of expression or privacy.
  • Companies should publish comprehensive transparency reports, including detailed information about content removal requests received and actioned on the basis of their terms of service. Additional information should be provided in relation to appeals processes, including the number of appeals received and their outcome.
Source

ARTICLE 19 website, June 4, 2020. Image credit: ARTICLE 19