Report on content regulation
Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression
To the Human Rights Council, 38th session
In the first-ever UN report to examine the regulation of user-generated online content, the Special Rapporteur examines the role of States and social media companies in providing an enabling environment for freedom of expression and access to information online. In the face of contemporary threats such as "fake news", disinformation and online extremism, the Special Rapporteur urges States to reconsider speech-based restrictions and to adopt smart regulation targeted at enabling the public to make choices about how and whether to engage in online forums. The Special Rapporteur also conducts an in-depth investigation of how Internet companies moderate content on major social media platforms, and argues that human rights law gives companies the tools to articulate their positions in ways that respect democratic norms and counter authoritarian demands.
The report’s main recommendations:
- States should repeal any law that criminalizes or unduly restricts expression, online or offline.
- Smart regulation, not heavy-handed viewpoint-based regulation, should be the norm, focused on ensuring company transparency and remediation to enable the public to make choices about how and whether to engage in online forums. States should only seek to restrict content pursuant to an order by an independent and impartial judicial authority, and in accordance with due process and standards of legality, necessity and legitimacy. States should refrain from imposing disproportionate sanctions, whether heavy fines or imprisonment, on Internet intermediaries, given their significant chilling effect on freedom of expression.
- States and intergovernmental organizations should refrain from establishing laws or arrangements that would require the “proactive” monitoring or filtering of content, which is both inconsistent with the right to privacy and likely to amount to pre-publication censorship.
- States should refrain from adopting models of regulation where government agencies, rather than judicial authorities, become the arbiters of lawful expression. They should avoid delegating responsibility to companies as adjudicators of content, which empowers corporate judgment over human rights values to the detriment of users.
- States should publish detailed transparency reports on all content-related requests issued to intermediaries and involve genuine public input in all regulatory considerations.
- Companies should recognize that the authoritative global standard for ensuring freedom of expression on their platforms is human rights law, not the varying laws of States or their own private interests, and they should re-evaluate their content standards accordingly. Human rights law gives companies the tools to articulate and develop policies and processes that respect democratic norms and counter authoritarian demands. This approach begins with rules rooted in rights, continues with rigorous human rights impact assessments for product and policy development, and moves through operations with ongoing assessment, reassessment and meaningful public and civil society consultation. The Guiding Principles on Business and Human Rights, along with industry-specific guidelines developed by civil society, intergovernmental bodies, the Global Network Initiative and others, provide baseline approaches that all Internet companies should adopt.
- Companies must embark on radically different approaches to transparency at all stages of their operations, from rule-making to implementation and the development of “case law” framing the interpretation of private rules. Transparency requires greater engagement with digital rights organizations and other relevant sectors of civil society, and avoiding secretive arrangements with States on content standards and implementation.
- Given their impact on the public sphere, companies must open themselves up to public accountability. Effective and rights-respecting press councils worldwide provide a model for bringing minimum levels of consistency, transparency and accountability to commercial content moderation. Third-party non-governmental approaches, if rooted in human rights standards, could provide mechanisms for appeal and remedy without imposing prohibitively high costs that deter smaller entities or new market entrants. All segments of the ICT sector that moderate content or act as gatekeepers should make the development of industry-wide accountability mechanisms (such as a social media council) a top priority.
The report is the culmination of a year-long process involving a series of consultations, visits to major Internet companies and a wide range of input from States and civil society. A supplementary annex to the report gathers the findings of the consultations and the submissions received.