Content Regulation in the Digital Age

Concept note for the 2018 thematic report of the UN Special Rapporteur on the promotion and protection of freedom of opinion and expression

Private companies facilitate an unprecedented global sharing of information and ideas. Social and search platforms in particular have become primary sources of news and information (and disinformation) for hundreds of millions of people. With that role they have also become gatekeepers of expression that may spark passion and spread knowledge – or incite hatred, discrimination, violence, harassment, and abuse. Their impact on the right to seek, receive, and impart information raises two sets of fundamental questions:

  1. Which standards do platforms apply to content under their Terms of Service? How do the standards that currently operate compare to international human rights law? Do these company standards vary according to the jurisdiction where they are accessed? Does the development of standards draw from public, user or other stakeholder input? Do the standards provide meaningful protection for freedom of opinion and expression?  
  2. What processes do platforms implement when evaluating whether content violates standards (i.e., terms of service)? What processes have companies developed to deal with government requests for content regulation? How do companies conduct content flagging and takedowns, appeals and remedies, user notification, and transparency reporting? What role does automation or algorithmic filtering play in regulating content? What steps should platforms, government actors, and others take to ensure that these processes establish adequate safeguards for freedom of expression? 

These questions underlie some of the most urgent challenges to freedom of expression on private platforms today. The spread of “extremist” content online has triggered legislative and corporate responses that may address serious national security and public order threats but may also limit political discourse and activism. The scourge of online gender-based violence has prompted uneven and excessive regulation that not only fails to address its root causes, but also threatens legitimate content. The perceived urgency to address misinformation through ‘fake news’ and online propaganda has generated global confusion about what counts as false or misleading – and who decides.

Governments have a duty to adopt laws and approaches dealing with content regulation consistent with human rights law and to refrain from undue interferences with digital expression. But private actors increasingly regulate expression independently of governments, under standards and processes that are often unclear, and in the shadow of public authorities’ (and the public’s) demands. Terms of service, “community standards,” and other private frameworks adopted by platforms have a profound impact on the freedom of expression of users and deserve human rights scrutiny.

The Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression has reported twice to the United Nations Human Rights Council on private sector responsibilities in digital space: by mapping the Information and Communications Technology (ICT) sector in his 2016 report and by evaluating questions pertaining to the digital access industry in his 2017 report. His June 2018 report will focus on search and social media companies and identify and address priority issues and concerns through the following principal approaches (in addition to research and analysis):

Company visits: The Special Rapporteur plans to visit social media and search companies worldwide that are grappling with the challenges of online content regulation at the global and national levels. Visits will aim to develop information about company standards, processes, and concerns.

Submissions: The Special Rapporteur also seeks written submissions from States, civil society, companies, academics, the technical community, and other interested individuals (see below).

Consultations: The Special Rapporteur plans to conduct civil society consultations focusing on thematic and regional concerns around content regulation.

The Special Rapporteur plans to issue a report to the Council on platform content regulation in June 2018, including recommendations about appropriate private company standards and processes and the role that States should play in promoting and protecting freedom of opinion and expression online. Further reporting and information gathering on the subject is likely through 2018 and 2019.

Call for submissions

In order to collect the widest possible range of information, the Special Rapporteur encourages all interested parties to provide input in keeping with the objectives spelled out in the Concept Note. In particular:

A. The Special Rapporteur invites States to share information concerning:

  1. Legislative measures, administrative regulations, judicial decisions, and other policies and measures that impose obligations on social media and search platforms and/or platform users to remove, restrict, or otherwise regulate online content; and  
  2. Requests or demands, informal or formal, to these platforms to voluntarily remove, restrict, or otherwise regulate content.  

The Special Rapporteur also requests that States share information and analyses about how the laws, regulations, and requests identified in (1) and (2) are consistent with their obligations under Article 19 of the International Covenant on Civil and Political Rights, Article 19 of the Universal Declaration of Human Rights, and other relevant human rights standards.

B. The Special Rapporteur also invites civil society, companies, and all other interested persons or organizations to share comments and/or existing work product focusing on one or more of the following questions:

  1. Company compliance with State laws:
    • What processes have companies developed to deal with content regulation laws and measures imposed by governments, particularly those concerning:
      • Terrorism-related and extremist content;
      • False news, disinformation and propaganda; and/or
      • The “right to be forgotten” framework?
    • How should companies respond to State content regulation laws and measures that may be inconsistent with international human rights standards?
  2. Other State Requests:  Do companies handle State requests for content removals under their terms of service differently from those made by non-State actors? Do companies receive any kind of content-related requests from States other than those based on law or the company’s terms of service (for example, requests for collaboration with counter speech measures)? 
  3. Global removals: How do companies currently handle, and how should they handle, demands in one jurisdiction to take down content so that it is inaccessible in other jurisdictions (e.g., globally)?
  4. Individuals at risk: Do company standards adequately reflect the interests of users who face particular risks on the basis of religious, racial, ethnic, national, gender, sexual orientation or other forms of discrimination?
  5. Content regulation processes: What processes do companies employ when implementing content restrictions, takedowns, or account suspensions? In particular, what processes are employed to:
      • Moderate content before it is published;
      • Assess published content for restriction or takedown after it has been flagged for moderation; and/or
      • Actively assess what content on their platforms should be subject to removal?
  6. Appeals and remedies: How should companies enable users to appeal mistaken or inappropriate restrictions, takedowns or account suspensions? What grievance mechanisms or remedies do companies provide?
  7. Automation and content moderation: What role does automation or algorithmic filtering play in regulating content? How should technology as well as human and other resources be employed to standardize content regulation on platforms?
  8. Transparency:
    • Are users notified about content restrictions, takedowns, and account suspensions? Are they notified of the reasons for such action? Are they notified about the procedure they must follow to seek reversal of such action?
    • What information should companies disclose about how content regulation standards under their terms of service are interpreted and enforced? Is the transparency reporting they currently conduct sufficient?
  9. Examples: Please share any examples of content regulation that raise freedom of expression concerns (e.g., account suspension or deactivation, post or video takedown, etc.), including as much detail as possible.


Please paginate your submission and submit your comments in a single document.

How to Submit

Submissions will be posted on the OHCHR website at the time of the report’s publication, except for non-State submissions containing a clear request not to be made public.

The Special Rapporteur will review submissions on a rolling basis as they arrive. While early submissions are encouraged, please submit comments no later than 20 December 2017 to freedex@ohchr.org using the email subject line: “Submission to study on social media, search, and freedom of expression.” Given the large volume of emails received, please note that submissions without this subject line may be lost or inadvertently deleted.