Content regulation in the digital age - June 2018 Human Rights Council Report
Call for submissions

In order to collect the widest possible range of information, the Special Rapporteur encourages all interested parties to provide input in keeping with the objectives spelled out in the Concept Note. In particular:

A. The Special Rapporteur invites States to share information concerning:

  1. Legislative measures, administrative regulations, judicial decisions, and other policies and measures that impose obligations on social media and search platforms and/or platform users to remove, restrict, or otherwise regulate online content; and
  2. Requests or demands, informal or formal, to these platforms to voluntarily remove, restrict, or otherwise regulate content.

The Special Rapporteur also requests that States share information and analyses about how the laws, regulations, and requests identified in (1) and (2) are consistent with their obligations under Article 19 of the International Covenant on Civil and Political Rights, Article 19 of the Universal Declaration of Human Rights, and other relevant human rights standards.

B. The Special Rapporteur also invites civil society, companies, and all other interested persons or organizations to share comments and/or existing work product focusing on one or more of the following questions:

  1. Company compliance with State laws:
    • What processes have companies developed to deal with content regulation laws and measures imposed by governments, particularly those concerning:
      • Terrorism-related and extremist content;
      • False news, disinformation and propaganda; and/or
      • The “right to be forgotten” framework?
    • How should companies respond to State content regulation laws and measures that may be inconsistent with international human rights standards?
  2. Other State Requests: Do companies handle State requests for content removals under their terms of service differently from those made by non-State actors? Do companies receive any kind of content-related requests from States other than those based on law or the company’s terms of service (for example, requests for collaboration on counter-speech measures)?
  3. Global removals: How do, and how should, companies deal with demands in one jurisdiction to take down content so that it is inaccessible in other jurisdictions (e.g., globally)?
  4. Individuals at risk: Do company standards adequately reflect the interests of users who face particular risks of discrimination on the basis of religion, race, ethnicity, nationality, gender, sexual orientation, or other grounds?
  5. Content regulation processes: What processes are employed by companies in their implementation of content restrictions and takedowns, or suspension of accounts? In particular, what processes are employed to:
      • Moderate content before it is published;  
      • Assess published content for restriction or takedown after it has been flagged for moderation; and/or
      • Actively assess what content on their platforms should be subject to removal?
  6. Bias and non-discrimination: How do companies take into account cultural particularities, social norms, artistic value, and other relevant interests when evaluating compliance with terms of service? Is there variation across jurisdictions? What safeguards have companies adopted to prevent or redress the takedown of permissible content?
  7. Appeals and remedies: How should companies enable users to appeal mistaken or inappropriate restrictions, takedowns or account suspensions? What grievance mechanisms or remedies do companies provide?
  8. Automation and content moderation: What role does automation or algorithmic filtering play in regulating content? How should technology as well as human and other resources be employed to standardize content regulation on platforms?
  9. Transparency:
      • Are users notified about content restrictions, takedowns, and account suspensions? Are they notified of the reasons for such action? Are they notified about the procedure they must follow to seek reversal of such action?
      • What information should companies disclose about how content regulation standards under their terms of service are interpreted and enforced? Is the transparency reporting they currently conduct sufficient?
  10. Examples: Please share any examples of content regulation that raise freedom of expression concerns (e.g., account suspension or deactivation, post or video takedown, etc.), including as much detail as possible.

Formatting

Please paginate your submission and submit your comments in a single document.

How to Submit

Submissions will be posted on the OHCHR website at the time of the report’s publication, except for non-State submissions containing a clear request not to be made public.

The Special Rapporteur will evaluate submissions on a rolling basis as they arrive. While early submissions are encouraged, please submit comments no later than 20 December 2017 to freedex@ohchr.org using the email subject line: “Submission to study on social media, search, and freedom of expression.” Given the large volume of e-mail received, please note that submissions without this subject line may be lost or inadvertently deleted.