UN Expert: Content moderation should not trample free speech


“One of the greatest threats to online free speech today is the murkiness of the rules,” said David Kaye, UN Special Rapporteur on freedom of opinion and expression. “States circumvent human rights obligations by going directly to the companies, asking them to take down content or accounts without going through legal process, while companies often impose rules they have developed without public input and enforce with little clarity. We need to change these dynamics so that individuals have a clear sense of what rules govern and how they are being applied.”

A woman uses a computer in an IDP camp. © UN Photo/JC McIlwaine

The internet enables global sharing of and access to information, but its perception has taken a dark turn in recent years. Governments and the public often see hate, abuse and disinformation in user-generated content, Kaye said. Fears over disinformation, terrorism, online abuse, hate and xenophobia have led some governments to take a heavy-handed approach to regulating content.

Kaye recently released his latest thematic report, which examines the regulation of online content by States as well as by content providers. The report, which he presented to the Human Rights Council in June 2018, makes the point that content online can be moderated clearly and fairly – so long as human rights laws and rules are understood to apply in the digital world as well.

“It is pretty clear that so much of our conversation, so much reporting, sharing of information is online today,” he said. “But so much is shaped by rules that are largely hidden from view. When I am online, I may not see those rules…but those rules do exist. This report is trying to highlight how that regulation takes place.”

A survey by Freedom House, a democracy and rights watchdog organization, found that 65 percent of the countries it reviewed asked online platforms to restrict content of a political, social or religious nature. But such heavy-handed attempts at regulation by governments impede free expression and can have a chilling effect on companies, which may self-censor content under government pressure and to avoid penalties.

“It is no surprise that internet platforms are facing unprecedented pressure to comply with state laws to regulate content,” the Association for Progressive Communications (APC) reported. APC is a non-profit organization that works to ensure free and open access to the internet. It was among dozens of organizations and Governments that responded to Kaye’s call for submissions in preparing the report. “In fact, online platforms are subject to opposing demands: one asking them to thoroughly police the content posted on their services to guarantee the respect of national laws, and the other objecting to them making those determinations on their own and exercising proactive monitoring for fear of detrimental human rights implications.”

As a remedy for heavy-handed, opaque content moderation, Kaye calls for “radical transparency” from both online platforms and States. This kind of transparency includes knowing what rules States and companies use to moderate content, how those rules are applied, what kind of appeals process exists and what accountability there is for wrongful takedowns of content.

“It is not that moderating the content online curtails the right to freedom of expression per se,” Kaye said. “Moderation in and of itself is not a problem, it is just that it needs to conform to human rights standards and rule of law standards.”

19 July 2018

