On the occasion of the first Content Moderation at Scale conference in Santa Clara, CA on February 2nd, 2018, a small private workshop was convened of organizations, advocates, and academic experts who support the right to free expression online. Its purpose was to consider how best to obtain meaningful transparency and accountability around internet platforms’ increasingly aggressive moderation of user-generated content.
Now, on the occasion of the second Content Moderation at Scale conference in Washington, DC on May 7th, 2018, we propose these three principles as initial steps that companies engaged in content moderation should take. These steps are meant to provide meaningful due process to impacted speakers and to better ensure that the enforcement of content guidelines is fair, unbiased, proportional, and respectful of users’ rights.
Markkula Center for Applied Ethics, Santa Clara University
Queensland University of Technology
USC Annenberg School for Communication and Journalism
Department of Information Studies, School of Education & Information Studies, UCLA
Companies should publish the numbers of posts removed and accounts permanently or temporarily suspended due to violations of their content guidelines.
At a minimum, this information should be broken down along each of these dimensions:
- total numbers of discrete posts and accounts flagged, and of posts removed and accounts suspended;
- numbers broken down by format of content at issue (e.g., text, audio, image, video, live stream);
- numbers broken down by category of rule violated;
- numbers broken down by source of flag (e.g., governments, trusted flaggers, users, automated detection); and
- numbers broken down by locations of flaggers and impacted users, where apparent.
This data should be provided in a regular report, ideally quarterly, in an openly licensed, machine-readable format.
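The principles do not prescribe a schema for such a report, but as one illustrative sketch of what "machine-readable" could mean in practice, a quarterly report entry might be serialized as JSON. Every field name and value below is hypothetical, not drawn from these principles:

```python
import json

# Hypothetical entry in a quarterly, machine-readable transparency
# report; the schema is illustrative only, not prescribed by the
# Santa Clara Principles.
report = {
    "period": "2018-Q1",
    "posts_removed": 1204,
    "accounts_suspended": {
        "temporary": 311,
        "permanent": 87,
    },
}

# Serializing to JSON keeps the figures machine-readable and easy to
# publish under an open license.
print(json.dumps(report, indent=2))
```

Publishing in a structured, openly licensed format like this lets researchers and advocates aggregate figures across companies and reporting periods without manual transcription.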
Companies should provide notice to each user whose content is taken down or whose account is suspended, stating the reason for the removal or suspension.
In general, companies should provide detailed guidance to the community about what content is prohibited, including examples of permissible and impermissible content and the guidelines used by reviewers. Companies should also explain how automated detection is used across each category of content. When notifying a user that a post has been removed or an account suspended, an adequate notice includes, at a minimum:
- the URL, a content excerpt, or other information sufficient to identify the content at issue;
- the specific clause of the guidelines that the content was found to violate;
- how the content was detected and flagged (e.g., by other users, trusted flaggers, governments, or automated detection); and
- an explanation of the process through which the user can appeal the decision.
Notices should be available in a durable form that is accessible even if a user’s account is suspended or terminated. Users who flag content should also be presented with a log of content they have reported and the outcomes of moderation processes.
Companies should provide a meaningful opportunity for timely appeal of any content removal or account suspension.
Minimum standards for a meaningful appeal include:
- human review by a person or panel not involved in the initial decision;
- an opportunity to present additional information to be considered in that review; and
- notification of the outcome, with a statement of reasoning sufficient to allow the user to understand the decision.
In the long term, independent external review processes may also be an important component of meaningful redress for users.
We thank Santa Clara University’s High Tech Law Institute for organizing the Content Moderation & Removal at Scale conference, as well as Eric Goldman for supporting the convening of the workshop that resulted in this document. That workshop was also made possible thanks to support from the Internet Policy Observatory at the University of Pennsylvania. Suzor is the recipient of an Australian Research Council DECRA Fellowship (project number DE160101542).