Platform companies are increasingly called upon to make ethical decisions about speech, and in response, the public has grown more interested in how platform employees form content moderation policies. Yet attention to this topic has focused overwhelmingly on major social media platforms, such as Facebook, which is atypical among organizations that moderate content. This report investigates the differences between platform companies of various sizes and examines the tensions and tradeoffs organizations must navigate as they set rules and guidelines for content online.
Through an interview study with representatives from 10 major platforms, this report explores the resource gaps, in personnel diversity, headcount, and technology, that exist across platform companies with different missions, business models, and team sizes. In particular, it focuses on three models of content moderation:
- Artisanal, for platforms such as Vimeo, Medium, Patreon, or Discord;
- Community-reliant, for platforms such as Wikimedia and Reddit; and
- Industrial, for platforms such as Facebook or Google.
As these companies make content policy that affects citizens around the world, they must carefully consider how context-dependent issues, such as hate speech and disinformation, manifest differently across regions and political circumstances. At the same time, because of the scale at which they operate, these companies often work to establish consistent rules, both to increase transparency for users and to make enforcement operational for employees. This report contends that the three approaches strike the balance between context-sensitivity and consistency differently, depending on resource needs and organizational dynamics.
Understanding these differences and the nuances of each organization helps determine both the expectations we should place on companies and the range of solutions that need to be brought to bear, including existing legislation such as Section 230 of the Communications Decency Act in the United States and the NetzDG law in Germany. This matters for artisanal organizations, which need to formalize their reasoning so they can address concerns more consistently. And it matters for industrial-scale operations, which need to translate values into training and evaluation while remaining sensitive to the particularities of individual pieces of content, such as hate speech and newsworthiness.