Social media platforms provide unprecedented opportunities for citizens, political candidates, activists and civil society groups to communicate, but they also pose new challenges for democracy. One key problem is the rise of harmful speech online, which can undermine democratic participation and debate. Harmful speech encompasses a range of problematic forms of communication, including hate speech, threats of violence, defamation and harassment.
Canada has well-established policies to address the most toxic forms of harmful speech in non-digital media, and some are applicable to harmful speech online. However, current regulatory approaches cannot keep pace with the speed, scale and global reach of harmful speech on social media platforms. Today, most decisions about Canadians’ exposure to harmful speech are made by foreign social media companies, with little public input or accountability. The result is an imbalance between the limited democratic oversight of online platforms and the significant threat that harmful speech poses to democracy.
This report explains some of the most problematic forms of harmful speech, how they affect democratic processes, and how they are currently addressed in Canada. The report draws lessons from policy responses that are being developed or implemented in other countries and at the international level. It then sets out three mutually supporting policy recommendations for the Canadian context. In brief, the report proposes that the Canadian government and key stakeholders should:
1. Implement a multi-track policy framework to address harmful speech:
- An inter-agency task force should be created immediately to clarify how governments in Canada can better apply existing regulatory measures to address harmful speech online, and to examine the growing role of social media platforms in regulating free expression.
- The federal government should set clear expectations for social media companies to provide information about harmful speech to the public and to researchers.
- The federal government should launch a multi-stakeholder commission to examine the social and political problems posed by harmful speech online and identify solutions that fall outside of current regulatory measures. This commission would contribute to a broader Canadian discussion regarding public input and oversight of online content moderation.
2. Develop a Moderation Standards Council:
- The multi-stakeholder commission should consider the creation of a Moderation Standards Council (MSC), analogous to the Canadian Broadcast Standards Council, but adapted for the specific context of online content.
- The MSC would enable social media companies, civil society and other stakeholders to meet public expectations and government requirements on content moderation. It would improve transparency and help internet companies develop and implement codes of conduct for addressing harmful speech. It would create an appeals process to address complaints about content moderation policies and decisions. It would also address potential jurisdictional conflicts over the regulation and standards of content moderation within Canada and contribute to international standards-making.
3. Build civil society capacity to address harmful speech online:
- Compared to other countries, Canada lacks robust research and civil society programs to address harmful speech. Governments, universities, foundations and private companies should provide significant support for research in these areas and for programs to develop, test and roll out measures to respond to harmful speech.
- Social media companies and stakeholders should create an “election contact group” to quickly and effectively share information about threats to electoral integrity.
These policies can help government, internet companies and civil society work together to create a digital public sphere in Canada that is healthy, inclusive and democratic.