Social media platforms have been notoriously opaque about how they work. But something may have shifted.
Last week, several social media platforms took significant steps toward greater transparency, particularly around content moderation and data privacy. Facebook published a major revision of its Community Standards, the rules that govern what users are prohibited from posting on the platform. The changes are dramatic, not because the rules shifted much but because Facebook has now spelled out those rules in much, much more detail.
YouTube released its latest transparency report, and for the first time included data on how it handles content moderation, not just government takedown requests. And dozens of platforms alerted their users to updates to their privacy policies this week, in anticipation of Europe's General Data Protection Regulation (GDPR), which goes into effect May 25.
What can we learn from these gestures of transparency? And what do they mean for the problem of content moderation?
Read the full article on NiemanLab.