Context and Localization in Content Governance: From Algorithm-Driven to Decentralized Platforms
This session will explore the role of context and localization in content governance. Beyond language and translation challenges, moderating and curating content in a way that truly prevents harm while enabling human rights for all requires assessing individual content in light of the local social, political, and cultural context in which it is shared. Centering local context and lived experiences in product design, policy development, and enforcement is a key component of moderating hate speech, online harassment, incitement to violence, and mis/disinformation. This is especially important – and yet severely lacking – for stakeholders outside the US and Western Europe, especially racialized persons, women and non-binary persons, migrants and refugees, LGBTQI+ persons, children and the elderly, disabled persons, and those of lower socio-economic status, among others. Indeed, while leading social media platforms are primarily based in the Global North, their impacts are far-reaching. Nowhere is this more striking than in the context of conflict, such as in Palestine, Ethiopia, Ukraine, or Afghanistan, where platforms have historically fallen short of adequately moderating content. Yet today's conversations around content moderation and curation are generally confined to US/Canadian and European borders and values. Substantive change in the way content moderation policies are designed, developed, and enforced around the world is urgently needed to prevent the adverse impacts that the monolithic approach of digital platforms has on the Global South. For this endeavor to succeed, representatives from the Global South must not only be included in, but also drive, content governance and enforcement. This begins with identifying local problems and participating meaningfully in developing and implementing solutions.
Speakers will highlight challenges ranging from the inadequate allocation of resources across regions and communities in content governance to the difficulty of enforcing policies at scale. Through this interactive and participatory session, attendees will collectively explore opportunities for rights-based content moderation. This will include critical thinking about the responsibilities and organizational models of platforms in governing content in the context of emerging technologies, from algorithmic content moderation to decentralized social media platforms.