This talk is part of a session running from 2:50 PM to 3:40 PM, featuring two presentations back-to-back with Q&A after each.
The EU’s Digital Services Act breathes new life into individual remedy: the right of users to challenge content moderation decisions. It requires platforms to account for fundamental rights when enforcing their terms of service, to explain every decision in detail, to offer appeal mechanisms, and to cooperate with out-of-court dispute settlement (ODS) bodies. These legal obligations bring new urgency to the unresolved challenges of at-scale content moderation. This presentation proposes practical solutions for how platforms and ODS bodies should make decisions on content. It describes what a European fundamental rights approach looks like and how it fits with global human rights responsibilities. It explains how human review, decision trees and LLMs can be combined to produce high-quality decisions at scale. It presents a system of processes, technology and prompting strategies that platforms and ODS bodies can rely on to make individual remedy efficient and meaningful.
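As a purely illustrative sketch (not the system presented in the talk), the Python below shows one way a combined pipeline of decision trees, an LLM pass and human review could be composed: hard rules resolve clear-cut cases, an LLM weighs borderline cases against the cited policy and fundamental-rights considerations, and low-confidence outcomes escalate to a human reviewer. All names, prompts and thresholds are hypothetical assumptions, and the LLM call is a stub.

```python
# Hypothetical sketch of an appeal-review pipeline; names and thresholds
# are illustrative, not the speakers' actual system.
from dataclasses import dataclass


@dataclass
class Appeal:
    content: str                # the contested content
    policy_cited: str           # the terms-of-service clause the platform applied
    statement_of_reasons: str   # the explanation the platform gave the user


def call_llm(prompt: str) -> tuple[str, float]:
    """Placeholder for an LLM call; returns (verdict, confidence).
    A real system would call a hosted or local model here."""
    return "uphold", 0.55  # stubbed response so the sketch runs end-to-end


def decide(appeal: Appeal) -> str:
    # Step 1: hard rules first (the decision-tree layer); clear-cut cases
    # never reach the model.
    if not appeal.statement_of_reasons.strip():
        return "overturn: no statement of reasons was provided"

    # Step 2: LLM assessment, prompted to weigh the cited policy against
    # fundamental-rights considerations such as freedom of expression.
    prompt = (
        "You review a content moderation appeal.\n"
        f"Content: {appeal.content}\n"
        f"Policy cited: {appeal.policy_cited}\n"
        f"Platform's reasons: {appeal.statement_of_reasons}\n"
        "Answer 'uphold' or 'overturn', weighing the policy against the "
        "user's freedom of expression, and state your confidence."
    )
    verdict, confidence = call_llm(prompt)

    # Step 3: low-confidence outcomes escalate to a human reviewer.
    if confidence < 0.8:
        return "escalate: route to human reviewer"
    return verdict


if __name__ == "__main__":
    sample = Appeal(
        content="Post criticising a public official",
        policy_cited="Harassment policy, section 3",
        statement_of_reasons="Removed as targeted harassment.",
    )
    print(decide(sample))  # prints "escalate: route to human reviewer"
```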