The effectiveness of moderating harmful online content.
- Publisher: Proceedings of the National Academy of Sciences
- Publication Type: Journal Article
- Citation: Proc Natl Acad Sci U S A, 2023, 120(34), e2307360120
- Issue Date: 2023-08-22
This item is open access.
In 2022, the European Union introduced the Digital Services Act (DSA), new legislation for reporting and moderating harmful content on online social networks. Trusted flaggers are mandated to identify harmful content, which platforms must remove within a set delay (currently 24 h). Here, we analyze the likely effectiveness of EU-mandated mechanisms for regulating highly viral online content with short half-lives. We deploy self-exciting point processes to determine the relationship between the regulated moderation delay and the likely harm reduction achieved. We find that harm reduction is achievable for the most harmful content, even on fast-paced platforms such as Twitter. Our method estimates moderation effectiveness for a given platform and provides a rule of thumb for selecting content for investigation and flagging, managing flaggers' workload.
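As a rough illustration of the abstract's idea (not the paper's fitted model or its estimates), the sketch below simulates a self-exciting cascade via its branching structure and measures what fraction of reactions a takedown after a given delay would prevent. The exponential memory kernel, the assumption that takedown halts the entire cascade, and the parameter values (branching ratio 0.8, mean reaction delay 0.5 h) are all illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_cascade(branching_ratio, decay, max_events=10_000):
    """Simulate one self-exciting (Hawkes) cascade via its branching structure.

    Each event spawns Poisson(branching_ratio) offspring whose delays follow
    an exponential kernel with rate `decay` (events per hour). The cascade
    starts from a single seed post at t = 0; returns sorted event times.
    """
    times = [0.0]
    frontier = [0.0]
    while frontier and len(times) < max_events:
        parent = frontier.pop()
        n_children = rng.poisson(branching_ratio)
        delays = rng.exponential(1.0 / decay, size=n_children)
        for t in parent + delays:
            times.append(t)
            frontier.append(t)
    return np.sort(np.array(times))

def harm_reduction(delay, branching_ratio, decay, n_runs=2000):
    """Fraction of expected reactions prevented if the seed content is taken
    down `delay` hours after posting (assuming removal stops the cascade)."""
    total, prevented = 0.0, 0.0
    for _ in range(n_runs):
        times = simulate_cascade(branching_ratio, decay)
        total += len(times)
        prevented += np.sum(times > delay)
    return prevented / total

# Illustrative parameters only: branching ratio 0.8, mean reaction delay 0.5 h.
for tau in (1, 6, 24):
    print(f"moderation delay {tau:>2} h -> "
          f"harm reduction ~ {harm_reduction(tau, 0.8, 2.0):.0%}")
```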