How journalists’ knowledge and AI can help fight information disorder at scale
- Publication Type: Thesis
- Issue Date: 2025
This item is open access.
Information disorder—including mis-, dis- and mal-information—threatens democratic processes, civic trust, and public well-being. Rapid developments in generative artificial intelligence (AI) worsen this landscape by enabling mass production of highly convincing but false content. Although scholars have examined technical and journalistic responses to misinformation, limited attention has been paid to how journalists and technologists might collaborate to achieve large-scale moderation that is both accurate and aligned with public interest values.
Methods
This study employed a structured qualitative approach, conducting 14 interviews with eight journalists and six technologists. Transcripts were coded using discourse analysis methods, supplemented by a thorough review of literature on misinformation, automated content moderation and journalistic verification practices. This methodology aimed to illuminate where and how journalistic input could enhance AI-driven content moderation systems.
Results
Participants agreed that misinformation and disinformation were urgent, multi-faceted problems. Journalists tended to focus on the societal impacts of information disorder, whereas technologists focused more on practical implementation. Both groups, however, saw the need for rapid verification, nuanced cultural context and ethical oversight. Technologists also tended to hold a more positive view of journalism than journalists themselves did. Respondents identified a wide range of interventions that journalists could undertake: providing timely ground-truth data for AI training, flagging emergent misinformation trends before they go viral, and refining content policies with journalistic expertise in verification.
Discussion
Findings suggest that journalists can strengthen platform moderation by offering high-quality data, cross-cultural insight, and real-time fact-checking capabilities. A more radical proposal involves outsourcing some content moderation to journalistic organisations, which could embed teams of journalists alongside platforms' content moderation teams in a combined advisor and watchdog role. The aim is to instil in platforms a stronger "public interest" ethos, making them more responsible stewards of the information ecosystems in which they have become dominant players.