The Agentopia Times: Understanding and Mitigating Hallucinations in Multi-Agent LLM Systems via Data Journalism Gameplay
Yilin Lu
Shurui Du
Qianwen Wang

Room: 0.94 + 0.95
Keywords
LLM, visualization generation, educational game, LLM hallucination, multi-agent systems
Abstract
Large language models (LLMs) are increasingly used to support data analysis and visualization tasks, but they remain prone to hallucinations. Recent work suggests that multi-agent systems (MAS) can mitigate hallucinations by enabling internal validation and cross-verification. However, learning effective MAS coordination strategies for hallucination mitigation remains challenging, particularly for newcomers, due to the wide range of possible coordination strategies and the lack of interactive, hands-on learning tools. To address this, we present The Agentopia Times, an educational game that teaches hallucination mitigation through active experimentation with MAS coordination strategies. The Agentopia Times simulates a newsroom in which LLM agents collaborate to create data-driven narratives, and users adjust the agents' communication protocols to manage hallucinated content. The game features a structured mapping between MAS coordination and familiar gameplay mechanics and provides immediate feedback on hallucination outcomes. Through use cases and preliminary user feedback, we demonstrate how The Agentopia Times enables users to explore and mitigate hallucination in MAS.
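
To make the abstract's notion of cross-verification concrete, the sketch below shows one generic way a newsroom-style multi-agent pipeline can route a "reporter" agent's draft through a "fact-checker" before publication. This is not the paper's implementation or its game mechanics; the roles, the ask_llm stub, and the feedback protocol are illustrative placeholders only.

    """Minimal sketch (not the paper's system) of cross-verification between
    two LLM agents in a newsroom-style pipeline. ask_llm is a stub standing
    in for a real model call so the example runs on its own."""

    from dataclasses import dataclass

    # Source table the "reporter" agent is asked to describe.
    DATA = {"2022": 120, "2023": 180, "2024": 150}


    def ask_llm(role: str, prompt: str) -> str:
        """Stub for a real LLM call; returns canned answers for illustration."""
        if role == "reporter":
            if "Editor feedback" in prompt:
                # After feedback, the reporter produces a corrected draft.
                return "Sales peaked in 2023 at 180 units."
            # First draft contains a hallucinated figure.
            return "Sales peaked in 2024 at 200 units."
        # Fact-checker role: feedback sent back over the communication protocol.
        return "REVISE: the true peak was 180 units in 2023."


    @dataclass
    class Draft:
        text: str
        approved: bool


    def fact_check(draft: str) -> bool:
        """Deterministic check: does the draft state the true peak year and value?"""
        peak_year = max(DATA, key=DATA.get)
        return peak_year in draft and str(DATA[peak_year]) in draft


    def newsroom_round(max_rounds: int = 3) -> Draft:
        """Reporter drafts a claim; a fact-checker vets it against DATA.
        Rejected drafts go back to the reporter with feedback attached."""
        prompt = f"Summarize the sales table {DATA} in one sentence."
        draft = ""
        for _ in range(max_rounds):
            draft = ask_llm("reporter", prompt)
            if fact_check(draft):
                return Draft(draft, approved=True)
            feedback = ask_llm("fact_checker", f"Check this draft against {DATA}: {draft}")
            prompt = f"{prompt}\nEditor feedback: {feedback}"
        return Draft(draft, approved=False)


    if __name__ == "__main__":
        print(newsroom_round())

In this toy run the first draft is rejected because it misstates the peak, the fact-checker's feedback is appended to the reporter's prompt, and the second draft passes; a game like The Agentopia Times can be thought of as letting players vary this kind of protocol (who checks whom, how feedback flows) and observe the resulting hallucination outcomes.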