Deciphering Explicit and Implicit Features for Reliable, Interpretable, and Actionable User Churn Prediction in Online Video Games
Xiyuan Wang, Laixin Xie, He Wang, Xingxing Xing, Wei Wan, Ziming Wu, Xiaojuan Ma, Quan Li

 DOI: 10.1109/TVCG.2024.3487974
Room: Hall M1
Keywords
Games, Predictive models, Social networking (online), Prediction algorithms, Computational modeling, Industries, Visual analytics, Reliability, Interviews, Explainable AI
Abstract
The burgeoning online video game industry has sparked intense competition among providers to both expand their user base and retain existing players, particularly within social interaction genres. To anticipate player churn, there is an increasing reliance on machine learning (ML) models that focus on social interaction dynamics. However, the prevalent opacity of most ML algorithms poses a significant hurdle to their acceptance among domain experts, who often view them as “opaque models”. Despite the availability of eXplainable Artificial Intelligence (XAI) techniques capable of elucidating model decisions, their adoption in the gaming industry remains limited. This is primarily because non-technical domain experts, such as product managers and game designers, encounter substantial challenges in deciphering the “explicit” and “implicit” features embedded within computational models. This study proposes a reliable, interpretable, and actionable solution for predicting player churn by restructuring model inputs into explicit and implicit features. It explores how establishing a connection between explicit and implicit features can assist experts in understanding the underlying implicit features. Moreover, it emphasizes the necessity for XAI techniques that not only offer implementable interventions but also pinpoint the most crucial features for those interventions. Two case studies, including expert feedback and a within-subject user study, demonstrate the efficacy of our approach.
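The abstract's core idea, separating directly interpretable ("explicit") signals from learned ("implicit") representations and then identifying which features support concrete retention interventions, can be illustrated with a generic sketch. The snippet below is not the paper's system: the feature names, the synthetic data, the gradient-boosted classifier, and the use of permutation importance are all assumptions chosen for illustration only.

# Illustrative sketch only: a generic churn classifier over hand-crafted
# ("explicit") and learned ("implicit") features, with permutation importance
# used to surface candidate features for intervention. All names and choices
# here are assumptions, not the approach described in the paper.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000

# Explicit features: directly interpretable behavioural/social signals.
explicit = {
    "sessions_per_week": rng.poisson(5, n).astype(float),
    "avg_session_minutes": rng.normal(45, 15, n),
    "friend_count": rng.poisson(8, n).astype(float),
    "guild_member": rng.integers(0, 2, n).astype(float),
}

# Implicit features: stand-ins for learned representations (e.g. dimensions
# of a social-interaction embedding) that are hard to read on their own.
implicit = {f"social_emb_{i}": rng.normal(0, 1, n) for i in range(4)}

feature_names = list(explicit) + list(implicit)
X = np.column_stack([explicit[k] for k in explicit] +
                    [implicit[k] for k in implicit])

# Synthetic churn label loosely driven by low engagement and weak social ties.
logit = (-0.3 * explicit["sessions_per_week"]
         - 0.15 * explicit["friend_count"]
         + 0.8 * implicit["social_emb_0"]
         + rng.normal(0, 1, n))
y = (logit > np.median(logit)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
print(f"held-out accuracy: {model.score(X_te, y_te):.3f}")

# Rank features by permutation importance; explicit features that rank highly
# are natural candidates for concrete retention interventions, while highly
# ranked implicit features signal where further explanation is needed.
imp = permutation_importance(model, X_te, y_te, n_repeats=20, random_state=0)
for idx in np.argsort(imp.importances_mean)[::-1]:
    kind = "explicit" if feature_names[idx] in explicit else "implicit"
    print(f"{feature_names[idx]:>20s} ({kind}): {imp.importances_mean[idx]:.4f}")

Ranking explicit and implicit features side by side in this way mirrors, at a toy scale, the abstract's point that interventions are only actionable when the most influential features can be tied back to signals a product manager or game designer can actually change.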