DashSpace: A Live Collaborative Platform for Immersive and Ubiquitous Analytics
Marcel Borowski, Peter W. S. Butcher, Janus Bager Kristensen, Jonas Oxenbøll Petersen, Panagiotis D. Ritsos, Clemens N. Klokmose, Niklas Elmqvist

 DOI: 10.1109/TVCG.2025.3537679
Room: Hall E1
Keywords
Data visualization, Three-dimensional displays, Collaboration, Grammar, Software, Visualization, Hardware, Data analysis, Mobile handsets, Media
Abstract
We introduce DashSpace, a live collaborative immersive and ubiquitous analytics (IA/UA) platform designed for handheld and head-mounted Augmented/Extended Reality (AR/XR), implemented using WebXR and open standards. To bridge the gap between existing web-based visualizations and the immersive analytics setting, DashSpace supports visualizing legacy D3 and Vega-Lite visualizations on 2D planes as well as extruding Vega-Lite specifications into 2.5D. It also supports fully 3D visual representations using the Optomancy grammar. To facilitate authoring new visualizations in immersive XR, the platform provides a visual authoring mechanism where the user groups specification snippets to construct visualizations dynamically. The approach is fully persistent and collaborative, allowing multiple participants—whose presence is shown using 3D avatars and webcam feeds—to interact with the shared space synchronously, whether co-located or remote. We present three examples of DashSpace in action: immersive data analysis in 3D space, synchronous collaboration, and immersive data presentations.