DashSpace
“The fundamental design principle of DashSpace is to provide a standards-based open web development platform for building IA/UA software for use with both handheld and head-mounted XR displays.”
Abstract
We introduce DashSpace, a live collaborative immersive and ubiquitous analytics (IA/UA) platform designed for handheld and head-mounted Augmented/Extended Reality (AR/XR), implemented using WebXR and open standards. To bridge the gap between existing web-based visualizations and the immersive analytics setting, DashSpace supports rendering legacy D3 and Vega-Lite visualizations on 2D planes, as well as extruding Vega-Lite specifications into 2.5D. It also supports fully 3D visual representations using the Optomancy grammar. To facilitate authoring new visualizations in immersive XR, the platform provides a visual authoring mechanism where the user groups specification snippets to construct visualizations dynamically. The approach is fully persistent and collaborative, allowing multiple participants—whose presence is shown using 3D avatars and webcam feeds—to interact with the shared space synchronously, whether co-located or remote. We present three examples of DashSpace in action: immersive data analysis in 3D space, synchronous collaboration, and immersive data presentations.
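To give a sense of the kind of legacy specification the platform ingests, a standard Vega-Lite spec like the one below (our own minimal illustration, not an example from the paper) could be rendered on a 2D plane in the immersive space; the extrusion into 2.5D described above would then lift such a chart's marks into depth.

```json
{
  "$schema": "https://vega.github.io/schema/vega-lite/v5.json",
  "description": "A minimal bar chart of category counts.",
  "data": {
    "values": [
      {"category": "A", "count": 28},
      {"category": "B", "count": 55},
      {"category": "C", "count": 43}
    ]
  },
  "mark": "bar",
  "encoding": {
    "x": {"field": "category", "type": "nominal"},
    "y": {"field": "count", "type": "quantitative"}
  }
}
```

Because Vega-Lite specs are declarative JSON, visualizations authored for conventional web dashboards can, in principle, be carried into the immersive setting without rewriting them.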
Publication
Marcel Borowski, Peter W. S. Butcher, Janus Bager Kristensen, Jonas Oxenbøll Petersen, Panagiotis D. Ritsos, Clemens N. Klokmose, and Niklas Elmqvist. 2025. DashSpace: A Live Collaborative Platform for Immersive and Ubiquitous Analytics. In IEEE Transactions on Visualization and Computer Graphics (TVCG). IEEE Computer Society. DOI: 10.1109/TVCG.2025.3537679.