Distributed Neural Representation for Reactive in situ Visualization
 Qi Wu
 Joseph A. Insley
 Victor A. Mateevitsi
 Silvio Rizzi
 Michael E. Papka
 Kwan-Liu Ma

 DOI: 10.1109/TVCG.2024.3432710
Room: Hall E2
Keywords
Data visualization, Data models, Computational modeling, Training, Adaptation models, Neural networks, Programming
Abstract
Implicit neural representations (INRs) have emerged as a powerful tool for compressing large-scale volume data, opening up new possibilities for in situ visualization. However, the efficient application of INRs to distributed data remains underexplored. In this work, we develop a distributed volumetric neural representation and optimize it for in situ visualization. Our technique eliminates data exchanges between processes, achieving state-of-the-art compression speed, quality, and ratios. It also enables an efficient strategy for caching large-scale simulation data at high temporal frequencies, further facilitating the use of reactive in situ visualization across a wider range of scientific problems. We integrate this system with the Ascent infrastructure and evaluate its performance and usability using real-world simulations.