Semantic Explanation of Interactive Dimensionality Reduction
Chris North, Yali Bian
Abstract
Interactive dimensionality reduction helps analysts explore high-dimensional data according to their personal needs and domain-specific problems. Recently, expressive nonlinear models have been employed to support these tasks. However, the interpretation of these human-steered nonlinear models during human-in-the-loop analysis has not been explored. To address this problem, we present a new visual explanation design called semantic explanation. Semantic explanation visualizes model behaviors in a manner similar to users’ direct projection manipulations. This design conforms to the spatial analytic process and enables analysts to better understand how the model updates in response to their interactions. We propose a pipeline that empowers interactive dimensionality reduction with semantic explanation using counterfactuals. Based on this pipeline, we implement a visual text analytics system whose nonlinear dimensionality reduction is powered by deep learning via the BERT model. We demonstrate the efficacy of semantic explanation with two case studies: academic article exploration and intelligence analysis.
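The sketch below illustrates only the embedding-and-projection step the abstract describes: documents are encoded with a BERT-based model and reduced to a 2D layout with a nonlinear method. The library choices (sentence-transformers, UMAP), the model name, and all parameters are illustrative assumptions, not the system described in the paper.

```python
# Hypothetical sketch: BERT-style embeddings + nonlinear dimensionality reduction.
# Library and parameter choices are assumptions for illustration only.
from sentence_transformers import SentenceTransformer
import umap

docs = [
    "Interactive dimensionality reduction for text analytics.",
    "Counterfactual explanations for machine learning models.",
    "Deep contextual embeddings from a BERT model.",
]

# 1) High-dimensional semantic representation via a BERT-based encoder.
encoder = SentenceTransformer("all-MiniLM-L6-v2")
embeddings = encoder.encode(docs)            # shape: (n_docs, embedding_dim)

# 2) Nonlinear projection to a 2D layout that an analyst could then manipulate.
reducer = umap.UMAP(n_components=2, n_neighbors=2, random_state=42)
layout = reducer.fit_transform(embeddings)   # shape: (n_docs, 2)

print(layout)
```

In an interactive setting, the analyst's drag operations on this layout would steer an updated projection; the paper's semantic explanation then presents the updated model's behavior in terms of those same projection manipulations.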
Y. Bian, C. North, E. Krokos, and S. Joseph, “Semantic Explanation of Interactive Dimensionality Reduction,” in 2021 IEEE Visualization Conference (VIS), 2021, pp. 26–30.
Publication Details
- Date of publication: October 24, 2021
- Conference: IEEE Visualization Conference (VIS)
- Page number(s): 26–30