CrowdTrace: Visualizing Provenance in Distributed Sensemaking
Tianyi Li, Yasmine Belghith, Chris North, Kurt Luther
Abstract
Capturing analytic provenance is important for refining sensemaking analyses, but understanding that provenance can be difficult. First, making sense of the reasoning in intermediate steps is time-consuming; in distributed sensemaking especially, the provenance is less cohesive because each analyst sees only a small portion of the data without an understanding of the overall collaboration workflow. Second, analysis errors in one step can propagate to later steps, and in exploratory sensemaking it is difficult even to define what an error is, since there are no correct answers to reference. In this paper, we explore provenance analysis for distributed sensemaking in the context of crowdsourcing, where distributed analysis contributions are captured in microtasks. We propose crowd auditing as a way to help individual analysts visualize and trace provenance to debug distributed sensemaking. To evaluate this concept, we implemented a crowd auditing tool, CrowdTrace. Our user study demonstrates that CrowdTrace offers an effective mechanism for auditing and refining multi-step crowd sensemaking.
Tianyi Li, Yasmine Belghith, Chris North, Kurt Luther: CrowdTrace: Visualizing Provenance in Distributed Sensemaking. IEEE VIS (Short Papers) 2020: 191-195
Publication Details
- Date of publication: February 1, 2021
- Conference: Visualization Conference (VIS)
- Page number(s): 191-195