Michelle Dowling, Nathan Wycoff, Brian Mayer, John Wenskovitch, Scotland C. Leman, Leanna L. House, Chris North
Analysts face steep challenges when performing sensemaking tasks on collections of textual information too large to be reasonably analyzed without computational assistance. To scale up such sensemaking tasks, new methods are needed to interactively integrate human cognitive sensemaking activity with machine learning. Toward that goal, we offer a human-in-the-loop computational model that mirrors the human sensemaking process and consists of foraging and synthesis sub-processes. We model the synthesis loop as an interactive spatial projection and the foraging loop as an interactive relevance ranking combined with topic modeling. We combine these two components of the sensemaking process using semantic interaction, such that the human's spatial synthesis actions are transformed into automated foraging and synthesis of new relevant information. Ultimately, the model's ability to forage as a result of the analyst's synthesis activities makes interacting with big text data easier and more efficient, thereby facilitating analysts' sensemaking. We discuss the interaction design and theory behind our interactive sensemaking model. The model is embodied in a novel visual analytics prototype called Cosmos, in which analysts synthesize structure within the larger corpus by directly interacting with a reduced-dimensionality space to express relationships on a subset of data. We then demonstrate how Cosmos supports sensemaking tasks with a realistic scenario that investigates the effect of natural disasters in Adelaide, Australia in September 2016, using a database of over 30,000 news articles.
- Date of publication: April 25, 2019
- Venue: Big Data Research
- Page number(s): 49-58
- Publication note:
Michelle Dowling, Nathan Wycoff, Brian Mayer, John E. Wenskovitch, Scotland Leman, Leanna House, Nicholas F. Polys, Chris North, Peter Hauck: Interactive Visual Analytics for Sensemaking with Big Text. Big Data Res. 16: 49-58 (2019)