Paper Title

Towards Natural Language-Based Visualization Authoring

Authors

Yun Wang, Zhitao Hou, Leixian Shen, Tongshuang Wu, Jiaqi Wang, He Huang, Haidong Zhang, Dongmei Zhang

Abstract

A key challenge to visualization authoring is the process of getting familiar with the complex user interfaces of authoring tools. Natural Language Interfaces (NLIs) present promising benefits due to their learnability and usability. However, supporting NLIs for authoring tools requires expertise in natural language processing, while existing NLIs are mostly designed for visual analytics workflows. In this paper, we propose an authoring-oriented NLI pipeline by introducing a structured representation of users' visualization editing intents, called editing actions, based on a formative study and an extensive survey of visualization construction tools. The editing actions are executable and thus, as an intermediate layer, decouple natural language interpretation from visualization applications. We implement a deep learning-based NL interpreter to translate NL utterances into editing actions. The interpreter is reusable and extensible across authoring tools; an authoring tool only needs to map the editing actions into tool-specific operations. To illustrate the usage of the NL interpreter, we implement an Excel chart editor and a proof-of-concept authoring tool, VisTalk. We conduct a user study with VisTalk to understand the usage patterns of NL-based authoring systems. Finally, we discuss observations on how users author charts with natural language, as well as implications for future research.
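The pipeline described above hinges on editing actions as an intermediate representation between the NL interpreter and individual authoring tools. The sketch below illustrates what such a structured representation and its tool-specific mapping might look like; the class names, fields, and the `interpret` / `apply_to_tool` functions are illustrative assumptions for explanation only, not the paper's actual schema or implementation.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical structured representation of a visualization editing intent.
# The field names are assumptions; the paper defines its own editing-action schema.
@dataclass
class EditingAction:
    target: str                # chart element to edit, e.g. "title", "x-axis", "bar"
    operation: str             # e.g. "set", "add", "remove", "highlight"
    property: Optional[str]    # e.g. "text", "color", "font-size"
    value: Optional[str]       # e.g. "Sales by Region", "red"

def interpret(utterance: str) -> EditingAction:
    """Stand-in for the learning-based NL interpreter: maps an utterance to an
    editing action. A real interpreter would use a trained model, not rules."""
    # Toy rule for illustration only.
    if "title" in utterance and "to" in utterance:
        text = utterance.rsplit("to", 1)[1].strip().strip('"')
        return EditingAction(target="title", operation="set",
                             property="text", value=text)
    raise ValueError(f"Cannot interpret: {utterance}")

def apply_to_tool(action: EditingAction) -> dict:
    """Tool-specific adapter: each authoring tool only needs to map editing
    actions to its own operations (here, a made-up command dictionary)."""
    return {"command": f"{action.operation}_{action.target}_{action.property}",
            "argument": action.value}

if __name__ == "__main__":
    action = interpret('change the title to "Sales by Region"')
    print(apply_to_tool(action))
```

Because the editing actions are executable and tool-agnostic, the same interpreter output could in principle drive different adapters (e.g., one for an Excel chart editor and one for a prototype like VisTalk), which is the decoupling the abstract describes.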
