Data visualization systems have predominantly been developed for WIMP-based direct manipulation interfaces. Only recently have other forms of interaction, such as natural language and touch, begun to appear, and they typically operate in isolation rather than in concert. Prior evaluations of natural language interfaces for visualization suggest that combining direct manipulation and natural language as complementary interaction techniques holds real value, yet little work has explored such multimodal visualization interfaces. Orko is a prototype visualization system we created that lets people explore data through combined speech- and touch-based multimodal interaction. With Orko, we aim to bridge the gap between existing direct manipulation interfaces and envisioned natural user interfaces for data analysis with visualization.
The Information Interfaces Lab develops computing technologies that help people take advantage of information to enrich their lives. The group designs ways to help people understand information through user interface design, information visualization, peripheral awareness techniques, and embodied agents. Its goal is to help people make better judgments by learning from all the information available to them.