GVU Research Showcase: Demos

Location: TSRB 225
People: Ashok Goel, Spencer Rugaber, Emily Weigel; Sungeun An, William Broniec, Preethi Sethumadhavan
Accelerating Scientific Discovery using AI technologies

Lab: Design & Intelligence Laboratory

Location: TSRB 225
People: Ashok Goel, David Joyner, Spencer Rugaber, Eric Gregori; Varsha Achar, Aditi Dutta, Rohit Mujumdar, Shravani Sistia
As part of the Jill Watson Suite of Tools for Education, Agent Smith enables anyone to build their own agents.

Lab: Design & Intelligence Laboratory

Location: TSRB TBD
People: Mark Riedl; Lara Martin, Prithviraj Ammanabrolu, Xinyu Wang, Richa Arora, Pradyumna Tambwekar
Improvisational storytelling involves one or more people interacting in real time to create a story without advance notice of topic or theme. Human improvisation occurs in an open world that can be in any state, and characters can perform any behaviors expressible through natural language. In this project, we strive toward the grand challenge of computational improvisational storytelling in open-world domains. The goal is to develop an intelligent agent that can sensibly co-create a story with one or more humans through natural language.

Lab: Entertainment Intelligence Lab

Location: TSRB 243
People: Dr. Thad Starner, Dr. Melody Jackson; Yuhui Zhao
Most high-speed, non-invasive BCI typing systems require intense visual attention and feedback. BrainBraille investigates a more open-loop approach similar to touch typing. BrainBraille enables communication at 20 characters per minute (cpm) by monitoring attempted movements in the motor cortex using functional Magnetic Resonance Imaging (fMRI). Users attempt to tense the muscles for six body parts: the hands, the feet, the tongue, and the gluteus maximus. Those actions activate the corresponding six regions of the motor cortex, which map to the six dots in a Braille cell.
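
To make the encoding concrete, here is a minimal sketch of the idea: six body regions stand in for the six dots of a Braille cell, and a pattern of attempted movements is looked up as a letter. The specific region-to-dot assignment and the small letter table are illustrative assumptions, not the lab's actual mapping.

```python
# Illustrative sketch only: the region-to-dot assignment below is an
# assumption for demonstration, not BrainBraille's actual mapping.

# Hypothetical assignment of the six body regions to Braille dots 1-6.
REGION_TO_DOT = {
    "left_hand": 1, "left_foot": 2, "tongue": 3,
    "right_hand": 4, "right_foot": 5, "glutes": 6,
}

# Standard Braille dot patterns for a few letters (a-e).
BRAILLE_LETTERS = {
    frozenset({1}): "a", frozenset({1, 2}): "b", frozenset({1, 4}): "c",
    frozenset({1, 4, 5}): "d", frozenset({1, 5}): "e",
}

def decode_cell(active_regions):
    """Map the set of regions the user attempted to tense to a letter."""
    dots = frozenset(REGION_TO_DOT[r] for r in active_regions)
    return BRAILLE_LETTERS.get(dots, "?")

# Tensing the left and right hands activates dots 1 and 4, i.e. the letter "c".
print(decode_cell({"left_hand", "right_hand"}))  # -> c
```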

Lab: BrainLab

Location: TSRB 220
People: Bruce Walker; Nadia Fereydooni
The fleet of semi-automated vehicles in the US is growing, and the occupants of many of these cars are frequently reading, working, watching movies, and even sleeping. To ensure safety and effective product design, we need to examine: (a) the impact that being in the car has on performing non-driving tasks; and (b) the impact that performing non-driving tasks has on the person’s ability to re-engage in managing the vehicle’s operation. We are currently studying the use of virtual reality in a vehicle and its effect on situation awareness and trust.

Lab: Sonification Lab

Location: TSRB 243
People: Thad Starner - thad@cc.gatech.edu, Peter Presti - peter.presti@imtc.gatech.edu, Scott Gilliland - scott.gilliland@gatech.edu; Daniel Kohlsdorf - dkohlsdorf6@gatech.edu, Celeste Mason - celeste.m@gatech.edu, Stewart Butler - stewart@ethosnet.net
CHAT (Cetacean Hearing Augmentation & Telemetry) is a wearable underwater computer system engineered to assist researchers in establishing two-way communication with dolphins. The project seeks to facilitate the study of marine mammal cognition by providing a waterproof mobile computing platform. An underwater speaker and keyboard enable the researchers to generate whistles. The system is equipped with a two-channel hydrophone array used for localization and for recognition of specific responses, which are translated into audio feedback.
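
As a rough illustration of how a two-channel array can localize a response, the sketch below estimates a bearing from the time difference of arrival between the hydrophones via cross-correlation. This is a generic textbook approach with assumed parameters (sample rate, hydrophone spacing, sound speed), not CHAT's actual signal-processing pipeline.

```python
# Generic TDOA bearing estimate for a two-channel hydrophone pair.
# All parameters are assumed values for illustration, not CHAT's real ones.
import numpy as np

FS = 96_000           # sample rate in Hz (assumed)
SPACING = 0.2         # hydrophone separation in meters (assumed)
SOUND_SPEED = 1500.0  # approximate speed of sound in seawater, m/s

def bearing_deg(left: np.ndarray, right: np.ndarray) -> float:
    """Angle of arrival from the lag that maximizes the cross-correlation."""
    corr = np.correlate(left, right, mode="full")
    lag = np.argmax(corr) - (len(right) - 1)   # lag in samples
    delay = lag / FS                           # seconds
    # Far-field model: delay = SPACING * sin(angle) / SOUND_SPEED
    s = np.clip(delay * SOUND_SPEED / SPACING, -1.0, 1.0)
    return float(np.degrees(np.arcsin(s)))

# Synthetic rising whistle arriving 5 samples earlier at the left hydrophone.
t = np.arange(0, 0.05, 1 / FS)
whistle = np.sin(2 * np.pi * (8000 * t + 4e4 * t**2))  # 8-12 kHz sweep
shift = 5
left = np.concatenate([whistle, np.zeros(shift)])
right = np.concatenate([np.zeros(shift), whistle])
print(round(bearing_deg(left, right), 1))  # estimated arrival angle in degrees
```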

Lab: Contextual Computing Group

Location: TSRB 339
People: Betsy DiSalvo; Marisol Wong-Villacres, Upol Ehsan, Amber Solomon, Mercedes Pozo Buil, Alfredo Vargas
Parents' engagement in their children's education is key to children's academic success and social development. For many parents in the U.S., engagement is still a struggle, partly due to a lack of communication and community-building tools that support the broader ecology of parenting, or parental ecology. Although current technologies have the potential to create opportunities to improve parental engagement, little is known about the impact of existing technology's design on the parental ecology.

Lab: Culture And Technology Lab (CAT)

Location: TSRB 243
People: Thad Starner; Cheryl Wang, Kshitish Deo, Aditya Vishwanath
CopyCat and PopSign are two games that help deaf children and their parents acquire language skills in American Sign Language. Ninety-five percent of deaf children are born to hearing parents, and most of those parents never learn enough sign language to teach their children. Because short-term memory skills are learned through acquiring a language, many deaf children enter school with a short-term memory span of fewer than three items, much lower than that of hearing children of hearing parents or Deaf children of Deaf parents. Our systems address this problem directly.

Lab: Contextual Computing Group

Location: TSRB, near 328
People: Neha Kumar; Savanthi Murthy
Examining the Data Security and Privacy (DSP) practices of Indian older adults with digital footprints

Lab: TanDEm

Location: TSRB TBD
People: John Stasko; Eyitemi Moju-Igbene, Darsh Thakkar
For a college student, finding an internship or a post-graduation job can be a daunting task. For a novice new to industry, or even someone with experience, the job search tends to be a black-box experience: deciding where, when, and how to apply is a process that could benefit from improvement. Career services and fellow peers are great data resources for the job search; however, this data is often either not accessible or not in a form that is easy to digest. On the flip side, career professionals such as career services staff and program administrators constantly work with students to find opportunities.

Lab: Information Interfaces Group

Location: TSRB TBD
People: Wei Wang; Yixin Duan
This project aimed to help drivers of self-driving vehicles understand how the automation works in different driving conditions. Investment in educational resources can encourage the appropriate use of automated vehicle systems and thus create a safe driving culture and environment. Instruction method and training content were investigated in this study. In-car interactive instructions were constructed to provide knowledge of underlying principles and feature usage.

Lab: DesigNext Lab

Location: TSRB 225
People: Ashok Goel, David Joyner, Spencer Rugaber; Ida Camacho, Marissa Gonzales, Eric Gregori
It has been said that Jill Watson is the most famous teaching assistant in the world. Jill's origins are actually quite humble. She was conceived in summer 2015 to help Georgia Tech's Online MS in CS (OMSCS) program, and specifically my online course on knowledge-based artificial intelligence (KBAI), offered as part of the OMSCS program (http://www.omscs.gatech.edu/cs-7637-knowledge-based-artificial-intellige...). Jill had a very difficult birth in fall 2015. Jill was quite precocious almost from the beginning.

Lab: Design & Intelligence Laboratory

Location: TSRB 220
Research Areas: Human-Computer Interaction
People: Bruce Walker; Brittany Holthausen
To date, trust in automation has only been measured generally; however, types of trust in automation have been proposed in the literature. Situational trust highlights the impact of contextual differences on trust development, as well as on how much trust influences behavioral outcomes. This is the first measure developed for situational trust in automation, and for any specific type of trust in automation.

Lab: Sonification Lab

Location: TSRB HCI Lounge
People: Dr. Carl DiSalvo; Tanuja Sawant
Exploring Trust for Financial Transactions in Human-Smart Speaker Interactions

Lab: GVU Affiliate Projects (No Lab)

Location: TSRB 243
People: Melody Jackson, Thad Starner, Clint Zeagler, Scott Gilliland; Giancarlo Valentin, Larry Freil, Ceara Byrne
The FIDO Sensors team is creating wearable technology to allow working dogs to communicate. Assistance dogs can tell their owners with hearing impairments what sounds they have heard; guide dogs can tell their owners if there is something in their path that must be avoided. We will be demonstrating a variety of wearable sensors designed for dogs to activate.

Lab: Animal-Computer Interaction Lab

Location: TSRB TBD
People: Dr. Carrie Bruce, Dr. Courtney Crooks; Su Fang
Co-creating News Experiences and Reconstructing Context

Lab: GVU Affiliate Projects (No Lab)

Location: TSRB TBD
Research Areas: Human-Computer Interaction
People: Dr. Richard Henneman; Aparna Ramesh, Ruoxue Zhang
With veganism/vegetarianism gaining popularity, more individuals are adopting these diets for health and environmental reasons. This gives rise to several questions: 1. How healthy are the alternative diets of veganism/vegetarianism? 2. Do they pose any risk if practiced without sufficient knowledge and customization? 3. What are some issues vegans/vegetarians face? How can these be better addressed? For our Master’s project, we explored the area of veganism/vegetarianism within the scope of the above questions to conceptualize a solution that best addresses their needs.

Lab: MS-HCI Project Lab

Location: TSRB 209
People: Dr. Anne Pollock, Dr. Nassim Parvin, and Dr. Lewis Wheaton; Christina Bui, Thanawit Prasongpongchai, Aditya Anupam, Charles Denton, Shubhangi Gupta, Olivia Cox
Heart Sense takes biometric data from participants and produces captivating visualizations as their bodies react to visual stimuli.

Lab: Design and Social Interaction Studio

Location: TSRB HCI Lounge
People: Dr. David Joyner; Suyash Thakare
-

Lab: MS-HCI Project Lab

Location: TSRB 243
People: Melody Jackson and Thad Starner; Larry Freil, Ceara Byrne
Detecting EEG in the ear

Lab: BrainLab

Location: TSRB HCI Lounge (If possible)
Research Areas: Human-Computer Interaction
People: Jon Sanford; Miyeon Bae
Past research shows that Georgia Tech’s student population suffers from food insecurity and related anxieties. Part of the problem is the lack of a streamlined, accessible information system that effectively presents the options and resources that are available. In close collaboration with dietitian professionals at Georgia Tech, this project serves as an improved tool that will help students make better-informed dining choices based on their needs, preferences, and health concerns.

Lab: GVU Affiliate Projects (No Lab)

Location: TSRB 222
People: Bruce Walker, Jeff Wilson; Phillip Roberts, Lusenii Kromah
The System for Wearable Audio Navigation (SWAN) serves as a navigation and orientation aid for persons temporarily or permanently visually impaired. SWAN is in the early stages of a software rewrite and technology upgrade. Interaction techniques are being prototyped in Virtual Reality (VR) to support preliminary user studies of new features.

Lab: Sonification Lab

Location: TSRB 209
People: Nassim Parvin; Shubhangi Gupta
With an increase in families living away from their elderly parents, technology companies are offering AI-based monitoring technologies as a cost-effective solution for looking after the growing population of the elderly. I critique the design and use of such technologies: they aim to respond to any emergency 24/7 while disregarding the contextual nature of care and the well-being needed for a better quality of life for the elderly. I argue for a situated approach to care, with a focus on ‘interdependence’, as demanded by the needs of the elderly and the community in which they live.

Lab: Design and Social Interaction Studio

Location: TSRB 325
People: Brian Magerko; Duri Long, Swar Gujrania, Lucas Liu, Cassandra Naomi, Meha Kumar, Jonathan Moon
LuminAI is an interactive art installation that explores the improvisation of proto-narrative movement between humans and virtual AI agents using full-body, expressive, movement-based interaction. Interactors can co-create movement with an autonomous virtual agent that learns movement, response, and improvisation directly from interacting with human teachers. The system analyzes interactors' movement using Viewpoints movement theory.

Lab: Expressive Machinery Lab (formerly ADAM Lab)

Location: TSRB 222
People: Bruce Walker; Brianna Tomlinson, Mike Winters, Chris Latina, Smruthi Bhat, Milap Rane
Students in the Sonification Lab and Center for Music Technology designed Solar System Sonification, an auditory experience of the planets. Using non-speech audio to convey information, they built a musical model of the solar system. Planetariums typically rely on visuals with various levels of speech description, but have not explored using auditory cues to present information about space. Auditory displays, like the ones developed for Solar System Sonification, enable more immersive experiences and make information accessible to people with visual impairments.
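
To give a flavor of how such a mapping might work, the snippet below maps one planetary attribute (orbital distance) onto pitch, the basic parameter-mapping idea behind many sonifications. The particular attribute, ranges, and direction of the mapping are assumptions for illustration, not the project's actual design.

```python
# Illustrative parameter-mapping sonification: orbital distance -> pitch.
# The mapping choices here are assumptions, not the lab's actual design.
import math

# Approximate orbital distances in astronomical units (AU).
PLANETS = {
    "Mercury": 0.39, "Venus": 0.72, "Earth": 1.0, "Mars": 1.52,
    "Jupiter": 5.2, "Saturn": 9.58, "Uranus": 19.2, "Neptune": 30.1,
}

def distance_to_pitch(distance_au, low_hz=110.0, high_hz=880.0):
    """Map log orbital distance onto a frequency range: nearer planets sound
    higher, farther planets sound lower (one possible convention)."""
    d_min = math.log(min(PLANETS.values()))
    d_max = math.log(max(PLANETS.values()))
    frac = (math.log(distance_au) - d_min) / (d_max - d_min)
    return high_hz - frac * (high_hz - low_hz)

for name, dist in PLANETS.items():
    print(f"{name:8s} -> {distance_to_pitch(dist):6.1f} Hz")
```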

Lab: Sonification Lab

Location: TSRB 222
People: Bruce Walker; Rachel Stuck
Human-Robot Interaction: Risk and Trust

Lab: Sonification Lab

Location: TSRB 243
People: Melody Moore Jackson, Thad Starner; Ceara Byrne
Instrumented Dog Toys

Lab: Animal-Computer Interaction Lab

Location: TSRB TBD
People: Courtney Garvin; Xuetong Wang, Shizhong Hu
Redesigning Receipts Through a Balanced Study of HCI and Design

Lab: MS-HCI Project Lab

Location: TSRB 309
People: Maribeth Gandy Coleman, Laura Levy, Paul M.A. Baker; Siyan Zhou, Xinhui Yang
Despite the potential of AR, practitioners must overcome many issues to implement AR systems successfully in educational settings, ranging from technological to pedagogical to learning issues. Within the context of training and instructional applications, AR practitioners often struggle to achieve their desired outcomes due to the lack of generalizable results from existing research on AR/MR/VR deployments for real-world training. Moreover, there is no established conceptual framework for Synthetic Learning Environment (SLE) design options.

Lab: Interactive Media Technology Center (IMTC)

Location: TSRB TBD
People: Betsy DiSalvo; Yi He, Xuejin Tan
Preparing for a CS technical interview involves self-regulated learning activities such as goal setting and systematic knowledge review. Existing platforms typically support massive problem practice with little reflective affordance; notably, this approach can incur extraneous cognitive load when the interview is imminent. Since note-taking is a learning strategy with prominent cognitive benefits in deeper processing and its external storage function, we design a lightweight tool that converts personal notes into self-testing options to enable an effective but less stressful preparation method.

Lab: Culture And Technology Lab (CAT)

Location: TSRB 222
People: Bruce Walker; Brittany Noah, Thomas Gable
Automated safety systems, a first step toward autonomous vehicles, are already available in many commercial vehicles. These include adaptive cruise control, which can slow the vehicle in response to traffic, and automatic lane keeping, which maintains position within a lane without driver intervention. To ensure that drivers use these systems properly, it is essential that they understand and appropriately trust the technology.

Lab: Sonification Lab

Location: TSRB 209
People: Dr. Nassim Parvin; Pragati Singh
Abstracting the city's gaze using critical design

Lab: Design and Social Interaction Studio

Location: TSRB 113
People: Anne Sullivan; Anna Malecki
Investigating and designing for the social and cultural work that book clubs carry into their online counterparts.

Lab: MS-HCI Project Lab

Location: TSRB 338
People: Amy Bruckman; Julia Deeb-Swihart
This project focuses on building technology for law enforcement working on human trafficking cases. We leverage available data to build tools that help law enforcement identify potential victims and collaborate with partners to best intervene in these cases. 

Lab: Electronic Learning Communities

Location: TSRB 334
People: Alex Endert; Emily Wall
People increasingly rely on visual representations of information to explore and make sense of data. However, people have inherent biases that often lead to errors and inefficiencies in the decision making process. The goal of our research is to help people make better decisions while exploring and analyzing data using visualizations. We introduce computational methods for quantifying an analyst’s biases based on their interactions in the visualization. Using that information, we illustrate ways to modify or design new visualization systems that mitigate biased decision making.
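
To give a sense of what an interaction-based bias metric can look like, the sketch below compares how often an analyst interacts with each category of data against that category's share of the dataset; a large gap flags potentially biased attention. This is an illustrative metric with made-up data, not the lab's published formulation.

```python
# Illustrative interaction-coverage bias metric (assumed formulation).
from collections import Counter

def coverage_bias(data_categories, interacted_categories):
    """Return per-category bias: interaction share minus data share.
    Positive values mean a category gets more attention than its prevalence."""
    data_share = Counter(data_categories)
    inter_share = Counter(interacted_categories)
    n_data, n_inter = len(data_categories), len(interacted_categories)
    return {
        c: inter_share[c] / n_inter - data_share[c] / n_data
        for c in data_share
    }

# Hypothetical example: the dataset is half A and half B, but the analyst's
# clicks concentrate on A.
data = ["A"] * 50 + ["B"] * 50
clicks = ["A"] * 18 + ["B"] * 2
print(coverage_bias(data, clicks))  # A over-attended, B under-attended
```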

Lab: Visual Analytics Lab

Location: TSRB 325
People: Brian Magerko, Jason Freeman; Sara Milkes, Chloe Choy
XyloCode is an interactive museum experience to explore programming concepts in a collaborative and musical way.

Lab: Expressive Machinery Lab (formerly ADAM Lab)

Location: TSRB TBD
People: Dr. Rosa Arriaga; Matt Golino, Rachel Feinberg
Meditation has entered the digital sphere through apps, podcasts, and online videos. There have been attempts to use virtual reality, but these have focused on single session experiences. We took a user-centered approach to design and implement a VR system to teach people meditation and equip them with skills that can be used beyond the virtual environment. Our research focused on understanding how beginners learn meditation and how their needs could be met in VR. We used this data to create a multi-lesson VR meditation prototype which we then evaluated in a longitudinal lab-based deployment.

Lab: GVU Affiliate Projects (No Lab)