Title: Nonverbal Communication in Socially Assistive Human-Robot Interaction
Advisor: Brian Scassellati
Other committee members:
Greg Trafton (NRL)
Abstract: Robotics has already improved lives by taking over dull, dirty, and dangerous jobs, freeing people for safer, more skillful pursuits. For instance, autonomous mechanical arms weld cars in factories, and autonomous vacuum cleaners keep floors clean in millions of homes. Robots also help users on a more personal level through direct human-robot interactions. Socially assistive robots help human users through social interaction. Examples include robot tutors that provide students with personalized lessons to augment their classroom time, robot therapy assistants that help mediate social interactions between children with developmental disorders and adult therapists, and robot caretakers that assist elderly or disabled people in their homes.
To succeed as social assistants, socially assistive robots must be capable of natural communication. For people, natural communication includes both verbal communication (speech) and nonverbal communication (eye gaze, gestures, and other behaviors). My research focuses on enabling human-robot communication by understanding and generating nonverbal behavior for socially assistive robots. This dissertation investigates how socially assistive robots can use eye gaze and other nonverbal behaviors to improve their interactions with people.
In this dissertation, I detail a series of human-robot interaction studies that draw out human responses to specific robot nonverbal behaviors. These carefully controlled laboratory-based studies investigate robot eye gaze and other nonverbal behaviors in various scenarios: how robot eye gaze compares to human eye gaze in eliciting reflexive attention shifts from human viewers; how different features of robot gaze behavior promote the perception of a robot’s attention toward a viewer; whether people use robot eye gaze to support verbal object references, and how they resolve conflicts in this multimodal communication; and what role eye gaze and gesture play in guiding behavior during human-robot collaboration.
Based on this understanding of nonverbal behavior in HRI, I develop a set of models for understanding and generating nonverbal behavior in human-robot interactions. The first model is trained on examples from human-human behavior during tutoring. This model can understand the context of a communication from the nonverbal behaviors displayed, as well as suggest appropriate nonverbal behaviors to support a desired communication. The second model is purely a behavior generation model for robots. It can be flexibly applied to a range of scenes and robots, and though it is based on a psychological understanding of human visual processing, it does not need to be trained on examples of human behavior. Finally, I show that this second model performs well in a naturalistic human-robot collaborative interaction.
Developing effective nonverbal communication for robots engages a number of disciplines including autonomous control, machine learning, computer vision, design, and cognitive psychology. This dissertation contributes across all of these disciplines, providing a greater understanding of how robot nonverbal behavior impacts interactions between people and socially assistive robots.
Dissertation draft (frequently updated): https://www.dropbox.com/s/dc38j1nlg19ze2j/admoni_thesis.pdf?dl=0