Eye-tracking for tailored autonomy

April 8, 2025

The eye-tracking glasses, made by Pupil Labs, allow researchers to precisely monitor where subjects look when encountering autonomous systems, helping create more personalized safety parameters.

Adapted from story by Amy Sprague / UW A&A; Photos by Dennis Wise / University of Washington

The future of building trustworthy autonomous systems may lie in wearing glasses. A&A Assistant Professor Karen Leung, with co-Principal Investigator Anat Caspi, director of the Allen School’s Taskar Center for Accessible Technology, has received a $300,000 National Science Foundation grant to explore how specialized eyeglasses could help autonomous vehicles and robots better understand and adapt to human comfort levels. Undergraduate researchers Senna Keesing (A&A), Marc Alwan (CSE) and Kyshawn Warren (UW ECE) are carrying out the research in Leung’s Control and Trustworthy Robotics Lab (CTRL).

This work stems from a simple observation: people aren’t identical in their comfort levels around autonomous systems. “I’ve watched how people interact with autonomous systems in their daily lives,” Leung shares. “What makes one person perfectly comfortable might make another quite nervous. We need to bridge this gap.”

The research team’s approach involves specialized eyeglasses that observe how individuals scan their environment. These insights help autonomous systems understand each person’s unique safety preferences and adapt accordingly. Picture an autonomous wheelchair that learns whether its user prefers to give other pedestrians a wide berth or is comfortable with closer encounters – all while maintaining core safety standards.
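The idea of adapting to individual comfort while keeping a non-negotiable safety floor can be illustrated with a minimal sketch. This is a hypothetical toy example, not the team's actual implementation: all names, numbers, and the interpolation scheme are illustrative assumptions.

```python
# Hypothetical sketch: a personalized clearance margin for an autonomous
# wheelchair, scaled by an observed comfort level but clipped to a hard
# safety floor. Names and numbers are illustrative assumptions.

HARD_MIN_CLEARANCE_M = 0.5  # non-negotiable safety floor (assumed value)

def personalized_clearance(base_clearance_m: float, comfort_score: float) -> float:
    """Scale the planner's pedestrian clearance by a learned comfort score.

    comfort_score in [0, 1]: 0 = user prefers a wide berth,
    1 = user is comfortable with close encounters.
    """
    if not 0.0 <= comfort_score <= 1.0:
        raise ValueError("comfort_score must be in [0, 1]")
    # Interpolate between a cautious 2x margin (nervous user) and the
    # base margin (comfortable user), then enforce the hard minimum.
    scaled = base_clearance_m * (2.0 - comfort_score)
    return max(scaled, HARD_MIN_CLEARANCE_M)

print(personalized_clearance(1.0, 0.0))  # cautious user -> 2.0 m
print(personalized_clearance(1.0, 1.0))  # comfortable user -> 1.0 m
```

The key design point the sketch captures is that preference only modulates behavior above the hard floor: no comfort score can push the clearance below the core safety standard.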

UW ECE student Kyshawn Warren (left) and UW students Senna Keesing and Marc Alwan pose with the specialized equipment used to study human-robot interactions. The sensor-equipped helmets track movements and speed, while eye-tracking glasses monitor gaze patterns. Top right: Marc Alwan models the eye-tracking glasses. Bottom right: The customized hard hats with the Lab’s logo have sensors mounted on top that track movement.

The research tackles a crucial challenge in autonomous mobility: earning public trust. Traditional autonomous systems operate with fixed safety parameters, potentially making some users uncomfortable while frustrating others with overcautious behavior. Leung’s team aims to create more nuanced systems that can recognize and respond to individual comfort levels.

Beyond wheelchairs, this research could transform how delivery robots navigate college campuses or how autonomous vehicles interact with pedestrians in urban environments. The project combines advances in computer vision, human behavior understanding, and adaptive control systems.

The NSF grant, jointly supported by the Dynamics, Controls, and System Diagnostics and Mind, Machine, and Motor Nexus Programs, underscores the project’s interdisciplinary significance. Leung’s team is particularly focused on including diverse perspectives in their research, actively engaging underrepresented groups in robotics and fostering collaboration between computer vision, controls, and robotics researchers.

Karen Leung, A&A Assistant Professor and Anat Caspi, Director of the Allen School’s Taskar Center for Accessible Technology

“We’re not just developing technology. We’re working to create autonomous systems that truly understand and respect human preferences. That’s the key to building trust.”

— Karen Leung, A&A Assistant Professor

Kyshawn Warren models the eye-tracking glasses, which register real-time gaze data on a connected smartphone. Warren is a fourth-year undergraduate in the UW ECE Combined BS-MS program, with research interests in computer vision for robotics as well as computing topics including embedded systems and ASIC design.

Below, Kyshawn Warren monitors a demo of the cameras and accompanying data collection. Warren’s work on the project centers on using the scene images and gaze locations to identify which objects in a person’s view they consider safety-critical for navigation, and then tracking those objects for as long as they remain in view.
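The first step of that pipeline, associating a gaze point with an object in the scene, can be sketched in a few lines. This is a hypothetical illustration under assumed data structures (a 2D gaze coordinate from the glasses plus bounding boxes from a scene-camera detector), not the project's actual code.

```python
# Hypothetical sketch of gaze-to-object association: given a 2D gaze point
# from the eye-tracking glasses and object bounding boxes from a scene
# camera, pick the object the wearer is fixating on. All names and data
# structures are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

    def center_distance(self, x: float, y: float) -> float:
        cx = (self.x_min + self.x_max) / 2.0
        cy = (self.y_min + self.y_max) / 2.0
        return ((cx - x) ** 2 + (cy - y) ** 2) ** 0.5

def attended_object(gaze_xy, detections):
    """Return the detection containing the gaze point; when boxes overlap,
    the one whose center is nearest wins. None if no box contains it."""
    x, y = gaze_xy
    hits = [d for d in detections if d.contains(x, y)]
    if not hits:
        return None
    return min(hits, key=lambda d: d.center_distance(x, y))

scene = [Detection("pedestrian", 100, 50, 180, 220),
         Detection("doorway", 300, 0, 420, 240)]
print(attended_object((150, 120), scene).label)  # -> pedestrian
```

Once an object is associated with the gaze in one frame, tracking it across subsequent frames is what lets the system infer which parts of the environment a particular person treats as safety-critical.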

“This research has been an eye-opening experience that has given me much insight into how much our brain does subconsciously and how we can visualize these things in a way that computers can learn from and apply for autonomous systems,” says Warren. “Moving forward, my research lab and I will be working on implementing what we have learned, in addition to a path-planning algorithm, onto an autonomous system such as a wheelchair so that there can be autonomous navigation with human preference in mind.”