
Windows Into the Brain

In the User Experience and Decision Making Laboratory, researchers are using eye-tracking technology to help identify anxiety and chronic pain.

From left, Professor Soussan Djamasbi with PhD Students Ashwin Sukumar, Doaa Alrefaei, and Gaayathri Sankar outside the User Experience and Decision Making Laboratory.


The old saying, often attributed to Shakespeare, holds that the eyes are the windows to the soul. But for researchers who want to know how we take in and process information, how we make decisions, and how we cope with taxing conditions like anxiety and chronic pain, the eyes can be windows into our brains.

They theorize that by observing how we move our eyes and what catches our attention, it’s possible to gain objective insights into our cognitive processes—knowledge that is more difficult, if not impossible, to obtain in any other way.

Technology to track eye movements is the centerpiece of the User Experience and Decision Making (UXDM) Laboratory, a research center established in 2012 by Professor Soussan Djamasbi of The Business School. With collaborators across the university and around the world, the lab has become a leader in using eye-tracking and other user-experience tools to study a wide range of subjects that involve how we think and what we think about.

Tracking our attention

Djamasbi joined the WPI faculty in 2004, shortly after earning her PhD in communications and information science at the University of Hawaii at Manoa. Early on, she advised a series of undergraduate Major Qualifying Projects sponsored by Fidelity Investments. Co-advised by the late Thomas Tullis, then vice president for user experience at Fidelity, the projects used eye-tracking systems made by Tobii, a pioneering company in the field, to understand how users navigated web pages for various Fidelity products and services, and how that content could be made more effective.

“When I give workshops,” Djamasbi says, “I show eye-tracking examples from that time, and everyone is still amazed at the insight that we were able to uncover in these projects. Even though eye tracking has been around for a while, people still don’t understand or are not aware of its power in user-experience research.”

To appreciate the power of eye tracking, it's first necessary to understand the basics of human vision. Like cameras, our eyes have lenses that project an image of whatever we are looking at onto the retina, an array of light-sensitive cells in the back of the eyeball. But unlike the photosensitive material in film or a digital camera's CCD (charge-coupled device), the photoreceptor cells in the retina are not evenly distributed. A small depression in the center of the retina, called the fovea, has a high concentration of cones, the cells that allow us to see color and fine detail.

Because the fovea provides this extra acuity, we continually move our eyes to bring points of interest onto it. In fact, unbeknownst to us, our eyes are in near-constant motion, focusing on one detail and then quickly darting to another. Our brains filter out the blurs caused by these quick movements (called saccades) and assemble the many foveal snapshots into a coherent view. While we remain largely unaware of this behind-the-scenes activity, it can be captured by eye-tracking technology.
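Eye-tracking software makes this distinction explicit: samples where the gaze is moving quickly are labeled saccades, and runs of slow samples are grouped into fixations. Below is a minimal sketch of this velocity-threshold approach (often called I-VT) in Python; the sampling rate and velocity cutoff are illustrative example values, not settings from the UXDM Lab.

```python
import numpy as np

def detect_fixations(x, y, hz=120, velocity_threshold=100.0):
    """Split a gaze trace into fixations using a simple velocity
    threshold (I-VT). x, y are gaze coordinates in degrees of visual
    angle; hz is the tracker's sampling rate. The 100 deg/s cutoff
    is an illustrative value, not a universal constant."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    # Point-to-point angular velocity in degrees per second.
    velocity = np.hypot(np.diff(x), np.diff(y)) * hz
    slow = velocity < velocity_threshold          # True = fixation sample
    fixations = []
    start = None
    for i, is_slow in enumerate(slow):
        if is_slow and start is None:
            start = i
        elif not is_slow and start is not None:
            fixations.append((start, i))          # (start, end) sample indices
            start = None
    if start is not None:
        fixations.append((start, len(slow)))
    # Keep only runs long enough to be true fixations (roughly 60 ms or more).
    min_samples = int(0.06 * hz)
    return [(s, e) for s, e in fixations if e - s >= min_samples]
```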

“I became interested in eye tracking because of something called the eye-mind hypothesis,” Djamasbi says. “It’s the assumption that what we are looking at—our foveal vision—is what we are currently thinking about. This has a great deal of value for research in information processing. What we look at has captured our attention, and attention is a fundamental building block for many complex cognitive processes such as judgment and decision-making.”

PhD student Doaa Alrefaei wears eye-tracking glasses.

The eye trackers in WPI's UXDM Laboratory use high-resolution cameras and infrared light to follow the movement of a user's pupils. The infrared light projects a small reflection, called a glint, onto the surface of the eye. Because the glint stays in essentially the same place as the eye and its pupil move, measuring the distance and angle between the pupil and the glint makes it possible to determine where the gaze is aimed on the visual stimulus.
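In outline, such a system is first calibrated: the user fixates known targets on the screen while the tracker records the pupil-to-glint vector at each one, and a mapping from those vectors to screen coordinates is fit to the data. The sketch below uses a second-order polynomial fit, a common choice in pupil-center/corneal-reflection trackers; it is an illustration of the general idea, not Tobii's proprietary algorithm.

```python
import numpy as np

def fit_gaze_mapping(pupil_glint_vectors, screen_points):
    """Fit a mapping from pupil-glint vectors (dx, dy) to screen
    coordinates, given a calibration in which the user fixated known
    targets. Requires at least six calibration points for this model."""
    v = np.asarray(pupil_glint_vectors, float)
    dx, dy = v[:, 0], v[:, 1]
    # Design matrix for a second-order polynomial: [1, dx, dy, dx*dy, dx^2, dy^2]
    A = np.column_stack([np.ones_like(dx), dx, dy, dx * dy, dx**2, dy**2])
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(screen_points, float), rcond=None)
    return coeffs

def estimate_gaze(coeffs, dx, dy):
    """Map one pupil-glint vector to an (x, y) point on the screen."""
    features = np.array([1.0, dx, dy, dx * dy, dx**2, dy**2])
    return features @ coeffs
```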

Djamasbi notes that eye-tracking research in the UXDM Lab is advancing user-experience (UX) studies in two major directions. First, it’s helping to develop cost-effective tools and techniques to integrate eye-tracking research into product evaluation and design improvement. Second, it is creating AI-powered tools that use eye movements as input signals to enable bio-responsive applications.

The value of eye tracking in UX research, Djamasbi says, is that it is ecologically valid, which means it doesn’t alter the behavior of the user. “There are other ways to study cognition—brain imaging, for example. But in user-experience research, you want to alter the environment within which a person interacts with the technology as little as possible,” she says. “Otherwise, people will not behave the way they normally would.”

The combination of the unique window it offers into human cognition and its unobtrusiveness makes this technology useful for studying a wide array of information-processing problems. In recent years, for example, studies in the UXDM Lab, conducted with collaborators in The Business School and other WPI schools and departments, as well as with scientists and medical professionals outside the university, have helped with the development of smartphone apps for diabetes patients and for suicide prevention.

As part of a five-year project funded by a $3 million award from the National Science Foundation, the UXDM Lab is helping advance the field of human-robot interactions, for example, by applying advanced UX research to aid in the design of robotic interfaces and assistants for the workplace and by studying how users interact with systems used to remotely operate robots.


From research to innovation

Having to select the best course of treatment for a loved one who is nonresponsive after a serious brain injury (which might mean choosing between medical interventions to sustain life and palliative care only) is an extreme example of the need to make critical decisions under stress. Djamasbi collaborated with Bengisu Tulu, professor in The Business School and an expert in digital health solutions, and Susanne Muehlschlegel, a clinician-researcher specializing in neurology, neurosurgery, and critical care medicine at Johns Hopkins University, to develop an aid that helps users make an informed decision despite the emotional turmoil that surrounds that choice. The system was shaped by discussions with physicians and by eye-tracking studies of volunteers who have had to make that difficult call.

During the early stages of product development, Djamasbi says, an iterative design and testing process is employed to gather qualitative feedback from a small group of users in each cycle to improve the design.

“We can further enhance the depth of our insights,” she says, “by incorporating eye-tracking technology. Eye trackers capture moment-to-moment attention patterns, providing a rich, objective dataset on information-processing behavior. In the neuro intensive care unit (ICU) project, our team used eye-tracking in a novel way to gather insight for another research objective.”

The neuro ICU project collected eye-tracking data as its study participants reviewed the decision aid. “Then we showed a video of their gaze to them and asked them to tell us what they were thinking. In doing so, we used a very powerful design-thinking principle. In design thinking, we encourage people to open up and share their thoughts so that we can learn about their needs.”

This, she notes, helps researchers gain deeper insights into user needs, which is critical for designing a system that truly meets those needs. “In this case, we showed participants their gaze videos to engage them in storytelling. By asking them to talk about their thoughts using their gaze as a cue, we were able to uncover key factors that will allow us to better define the success of a system intended for short-term use.”

Creating AI-powered tools

A number of ongoing projects at the UXDM Lab are aimed at developing bio-responsive systems that use eye movements as accurate and reliable measures (called biomarkers) of user experiences, including debilitating conditions like anxiety and chronic pain. The studies, many conducted in partnership with Diane Strong, professor and head of The Business School, make use of a patent earned by Djamasbi and her colleagues for technology that records eye movements and pupil dilation to measure cognitive load. Cognitive load is an indication of how taxed the mind is by internal and external stimuli and mental tasks. The level of cognitive load is directly tied to our ability to focus and make decisions, Djamasbi says.
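The patented technology isn't public code, but the core idea of a pupil-based load measure can be illustrated simply: because absolute pupil size varies across individuals and lighting conditions, dilation is measured against a per-person baseline, so that task-evoked changes index mental effort. A hypothetical sketch:

```python
import numpy as np

def task_evoked_dilation(pupil_mm, baseline_window, task_window):
    """Baseline-corrected pupil dilation, a common proxy for cognitive
    load. pupil_mm is a series of pupil diameters in millimeters;
    each window is a (start, end) pair of sample indices, one covering
    a resting baseline and one covering the task of interest."""
    pupil = np.asarray(pupil_mm, float)
    baseline = np.nanmean(pupil[slice(*baseline_window)])
    task = np.nanmean(pupil[slice(*task_window)])
    return task - baseline   # positive values suggest higher load
```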

“We are adaptive decision-makers,” she notes. “Since we have limited cognitive resources, attention and decision-making are really resource allocation problems.”

When the decision environment is challenging—for example, when we have little time or are overwhelmed with too much information—we adjust to make the best decisions we can under the circumstances.

“We want to use sensor-based systems, like eye tracking, to develop what we call neuro-inspired or neuro-responsive technologies—systems that can help people make decisions when their cognitive resources are taxed, either by altering the environment so decision-making is less effortful or by developing smart decision-support systems that recognize when a user is suffering from cognitive load and offer help,” she says.

Heat maps on a survey about chronic pain can show when users linger on certain answers.
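Heat maps like these are commonly built by depositing each fixation's duration at its screen location and blurring the result, so the areas where gaze lingered longest glow hottest. A rough sketch, with the grid and blur radius as assumed values:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def gaze_heatmap(fixations, width, height, sigma=40):
    """Build a heat map from fixations given as (x, y, duration)
    tuples in pixel coordinates. Each fixation deposits its duration
    at its location; a Gaussian blur spreads the heat outward."""
    heat = np.zeros((height, width))
    for x, y, duration in fixations:
        col, row = int(round(x)), int(round(y))
        if 0 <= row < height and 0 <= col < width:
            heat[row, col] += duration
    return gaussian_filter(heat, sigma=sigma)
```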

Conditions like anxiety and chronic pain can increase cognitive load. But the way Djamasbi and colleagues identify such disorders is through their effects on attention. People who suffer from pain or anxiety exhibit attentional bias toward information that is related to their concern.

Traditionally, both conditions have been measured with self-reports. Patients with chronic pain, for example, may be asked to fill out questionnaires that ask them to rate, on a Likert scale, the severity of their pain and how seriously it affects their daily lives. But these subjective measures can be unreliable, as some patients may downplay the impact of their anxiety or pain while others may overstate it.

Eye tracking offers a way to overcome this subjectivity, but it must be guided by a strong theory, Djamasbi says. “While eye tracking has been used to study pain and anxiety, it has produced mixed results. This is because the stimulus-task paradigm these studies use has a limited context and does not provide enough opportunities to capture the complex and dynamic nature of attention.”

Pain studies typically expose subjects to words and phrases designed to evoke different responses in those suffering from anxiety or pain and in those who are unaffected. But Djamasbi says the exposure time in these studies is fixed and often too short, so it fails either to invoke the interruptive effect of pain or anxiety on attention or to capture attention's complex and dynamic nature.

She and her research team have devised a new stimulus-task paradigm that they've shown can reliably detect the level of anxiety or pain in test subjects. For example, knowing that people with chronic pain tend to react strongly to pain-related topics, they asked subjects to fill out standard chronic pain questionnaires on screen, then watched to see which parts of the questionnaire subjects focused on and how long their gaze rested on each part.

One study compared the eye movements of subjects who said they suffered from chronic pain and those who said they were pain free. The researchers noted that the eyes of chronic pain sufferers lingered on the possible responses for questions that asked about their pain experience. They appeared to be expending considerable cognitive effort on the task of choosing among the options, while the pain-free group focused mostly on the questions and only briefly on the responses.

In addition, when asked to rate the intensity of their pain, the gaze of the chronic pain group often hovered over the “worst pain imaginable” response, while those without pain focused more on the “no pain” choice. In a follow-up study, the researchers refined their analysis by quantifying attention differently and found significant differences in viewing behavior between people with and without chronic pain. Compared to pain-free participants, those suffering from chronic pain spent significantly less time reading the questionnaire items and showed fewer shifts in focus while reading the questions. However, this pattern reversed when selecting a response from among five options—chronic pain sufferers spent significantly more time reviewing the choices and exhibited significantly more shifts in focus than their pain-free counterparts.
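Metrics like these, total dwell time within an area of interest (AOI) and the number of shifts of focus between AOIs, are straightforward to compute once each fixation has been assigned to an AOI such as "question" or "responses." A hypothetical illustration:

```python
from collections import defaultdict

def aoi_metrics(fixations, aois):
    """Compute per-AOI dwell time and the number of focus shifts.
    fixations: list of (x, y, duration) tuples in screen pixels.
    aois: dict mapping a label (e.g. 'question', 'responses') to a
    bounding box (left, top, right, bottom)."""
    def label_of(x, y):
        for label, (left, top, right, bottom) in aois.items():
            if left <= x <= right and top <= y <= bottom:
                return label
        return None

    dwell = defaultdict(float)
    shifts = 0
    previous = None
    for x, y, duration in fixations:
        label = label_of(x, y)
        if label is not None:
            dwell[label] += duration
            if previous is not None and label != previous:
                shifts += 1        # attention moved to a different AOI
            previous = label
    return dict(dwell), shifts
```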

Additional research showed that eye movements can be an excellent biomarker of chronic pain and can become the basis of an AI-powered clinical decision-support system that can provide physicians with more accurate and useful information when making clinical decisions about their patients.
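In outline, a decision-support system of this kind would train a classifier on gaze-derived features and report a risk estimate to the clinician. The sketch below, using logistic regression on placeholder features, is an illustration of the general approach, not the lab's published model; the feature matrix here is random stand-in data, where real rows would hold measures such as dwell times and focus-shift counts from an eye-tracking study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Placeholder data: one row per participant, columns such as
# [dwell on questions, dwell on responses, focus shifts].
X = np.random.default_rng(0).normal(size=(40, 3))
y = np.random.default_rng(1).integers(0, 2, size=40)   # 1 = chronic pain

model = make_pipeline(StandardScaler(), LogisticRegression())
scores = cross_val_score(model, X, y, cv=5)             # cross-validated accuracy
print(f"mean accuracy: {scores.mean():.2f}")
```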

Djamasbi says this is just one example of how her research team seeks to not just understand how we think and make decisions, but to turn that knowledge into useful and evidence-based tools that can help clinicians and patients make more informed decisions. It is part of what she and Strong call a user experience–driven innovation framework for developing intelligent products and services.

She says she looks forward to seeing more and more of the UXDM Lab's research projects become products that can be deployed in hospitals and medical offices to make life better for patients, doctors, and nurses. And as those new technologies move from the UXDM Lab into the real world, their success and value will, in a very real way, lie in the eyes of the beholders.
