UBC haptic technologies at the forefront of research into emotionally intelligent computers
By making computers smarter about tactile information, researchers are hoping to also make them more human.
Posted by GRAND NCE, March 5, 2013

The haptic creature uses conductive “smart fur” and other sensors to detect nine different touch gestures and react accordingly. PHOTO: Anna Flagg.

Imagine machines that could read your emotion by the way you touch them. At The University of British Columbia’s SPIN Lab (Sensory Perception and Interaction Research Group), researchers are experimenting with new haptic systems designed to both decipher and respond to emotionally expressive touch. By making computers smarter about tactile information, SPIN collaborators are hoping to also make them more human.

One project getting attention is the haptic creature or “Cuddlebot,” featured in The New York Times’ “Innovations That Will Change Your Tomorrow” and recently discussed in MIT Technology Review. The small, furry, featureless robot with conductive “Smart Fur” (developed by former Master’s student Anna Flagg) reads nine different touch gestures – from the slightest of strokes to scratches. In related work, the researchers have taught it to mimic the breathing and subtle movements of a live animal.

“We’ve done a lot of studies on this robot to make it emotionally expressive. It’s actually pretty good at conveying many kinds of emotions. It can do pretty well in [displaying that it is] excited and relaxed, sleepy and miserable, calm and anxious,” said GRAND Principal Investigator and UBC Computer Science Professor Dr. Karon MacLean, who heads the SPIN Lab.

Therapeutic applications in development

In its most recent iteration, developed by former master’s student Yasaman Sefidgar in collaboration with MacLean, the haptic robot is being tested as a therapeutic tool to help comfort distressed individuals. Through biometric measurements (e.g., breathing and heart rate) and self-reporting, researchers have observed a strong decrease in anxiety when the device “breathes” in sync with the user’s breathing. Flagg recently presented the robot and findings at the TEI 2013 conference in Barcelona.
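As an illustration of this breathing-synchronization idea, a simplified control loop might look like the sketch below. This is not the SPIN Lab’s implementation; the sensor and actuator objects (breath_sensor, chest_actuator) are hypothetical placeholders standing in for whatever respiration sensing and actuation hardware a particular robot uses.

```python
import time

class BreathSyncController:
    """Illustrative sketch: drive a robot's simulated 'breathing' at the
    same rate as the user's measured breathing. The sensor and actuator
    interfaces below are hypothetical placeholders, not a real API."""

    def __init__(self, breath_sensor, chest_actuator):
        self.sensor = breath_sensor      # e.g. a respiration belt reporting breaths per minute
        self.actuator = chest_actuator   # e.g. a servo that expands and contracts the robot's chest

    def run(self, duration_s=60.0):
        start = time.monotonic()
        while time.monotonic() - start < duration_s:
            bpm = self.sensor.breaths_per_minute()  # latest measured breathing rate
            period = 60.0 / max(bpm, 1.0)           # seconds per breath cycle
            self.actuator.inhale(period / 2)        # expand for half the cycle
            self.actuator.exhale(period / 2)        # contract for the other half
```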

Because the haptic creature responds to emotion, it may also serve as a reassuring companion for children with anxiety disorders. Haptic devices can also help people process background, ambient information. For children with attention issues, interacting with a haptic device while focused on an activity may improve their ability to multitask.

The power of touch

Affective (emotionally focused) haptics is an emerging field. As research has shown, touch is deeply connected to emotion. Tactile data offer new, powerful inputs for user interfaces, enabling computers to learn about your emotional state and respond accordingly.

“It’s about giving machines more emotional intelligence – that’s the vision we’ve had in our group for a long time,” said MacLean. As an engineering graduate student at MIT, she built one of the first haptic interfaces at the Institute’s Biomechanics Lab. Now she is developing a haptic language to provide a basis for physical communication with computers, and a platform for modeling emotional cues conveyed through touch.

“Right now we don’t have good, fast ways of determining human emotion; the best they can do is face recognition or detection – not always that practical. Haptics offer a new way to model emotional states that may be more accessible in some circumstances,” said MacLean. “What if you had a smartphone with a skin that feels nice, and that breathes and vibrates when touched? From my fidgeting with it, we could figure out my emotional state, and then have it change how it behaves because it knows how I’m feeling.”

Touch-sensitive surfaces to be revolutionized

Current touch displays, such as flat multi-touch screens, only track the position of a user’s touch. But as MacLean points out, much more can be learned from tactile interactions, especially when pleasing surfaces coax users into a wider range of touch gestures: “I’m going to tell [a computer] a lot more through my hands than if [the surface is] just sitting there flat or not that nice to touch,” said MacLean. With more data to work with, she believes computers will be able to respond to touch as a human would. “There’s a simple language of touch that we’ve been able to figure out, features that can be extracted from touch, which are quite common to all people.”
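To make the idea of features extracted from touch concrete, here is a minimal, hypothetical sketch; it is not the SPIN Lab’s classifier. It reduces a pressure time series to a few summary features and applies invented thresholds to separate a tap, a stroke, and a scratch.

```python
import numpy as np

def touch_features(pressure, dt=0.01):
    """Summarize a touch as a few simple features.
    pressure: 1-D array of sensor readings sampled every dt seconds."""
    pressure = np.asarray(pressure, dtype=float)
    return {
        "duration_s": len(pressure) * dt,                   # how long the contact lasted
        "mean_pressure": pressure.mean(),                   # overall intensity
        "variability": pressure.std(),                      # steady stroke vs. jittery scratch
        "peak_rate": np.abs(np.diff(pressure)).max() / dt,  # fastest change in pressure
    }

def classify_touch(features):
    """Toy rule-based classifier; the thresholds are invented for illustration."""
    if features["duration_s"] < 0.2:
        return "tap"
    if features["peak_rate"] > 20.0:   # rapid back-and-forth pressure changes
        return "scratch"
    return "stroke"

# Example: a slow, even rise and fall in pressure reads as a stroke.
signal = np.concatenate([np.linspace(0, 1, 50), np.linspace(1, 0, 50)])
print(classify_touch(touch_features(signal)))  # -> stroke
```

A real system would learn such features and thresholds from labelled touch data rather than hard-coding them, but the pipeline (sense, extract features, classify the gesture) is the same.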

In the near future, haptic sensors may become standard in items such as clothing, mobile devices, or home appliances. Thanks to a breakthrough in reliable low-cost sensors invented at UC Berkeley and developed at UBC, SPIN researchers can gather haptic data from an endless variety of surfaces. “Suddenly you can have responsive surfaces that are quite richly responsive to touch – I think there are tremendous applications that we can’t even imagine right now,” said MacLean.

Affective touch is already crucial to the design of some commercial products, and the industry shows a growing appetite for innovative haptic technology. Businesses including car manufacturers, mobile phone companies and medical device designers have consulted the SPIN Lab for guidance on haptic design. Students there are responding with new design guidelines for haptic displays and controls, as well as tools to rapidly prototype the tactile features or “feels” of devices such as game controllers and car interiors.

GRAND is also helping MacLean and UBC graduate Diane Tam launch a startup for HaNS (Haptic Notification System): a wireless wrist-worn device that delivers gentle tactile alerts to help speakers give well-timed presentations. The device may also be commercialized as a training tool for serious athletes.

“We need application domains to bring our [haptic research] to life. We need to be inspired by problems, we need people to bring us these challenges,” said MacLean. “When you say ‘new media’ we’re not just talking about people who are doing research in new media itself, we’re talking about who is using that media: clinicians, people in interactive design, and people who invent. Some of these collaborators I never would have met without GRAND.”

Contact: Spencer Rose
Communications Officer
GRAND NCE