Two Stanford Students Use Google Glass to Help People with ASD Recognize Emotion, a Report Reveals

Many children with autism spectrum disorder (ASD) struggle to understand emotion. If you smile at them, for example, they may not be able to tell what that expression means.

Two students from Stanford University have decided to take up that challenge, turning to Google's latest technology, Google Glass, to help.

The students, Catalin Voss and Jonathan Yan, are working on a project called Sension, which helps a person with ASD focus on a conversation partner's face while Google Glass identifies the partner's emotion using the device's built-in camera.

While the project is still in the experimental phase, the underlying idea isn't new: doctors already test a baby's eyes to see how well they respond and whether the child can comfortably focus on the regions of a person's face.

With Sension, an autistic person won't have to struggle to read their conversation partner's face for cues. Instead, the heads-up display (HUD) does the work: as they focus on the person's face, they instantly see a word such as Happy, Surprised or Upset.
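For readers curious how such a pipeline might fit together, here is a minimal, hypothetical Python sketch of the loop described above: a camera frame goes to an emotion classifier, and the resulting label is pushed to the HUD. Sension's actual code is not public, so every name here (Frame, classify_emotion, show_on_hud) is an illustrative stand-in, not the project's real API.

    from dataclasses import dataclass

    EMOTIONS = ("Happy", "Surprised", "Upset")

    @dataclass
    class Frame:
        """Stand-in for a single image captured by the Glass camera."""
        pixels: bytes

    def classify_emotion(frame: Frame) -> str:
        """Hypothetical classifier mapping a frame to an emotion label.
        A real system would run a trained facial-expression model here;
        this placeholder just picks a label deterministically."""
        return EMOTIONS[len(frame.pixels) % len(EMOTIONS)]

    def show_on_hud(label: str) -> None:
        """Stand-in for rendering the label on the heads-up display."""
        print(f"HUD: {label}")

    if __name__ == "__main__":
        # Simulate a short stream of frames from the wearer's camera.
        for raw in (b"\x01\x02", b"\x03\x04\x05", b"\x06"):
            show_on_hud(classify_emotion(Frame(pixels=raw)))

The key design point the sketch captures is that the wearer never interacts with the classifier directly: the whole loop runs passively, and the only output they ever see is the single word on the HUD.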

Why this project matters, despite its limitations, can be summed up in the words of Derek Ott, a professor at the David Geffen School of Medicine: “Anything that can be used to facilitate social understanding in people with autism is potentially beneficial.”

The one caveat is that Google Glass can identify emotions but obviously can't help the person with ASD respond appropriately to them.