MIT researchers are working on a wearable AI system that could help guide people with autism spectrum disorder (a diagnosis that now encompasses what was formerly known as Asperger’s syndrome) and social anxiety through everyday conversations, reports MIT News.
The aim is to develop a system that could advise users on the emotional tone of a given conversation, all from the interface of a smartwatch. Currently based on a Samsung Simband device, the AI system was trained on 31 recorded conversations and classifies a conversation as happy or sad based on speech patterns and biometrics such as heart rate, blood pressure, and hand movement.
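To make the idea concrete, here is a minimal sketch of that kind of tone classification: combining audio and biometric measurements into a feature vector and labeling it by proximity to per-label averages. This is purely illustrative and is not the MIT team's actual model; the feature names, values, and nearest-centroid approach are all assumptions for the example.

```python
# Illustrative sketch only -- not the MIT system. Classifies a conversation
# segment as "happy" or "sad" from hypothetical biometric/audio features.
from math import dist

# Hypothetical training examples per label:
# (mean heart rate in bpm, speech energy 0-1, hand-movement score 0-1)
TRAINING = {
    "happy": [(72, 0.80, 0.2), (75, 0.70, 0.3)],
    "sad":   [(88, 0.30, 0.6), (85, 0.40, 0.7)],
}

def centroid(points):
    """Average each feature across one label's examples."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(len(points[0])))

CENTROIDS = {label: centroid(pts) for label, pts in TRAINING.items()}

def classify(features):
    """Assign the label whose centroid is nearest in feature space."""
    return min(CENTROIDS, key=lambda label: dist(CENTROIDS[label], features))

print(classify((74, 0.75, 0.25)))  # near the "happy" centroid
```

A real system would extract features continuously from the watch's sensors and microphone and use a trained statistical model rather than hand-set centroids, but the pipeline shape, sensor readings in, tone label out, is the same.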
In its current form, the system classifies conversations with an accuracy 18 percent higher than chance. But the researchers are aiming to refine it further so that it’s not only more accurate but able to offer more nuanced analysis, telling a user not just whether a conversation is happy or sad but whether its participants are bored, excited, and so on. They also want to develop a version that can run on commercial devices like the Apple Watch.
It’s the latest example of how subtle biometrics can offer insight into intangible emotional experience: researchers at NYU’s Langone Medical Center recently announced research exploring how voice biometrics could be used to diagnose PTSD and depression. The main application of the MIT team’s tone-analysis technology will be coaching individuals who have difficulty with social interactions, with biometric feedback acting as a virtual crutch or cast as the user improves.
Source: MIT News
(Originally posted on Mobile ID World)