Researchers take a step towards real empathy in AI: affective computing

Affective computing, also known as artificial emotional intelligence or emotion AI, strives for the development of systems and devices that can recognise, interpret, process and simulate human emotion. In other words, the AI sees your tears, understands that you are sad and reacts appropriately.

Researchers at Dyad X Machina combine affective neuroscience, the psychological study of the neural mechanisms of emotion, with deep learning, a highly structured form of machine learning inspired by biological nervous systems. Their goal is to create an affective layer that guides us through the world and helps us with our decisions. When we make a decision, we not only weigh the logical pros and cons of outcomes but also feel our way through possible futures before settling on one. Haohan Wang, a co-founder of Dyad X Machina, says their mission is to bring emotion into machine learning.

What is the implication of affective computing for future apps?

This is the trippy part: future applications will adapt to the user’s emotional state. From the point of view of the app, we as users become a more complex persona, one that goes beyond our demographic metadata and browser history. This is achieved by using real-time physiological and biometric data to infer our emotional state, for example from a Fitbit measuring HRV (heart rate variability), which affective computing can translate into how we are feeling. In a sense, it takes us back to the days of biofeedback, but here the app reads the data, makes the appropriate adjustments and creates an incredibly personalised “in-the-moment” experience based on your emotion. If you are upset you’ll see one version; if relaxed, a different one.
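To make that loop concrete, here is a minimal, hypothetical sketch in Python: it computes a simple HRV measure (RMSSD) from beat-to-beat intervals streamed by a wearable and switches which version of the experience the app serves. The thresholds and the state-to-experience mapping are illustrative assumptions, not Dyad X Machina’s model.

```python
# Hypothetical sketch: infer an emotional state from HRV and adapt the app.
# The RMSSD thresholds and the state-to-experience mapping are illustrative
# assumptions, not a published model.

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences, a standard HRV measure."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return (sum(d * d for d in diffs) / len(diffs)) ** 0.5

def estimate_state(rr_intervals_ms):
    """Map HRV to a coarse emotional state (assumed thresholds)."""
    hrv = rmssd(rr_intervals_ms)
    if hrv < 20:          # low HRV often accompanies stress or arousal
        return "stressed"
    if hrv < 50:
        return "neutral"
    return "relaxed"

def pick_experience(state):
    """Serve a different version of the app depending on the inferred state."""
    return {
        "stressed": "calming theme, shorter content, fewer notifications",
        "neutral":  "default experience",
        "relaxed":  "richer content, more suggestions",
    }[state]

# Example: beat-to-beat (RR) intervals in milliseconds from a wearable
rr = [812, 798, 805, 790, 801, 795, 808]
state = estimate_state(rr)
print(state, "->", pick_experience(state))
```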

 

This is one of the most complex projects applying AI to emotion to date, but there are many companies that have become experts in reading and deciphering emotions externally, by analysing facial expressions and listening to voice patterns, and many of them let you experiment with your own images.


Google Cloud Vision has a console where you can upload a photo and get an instant analysis of the faces it detects and the emotions they express.
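The same analysis is available programmatically. Here is a minimal sketch using the Cloud Vision Python client (google-cloud-vision); it assumes credentials are already configured and that a local file face.jpg exists, and it prints the likelihood ratings the API returns for joy, sorrow, anger and surprise.

```python
# Minimal sketch of face/emotion analysis with the Cloud Vision client.
# Assumes Google Cloud credentials are configured and "face.jpg" exists.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("face.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each emotion comes back as a likelihood rating, e.g. VERY_LIKELY
    print("joy:",      face.joy_likelihood.name)
    print("sorrow:",   face.sorrow_likelihood.name)
    print("anger:",    face.anger_likelihood.name)
    print("surprise:", face.surprise_likelihood.name)
```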


Affectiva: founded in 2009, this Massachusetts startup has raised nearly $20 million from investors that include Kleiner Perkins Caufield & Byers and Horizon Ventures. Affectiva has analyzed nearly 4 million faces across 75 different countries to build a program that can analyze, in real time, any of the 10,000 possible facial expressions you could be displaying at any given moment. Want to see how it works? They have an online demo that analyzes your facial expressions through your webcam while you watch humorous Doritos ads. Or you can download their Affdex app on your phone and read your own face in real time.

 

MIT has created a package of programming tools for building models from behavioral data. These include data from devices as well as speech patterns.
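As a rough illustration of what “building a model from behavioral data” can mean (this is a generic scikit-learn sketch, not MIT’s toolkit), one might label short windows of device and speech features with the emotion the user reported and fit a classifier; the features, values and labels below are invented for illustration.

```python
# Generic sketch: fit a classifier to behavioral features labelled with
# self-reported emotion. Features, values and labels are illustrative only.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

# columns: mean_heart_rate (bpm), rmssd_hrv (ms), mean_speech_pitch (Hz)
X = np.array([
    [92, 18, 210],   # recorded while the user reported feeling stressed
    [71, 55, 165],   # recorded while the user reported feeling relaxed
    [88, 22, 200],
    [68, 60, 160],
])
y = ["stressed", "relaxed", "stressed", "relaxed"]

model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(model.predict([[85, 25, 195]]))  # -> likely "stressed"
```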

Areas where affective computing is being used include market research, healthcare, automotive and, of course, media and advertising, where MPC is using the technology to create content with the greatest emotional impact.

The motivation for this research is the ability to simulate empathy in machines interacting with humans. The machine should interpret the emotional state of humans and adapt its behavior to them, giving an appropriate response to those emotions.

All of this strives to replicate the interactions humans have with each other. But the real benefits come in therapy and education, where the machine’s objective is to improve the human experience. To do that, the empathy barrier needs to be broken, and what we are seeing here are the first steps.

 

Dennis Neiman

Innovation MPC