IPMD webmaster

M vs Microsoft: Episode 22

Does an emotional AI have the ability to identify fear?


What is happening here?

Adrienne, the woman in this image, is trapped in an abusive relationship with her husband. In this scene, she has just discovered that her husband is a murderer, and she is sneaking through the house in the middle of the night, while her husband is fast asleep, in an attempt to find evidence of his crime.


She wants to escape her abusive marriage and make sure her husband is punished for his crimes. Suddenly, however, the maid, who is loyal to her husband, walks into the room Adrienne is in. Adrienne hides to avoid being caught, afraid of the consequences (violence, anger, physical abuse) she will face from her husband if she is found.


Let’s take a moment to think about what kinds of emotions Adrienne could be feeling.

Since this image was captured while Adrienne is trying to hide from the maid (and thus from being discovered by her husband), her most dominant emotion may be fear: fear of getting caught, and fear of her husband finding out what she was trying to do.


She likely wasn’t expecting the maid to come into the room, or she wouldn’t have had to hide, so she is probably also surprised. She may also feel a small amount of anger at the loyal maid for walking in during her escape, as well as at her husband’s actions.



Now let’s see how M and Microsoft interpret the emotions that Adrienne is feeling in this moment.

M detects a high level of fear (48%), which makes sense given that Adrienne’s overwhelming concern is what will happen to her if her husband finds out she was disobeying him. M also correctly detects surprise (24%) and a small level of anger (7%). All in all, M is able to interpret an incredibly complex emotional scenario and provide an accurate read of Adrienne’s emotions.


On the other hand, Microsoft Azure identifies neutral as the primary emotion (61%), which we know cannot be the case in such an emotionally charged situation. It does detect surprise as the correct secondary emotion (27%), but it incorrectly places fear as only the tertiary emotion, at 12%.
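For readers who want to try the Microsoft side of this comparison themselves, the sketch below shows one possible way to request per-face emotion scores from the Azure Face API (the v1.0 detect endpoint with the emotion attribute, as offered around the time of writing). The resource endpoint, subscription key, and image URL are placeholders to fill in with your own values; this is only an illustration, not the pipeline we used to produce the numbers above.

```python
import requests

# Placeholders: substitute your own Azure Face resource endpoint and key.
ENDPOINT = "https://<your-face-resource>.cognitiveservices.azure.com"
SUBSCRIPTION_KEY = "<your-face-api-key>"

def detect_emotions(image_url):
    """Return the per-face emotion scores reported by the Azure Face API (v1.0)."""
    response = requests.post(
        f"{ENDPOINT}/face/v1.0/detect",
        params={"returnFaceAttributes": "emotion"},
        headers={"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY},
        json={"url": image_url},
    )
    response.raise_for_status()
    # Each detected face carries scores in [0, 1] for anger, contempt, disgust,
    # fear, happiness, neutral, sadness, and surprise.
    return [face["faceAttributes"]["emotion"] for face in response.json()]

if __name__ == "__main__":
    # Placeholder image URL; use a publicly accessible photo of the scene you want to test.
    for emotions in detect_emotions("https://example.com/scene.jpg"):
        top_three = sorted(emotions.items(), key=lambda kv: kv[1], reverse=True)[:3]
        print(top_three)  # e.g. [('neutral', 0.61), ('surprise', 0.27), ('fear', 0.12)]
```

The example output in the final comment corresponds to the neutral/surprise/fear reading reported above for this scene.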


Now that you know the context behind the situation and have seen the two different emotion readings, it is up to you to decide which interpretation you think is most accurate.




 

We at Project M believe the ability of emotional artificial intelligence platforms to identify and understand negative emotions has great potential to improve human emotional well-being.


Follow us on our social media and stay tuned for our next episode!


Your LIKE means a lot to us! Follow Project M on:


Company website: http://www.ipmdinc.com

Medium: Project M

Facebook: Project M

Blogger: Project M

Instagram: mprojectai

Twitter: @mprojectai


*As of March 1, 2020, the Project M team has devoted 80,000 hours to the AI platform, M. The sample input data we use are pure test data: completely new to M (and, we assume, to Microsoft) and never used to train M, so this comparison is a fair trial between the two platforms. We thank Microsoft, a leader of the industry and of the emotional AI sector, for making its testing site for identifying human emotions from facial expressions available to the general public.
