Rimma Ten

M vs Microsoft: Episode 26

Can AI Detect Enjoyment From Facial Expressions?


What’s happening in this image?


Rose, the woman in the picture, is a shy, plain, middle-aged English literature professor at Columbia University who is having dinner with her mother.


She has her share of troubles: she lives with her vain and overbearing mother, her beautiful sister married a handsome man, and Rose herself is still single, having reached the point where she believes she will never get married.





Despite all these troubles, at this very moment Rose is eating a meal she loves: steak with mashed potatoes and vegetables. It makes her forget her troubles and her constant worries about the future.


Rose does, however, have a particular way of eating it: she cuts a piece of each food on the plate and eats them together in a single bite. She likes how the flavors of her favorite foods mix, which makes her enjoy the meal even more.



Now let’s see what M and Microsoft Azure identify:


M identifies Happy at 82.4% as Rose's primary emotion and Contempt at 11.8% as the secondary emotion. M also identifies her mood as Positive at 97%.
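
M's internals are not public, so the sketch below only illustrates how a primary and secondary emotion might be read off a per-emotion score distribution like the one above. The two top scores are the ones M reported for this image; the remaining categories and the positive/negative grouping are our assumptions, not M's actual design.

```python
# Hypothetical sketch: ranking per-emotion scores and deriving a mood label.
# Only "happy" and "contempt" come from M's output for this image; the other
# categories and the valence grouping are illustrative assumptions.

scores = {
    "happy": 0.824,      # reported by M
    "contempt": 0.118,   # reported by M
    "neutral": 0.040,    # assumed filler so the distribution sums to ~1
    "sad": 0.018,        # assumed filler
}

# The highest score is the primary emotion, the runner-up the secondary.
ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
(primary, p_score), (secondary, s_score) = ranked[0], ranked[1]
print(f"Primary: {primary} ({p_score:.1%}), secondary: {secondary} ({s_score:.1%})")

# One plausible mood heuristic: positive score mass minus negative score mass.
# Which categories count as positive or negative is an assumption here.
POSITIVE, NEGATIVE = {"happy"}, {"sad", "contempt"}
valence = sum(scores[e] for e in POSITIVE) - sum(scores[e] for e in NEGATIVE)
print(f"Mood: {'Positive' if valence > 0 else 'Negative'} (valence {valence:+.2f})")
```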


When we enjoy something we like, as Rose does with her favorite food, endorphins and dopamine are released that make us feel happy, which matches Rose's emotional state at this moment. When endorphins are released in the body, they may also help reduce stress and anxiety.


The secondary emotion, Contempt, reflects that despite all her troubles, Rose feels confident that she will be all right. Positive feelings make you feel confident and optimistic, giving your self-esteem a boost. This is exactly what M identifies here.



Now, let’s test Microsoft Azure using the same image:

Microsoft correctly identifies Rose's primary emotion as Happy, but with a score of only 0.57 (Azure reports each emotion on a 0-to-1 scale), and Neutral close behind as the secondary at 0.40. That reading implies Rose is barely feeling anything at this moment, which is not correct. It could be due to the fact that the upper part of her face doesn't exhibit much facial muscle movement, which is why Azure incorrectly ranks Neutral so highly.
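
For readers who want to reproduce the Azure side, this is roughly what a call to the Face API's detect operation with emotion attributes looked like at the time of this test. The endpoint, key, and image URL below are placeholders for your own resource and test image, not working values.

```python
# Rough sketch of querying the (legacy) Azure Face API for emotion scores.
# ENDPOINT, KEY, and the image URL are placeholders, not working values.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
KEY = "<your-subscription-key>"

resp = requests.post(
    f"{ENDPOINT}/face/v1.0/detect",
    params={"returnFaceAttributes": "emotion"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.com/test-image.jpg"},  # placeholder image
)
resp.raise_for_status()

for face in resp.json():
    # Azure scores eight emotions on a 0-to-1 scale that sums to roughly 1,
    # so a happiness of 0.57 means "happiest of the eight", not 0.57 percent.
    emotions = face["faceAttributes"]["emotion"]
    for name, score in sorted(emotions.items(), key=lambda kv: kv[1], reverse=True):
        print(f"{name:>9}: {score:.2f}")
```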


Now that you know the context behind the situation and the two different emotional readings, it is up to you to decide which interpretation you find more accurate.




 

We at Project M believe the ability of emotional artificial intelligence platforms to identify and understand positive emotions has great potential to improve human emotional well-being.

Follow us on our social media and look out for our next episode!

Your LIKE means a lot to us! Follow Project M on:

Medium: Project M

Facebook: Project M

Blogger: Project M

Instagram: mprojectai

Twitter: @mprojectai

LinkedIn: Project M



*As of January 1, 2021, the Project M team has devoted 100,000 hours to the AI platform, M. The sample input data we use is pure test data that is completely new to M and, we assume, to Microsoft, and has never been used to train M, so this comparison is a fair trial between the two. We thank Microsoft, the industry leader in the emotional AI sector, for allowing the general public to try its demo site for identifying human emotions from facial expressions.
