Rimma Ten

M vs Microsoft Episode 28

Updated: Dec 8, 2022

Can Emotional AI Identify Obvious Negative Emotions with Higher Intensities?

In this episode, we will be discussing Project M's and Microsoft Azure's screening results for an obvious negative emotion, anger, at higher intensities such as furious and mad.

What’s happening in this image?

This is special agent Peterson in the picture. As you can see, he is pretty mad: at this moment he is reprimanding his agent for sabotaging an important mission the agent was trusted to complete and deliver on immediately, which he failed to do.

Now let's see what M and Microsoft Azure identify:

M correctly identifies Peterson's emotional state as Angry at 74% as the primary emotion, Disgust at 15% as the secondary emotion (in the pie chart), and Negative mood at 98%. When you reprimand a person for ruining a crucial assignment, you experience anger as the primary emotion, and secondary emotions such as disgust follow, because Peterson also feels disgusted by his employee and the ruined mission. Together, these negative emotions produce a negative mood, as M correctly identifies here.
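The idea of a primary and secondary emotion can be sketched as a simple ranking over per-emotion scores. This is purely illustrative: the dictionary format and the `rank_emotions` helper are assumptions for demonstration, not M's or Azure's actual API output.

```python
def rank_emotions(scores):
    """Return (primary, secondary) emotions from a label -> score dict,
    ordered by descending confidence."""
    ordered = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    return ordered[0], ordered[1]

# Hypothetical reading mirroring the percentages described above.
m_reading = {"Angry": 0.74, "Disgust": 0.15, "Neutral": 0.11}
primary, secondary = rank_emotions(m_reading)
print(primary)    # ('Angry', 0.74)
print(secondary)  # ('Disgust', 0.15)
```

The same ranking applied to Azure's reading (Happy 0.60, Anger 0.35) would flip the primary and secondary labels, which is exactly the disagreement discussed below.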

Now, let’s test Microsoft Azure using the same image:

Microsoft Azure incorrectly identifies Peterson's primary emotion as Happy at 60%, and correctly identifies Anger as the secondary emotion, but only at 35%.

So why does Azure fail to identify the obvious negative emotion?

The reason is that Azure's technology reads facial muscle movement, especially around the mouth. It mistakenly interprets this man as laughing, because his expression resembles the muscle movement a face displays when laughing.

The 35% Anger that Microsoft Azure does identify comes from other facial cues Peterson is displaying, such as his furrowed eyebrows, which can indicate anger.

Now that you know the context behind the situation and the two different emotional readings, it is up to you to decide what interpretation you think is most accurate.


We at Project M believe that an emotional artificial intelligence platform's ability to identify and understand obvious negative emotions has great potential to improve human emotional well-being.

Your LIKE means a lot to us! Follow Project M on:

IPMD website | Medium: Project M | Facebook: Project M | Blogger: Project M | Instagram: mprojectai | Twitter: @mprojectai | LinkedIn: Project M

*As of March 1, 2021, the Project M team has devoted 104,000 hours to the AI platform, M. The sample input data we are using is pure testing data that is completely new to M and (we assume) to Microsoft, and has never been used to train M, so this comparison is a fair trial between M and Microsoft. We appreciate Microsoft, the leader of the industry and the emotional AI sector, for making their facial-expression emotion-recognition testing site available to the general public.


