
M vs Microsoft: Episode 6

Can M and Microsoft Azure still recognize a human face and read its emotions if the person turns their head a little?

Being able to recognize a human face automatically via computer vision is crucial for the accuracy of AI emotion recognition technologies in real-world applications. The tests in the previous M vs. Microsoft episodes used pictures of upright human faces, which were successfully detected by the computer vision algorithms attached to both AI emotion recognition platforms.
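To see why a turned head can trip up a face detector, here is a minimal toy sketch (not M's or Microsoft's actual algorithm; the landmark coordinates and the 15° tolerance are made-up assumptions for illustration). A detector built around the assumption of an upright face, for example one that expects the eyes to lie roughly on a horizontal line, can stop firing once the head rotates past its tolerance:

```python
import math

# Hypothetical landmark coordinates for an upright face (illustrative only):
# left eye, right eye, and mouth positions around the origin.
UPRIGHT_FACE = {"left_eye": (-1.0, 1.0), "right_eye": (1.0, 1.0), "mouth": (0.0, -1.0)}

def rotate(point, degrees):
    """Rotate a 2-D point about the origin by the given angle."""
    t = math.radians(degrees)
    x, y = point
    return (x * math.cos(t) - y * math.sin(t), x * math.sin(t) + y * math.cos(t))

def turned_face(degrees):
    """Simulate a head turn by rotating every landmark."""
    return {name: rotate(p, degrees) for name, p in UPRIGHT_FACE.items()}

def naive_upright_detector(landmarks, max_tilt_deg=15.0):
    """A toy detector that assumes an upright face: it accepts the face only
    if the line between the eyes is within max_tilt_deg of horizontal."""
    (lx, ly), (rx, ry) = landmarks["left_eye"], landmarks["right_eye"]
    tilt = math.degrees(math.atan2(ry - ly, rx - lx))
    return abs(tilt) <= max_tilt_deg

print(naive_upright_detector(turned_face(0)))   # upright face -> True
print(naive_upright_detector(turned_face(30)))  # head turned 30 degrees -> False
```

A more rotation-robust pipeline would estimate the head pose first and normalize the face before classifying the expression, which is the kind of capability this episode's test probes.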

This time, both platforms were shown a picture in which the subject's head is turned. Here's the result:

While Microsoft Azure fails to detect any human face in the picture, M not only identifies the face but also reads 99.7% happiness from it.



We at Project M believe that the ability of an emotional artificial intelligence platform to identify faces and read emotions even under imperfect conditions has great potential to optimize people's emotional well-being.

Follow us on our social media and look out for our next episode!

Your LIKE means a lot to us! Follow Project M on:

Medium: Project M

Facebook: Project M

Blogger: Project M

Instagram: mprojectai

Twitter: @mprojectai

LinkedIn: Project M

*As of March 2019, the Project M team has devoted 58,000 hours to the AI platform, M. The sample input is pure test data that is completely new to M and (we assume) to Microsoft, and has never been used to train M, so this comparison is a fair trial between M and Microsoft. We thank Microsoft, a leader of the industry and of the emotional AI sector, for making its public test site for identifying human emotions from facial expressions available to everyone.


