
CASE STUDIES
Read Hidden Emotions Like Never Before.
Project M uses advanced CNN-based AI to detect micro-expressions and analyze 8 primary emotions across multiple intensity levels, covering over 400,000 unique emotional variables. Trained on more than 500,000 accurately labeled data points, it achieves overall ROC/AUC scores above 95% and outperforms existing emotion AI platforms, including Microsoft Azure.
Project M: AI That Reads Hidden Emotions with 95% Accuracy
IPMD’s advanced Emotional AI platform, Project M, detects human emotions from hidden and micro facial expressions. The software uses machine-learning solutions built on custom CNN-based algorithms. It categorizes emotions into eight primary classes (anger, contempt, disgust, fear, happy, neutral, sad, and surprise) and sub-categorizes each into at least four intensity levels, yielding approximately 400,000 distinct emotional variables. To train and evaluate the platform, IPMD created over 500,000 accurately labeled training and test data points and achieved overall ROC/AUC scores above 95%. The case studies below demonstrate how Project M outperforms current emotional AI platforms in accurately detecting and analyzing complex human emotions. The sample inputs used in this comparison are pure test data: completely new to M and, we assume, to Microsoft, and never used to train M, so the comparison is a fair trial between the two systems. We thank Microsoft, the industry leader in the emotional AI sector, for making its public test site for identifying human emotions from facial expressions available to the general public.
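To make the headline metric concrete, the sketch below shows the standard way an overall ROC/AUC score is computed for an eight-class emotion classifier: a one-vs-rest AUC per emotion, macro-averaged across classes. This is an illustrative example only, not IPMD's actual code or data; the synthetic predictions here merely stand in for a model's per-class probabilities.

```python
# Illustrative sketch (not IPMD's code): macro-averaged one-vs-rest
# ROC/AUC over the eight primary emotion classes named in the text.
import numpy as np
from sklearn.metrics import roc_auc_score

EMOTIONS = ["anger", "contempt", "disgust", "fear",
            "happy", "neutral", "sad", "surprise"]

# Synthetic stand-ins for ground-truth labels and model probabilities.
rng = np.random.default_rng(0)
n = 1000
y_true = rng.integers(0, len(EMOTIONS), size=n)       # true class indices
logits = rng.normal(size=(n, len(EMOTIONS)))
logits[np.arange(n), y_true] += 2.0                   # bias toward the true class
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)

# One-vs-rest AUC per emotion, macro-averaged into a single overall score.
auc = roc_auc_score(y_true, probs, multi_class="ovr", average="macro")
print(f"overall ROC/AUC: {auc:.3f}")
```

A "95% ROC/AUC" claim in this framing means the averaged one-vs-rest score is at least 0.95; it measures ranking quality per emotion rather than plain classification accuracy.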
