AI “emotion detection”

AUTHENTICITY // HUMANLIKE AI

In the category of false and overblown AI claims, I found a nice one: “A Chinese AI emotion-recognition system can monitor facial features to track how people feel. Don’t even think about faking a smile. The system is able to analyze if the emotion displayed is genuine.”

  • These headlines keep popping up. However, scientific studies have shown that emotion-recognition systems are built on pseudoscience: you cannot infer how someone feels from a set of facial movements. Facial expressions are not reliably linked to inner emotional states and do not consistently correspond to specific emotions (a minimal sketch after this list illustrates the assumption these systems bake in).
  • Although Business Insider mentions this and interviews a social scientist at the end of the article, its headline and intro suggest otherwise. AI companies can get away with these false claims not only because of uncritical clickbait headlines, but also because the buyers of these systems are often not well versed in social science.
  • The AI hype outstrips the reality, yet the self-fulfilling-prophecy effects of these emotion-detection systems are already out there: for example, in online job-application systems that lead to self-censorship and to policing your own facial expressions to game the system.
  • If AI emotion-detection systems ever appear to work, it won’t be because AI has become “better” at recognizing emotions; it will be because we have started to behave like simple machines, performing exaggerated facial expressions that can easily be read by machines.
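
To make the baked-in assumption concrete, here is a minimal, purely illustrative sketch of the logic such systems rest on: measured facial geometry is compared against fixed emotion templates, and the nearest template becomes the verdict. The feature values, prototype vectors and labels below are invented for illustration; they are not taken from any real product.

```python
# Illustrative sketch only: the structural assumption behind a typical
# "emotion recognition" pipeline. All numbers and labels are made up.
import numpy as np

# Hard-coded "prototype" expressions: the system assumes each basic emotion
# has one canonical facial configuration (e.g. smile width, brow raise,
# eye openness), which is exactly the assumption the critique targets.
PROTOTYPES = {
    "happy":   np.array([0.9, 0.2, 0.6]),
    "sad":     np.array([0.1, 0.1, 0.3]),
    "angry":   np.array([0.2, 0.8, 0.5]),
    "neutral": np.array([0.3, 0.3, 0.5]),
}

def classify_expression(features: np.ndarray) -> str:
    """Return the label of the nearest emotion prototype.

    Nothing here measures an inner emotional state, genuine or faked;
    it only measures how closely facial geometry matches a fixed template.
    """
    distances = {label: np.linalg.norm(features - proto)
                 for label, proto in PROTOTYPES.items()}
    return min(distances, key=distances.get)

# An exaggerated, machine-readable smile scores as "happy",
# regardless of how the person actually feels.
print(classify_expression(np.array([0.95, 0.15, 0.6])))  # -> "happy"
```

The point of the sketch: the verdict about “genuine” emotion is entirely a property of the templates the builders chose, not of anything the system knows about the person’s inner state.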