
Clearing the Neuro-Hype Air: Addressing Emotion AI, Facial Coding, and Biometrics

The following blog is in response to the ICO’s statement, which can be read here.



Neuro-hype has plagued the world of applied consumer neuroscience since its inception. Enthusiasts can easily get carried away by the suggestion of a quick and easy way to get the answers they want. But the truth is that if businesses rely on the flashiness of neuroscience to bring in customers, over-promising and underdelivering is inevitable.


Recently, the Information Commissioner’s Office (ICO) made a statement raising concerns about the improper use of biometric technology, facial coding, and Emotion AI. Neuro-hype is a precursor to the misuse of technology, so HCD wanted to revisit some of the important red flags to watch for when conducting this type of research.


Common Neuro-Hype Characteristics include:

  • Psychobabble

  • Relying on anecdotal evidence

  • Using unprovable, false claims

  • Presenting company claims (through mediums such as case studies) as though they were scientific facts

  • Having little to no peer review

HCD takes pride in producing quality research by leading with limitations, staying current on the latest theories, and being honest when something sounds too good to be true.


HCD Resources:



Addressing Emotion AI.

Within the Information Commissioner’s Office post, “Emotion AI” is one of the components called out for lacking scientific validity and reliability. HCD Research will continue to try to demystify and call out inadequacies in using algorithms for emotions.


We know that emotions are a challenging space to research, as evidenced by the sheer number of theories of emotion that exist. There is no consensus on what emotions actually are, so it is difficult (and sometimes even dangerous) to make claims based on one researcher’s interpretation of emotion.


To further complicate the issue, using artificial intelligence to explain emotion can be problematic because it lacks context, a crucial piece of understanding an emotional experience. AI is the use of machinery to complete a task, which means it will run whatever it is programmed to accomplish. But bad data in means bad data out: an algorithm programmed to detect emotions without context will quickly find patterns that misrepresent a person’s lived experience. Oversimplifying and disregarding cultural differences when designing research about emotion can lead to poor interpretations of data and grave consequences, deepening the concern about improper experimentation when studying emotions.


HCD Research strives to produce quality research that acknowledges and accounts for the nuances involved in studies exploring emotion.


HCD Resources:



Addressing Facial Coding.

Facial coding (FC) recently received backlash stemming from this article. HCD has been, and will continue to be, vocal about the limitations of facial coding and, more broadly, the concept of universal facial expressions.


From its high dropout rates to its easily skewed results, facial coding is often oversold as a method for measuring emotional states. People do not emote the same way, and FC perpetuates stereotypical expressions of emotion. The biases and data discrepancies that misuse of this tool perpetuates can be traced to emotion artificial intelligence (AI). Emotions are variable and messy, making any real-world prediction of someone’s emotions based solely on facial configurations unreliable. Categorizing facial expressions through algorithms without considering cultural differences, context, and what “emotions” actually are can lead to harmful and inaccurate claims about human emotion.


There are situations where facial coding can provide value, but the research design must account for the tool’s limitations so that other methodologies or technologies cover where it falls short.


At HCD, we strongly believe that many validated technologies exist that have advantages and disadvantages depending on the question being asked. By using the right tool for the right question, you can develop a strong research design and build out actionable insights.


HCD Resources:



Addressing Biometrics.

Finally, the last component of the ICO’s statement that is important for HCD to address focuses on the misuse of biometric technology in research exploring emotions. HCD wants to discuss some of the concerns regarding biometrics; however, it is important to start by defining terms.


Any output from the body, from fingerprints to signatures, can be considered a biometric measure. The vagueness of the term therefore creates a problem, because not all tools or outputs are used the same way. As researchers, we have to define our terms and conditions upfront to avoid confusion.


The measures we collect and interpret are better referred to as psychophysiological tools, because they analyze the body and its mental processes together. Physiological measurement of emotional response has a long, rigorously studied history validating the correlations between bodily and emotional responses. Psychophysiological measures, such as galvanic skin response (GSR) or facial electromyography (fEMG), do exactly what they are supposed to do. However, each tool has limitations, and we must be sure to address those gaps in the research design.


In short: psychophysiological tools should not be used in a vacuum because there is not one tool that can provide all the answers.


There is also no consensus in the literature on a universal theory of emotion; therefore, one psychophysiological tool alone cannot confidently address an entire emotional experience. Understanding the human experience involves using the best tools for a specific situation. Collaborating with experts to provide guidance on a wide variety of tools can help direct which tool, or combination of tools, will be best for each specific research question.


HCD Resources:


If you have any questions about the topics covered in this blog post, please contact Allison Gutkowski (Allison.Gutkowski@hcdi.net).