U.K. Watchdog Issues First of Its Kind Warning Against ‘Immature’ Emotional Analysis Tech

Contributor: Mack DeGeurin

The head of the United Kingdom’s independent privacy watchdog worries that highly hyped efforts to use AI to detect people’s emotional states simply may not work now, and possibly never will.

In a first-of-its-kind notice, the Information Commissioner’s Office, Britain’s top privacy watchdog, issued a searing warning to companies against using so-called “emotional analysis” tech, arguing it’s still “immature” and that the risks associated with it far outweigh any potential benefits.

“Developments in the biometrics and emotion AI market are immature. They may not work yet, or indeed ever,” ICO Deputy Commissioner Stephen Bonner wrote. “While there are opportunities present, the risks are currently greater. At the ICO, we are concerned that incorrect analysis of data could result in assumptions and judgements about a person that are inaccurate and lead to discrimination.”

Emotion analysis, also known as emotion recognition or affect recognition, follows similar principles to better-known biometric techniques like facial recognition but is arguably even less reliable. Emotional analysis or emotion recognition systems scan individuals’ facial expressions, voice tones, or other physical features, then attempt to use those data points to infer mental states or predict how someone feels.

USC Annenberg Research Professor Kate Crawford details some of the inherent pitfalls of that approach in her 2021 book Atlas of AI.

“The difficulty in automating the connection between facial movements and basic emotional categories leads to the larger question of whether emotions can be adequately grouped into a small number of discrete categories at all,” Crawford writes. “There is the stubborn issue that facial expressions may indicate little about our honest interior states, as anyone who has smiled without feeling truly happy can confirm.”

Bonner went on to say that “the only sustainable biometric deployments” are ones that are fully functional, accountable, and “backed by science.” Though the ICO has issued warnings about specific technologies in the past, including some falling under the category of biometrics, Bonner told The Guardian this week’s notice marks the first general warning about the ineffectiveness of an entire technology. In that article, Bonner described attempts to use biometrics to detect emotion as “pseudoscientific.”

“Unfortunately, these technologies don’t seem to be backed by science,” Bonner told The Guardian.

And while the ICO post spends some time calling out potential threats inherent in biometric tech, such as facial recognition used for ID verification or airport check-ins, the watchdog maintains that emotional analysis is uniquely worrisome.

“The inability of algorithms which are not sufficiently developed to detect emotional cues, means there’s a risk of systemic bias, inaccuracy and even discrimination,” the ICO post reads.

